
plot_cross_validation_metric

Plot a performance metric vs. forecast horizon from cross validation. Cross validation produces a collection of out-of-sample model predictions that can be compared to actual values at a range of different horizons (distance from the cutoff). This computes a specified performance metric for each prediction and aggregates it over a rolling window that moves with the horizon.


Description

This uses fbprophet.diagnostics.performance_metrics to compute the metrics. Valid values of metric are 'mse', 'rmse', 'mae', 'mape', and 'coverage'.

Usage

plot_cross_validation_metric(df_cv, metric, rolling_window = 0.1)

Arguments

df_cv

The output from fbprophet.diagnostics.cross_validation.

metric

Metric name, one of 'mse', 'rmse', 'mae', 'mape', 'coverage'.

rolling_window

Proportion of data to use for rolling average of metric. In [0, 1]. Defaults to 0.1.
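A usage sketch tying these arguments together. It assumes a data frame named history with ds and y columns and the illustrative cutoff settings shown below, neither of which appears on this page; prophet() and cross_validation() are the fitting and cross-validation functions from the same package.

library(prophet)

# Fit a model on a data frame with columns ds (date) and y (value).
m <- prophet(history)

# Simulated historical forecasts: cutoffs every 180 days after an initial
# 730 days of training data, each forecasting 365 days ahead.
df.cv <- cross_validation(m, initial = 730, period = 180,
                          horizon = 365, units = 'days')

# Plot MAPE against horizon, averaged over rolling windows that each cover
# 10% of the predictions (the default rolling_window = 0.1).
plot_cross_validation_metric(df.cv, metric = 'mape')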

Details

rolling_window is the proportion of data included in the rolling window of aggregation. The default value of 0.1 means 10% of the data are included in each rolling window used to compute the metric.

As a concrete example, if metric='mse', then this plot will show the squared error for each cross validation prediction, along with the MSE averaged over rolling windows of 10% of the predictions.
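For instance, the amount of smoothing can be varied on the same cross-validation output. The df.cv object below is assumed to come from cross_validation() as in the sketch above, and the specific rolling_window values are only illustrative.

# Very little aggregation: plotted values track the raw squared errors closely.
plot_cross_validation_metric(df.cv, metric = 'mse', rolling_window = 0.01)

# Default: each plotted value averages roughly 10% of the predictions.
plot_cross_validation_metric(df.cv, metric = 'mse', rolling_window = 0.1)

# The underlying horizon-vs-metric table can be inspected directly with
# performance_metrics() from the same package.
head(performance_metrics(df.cv, rolling_window = 0.1))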

Value

A ggplot2 plot.
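Because the return value is an ordinary ggplot2 object, it can be customized with standard ggplot2 layers before printing; the title below is purely illustrative.

library(ggplot2)

p <- plot_cross_validation_metric(df.cv, metric = 'rmse')
p + ggtitle('RMSE by forecast horizon') + theme_minimal()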


prophet: Automatic Forecasting Procedure

Version: 1.0
License: MIT + file LICENSE
Authors: Sean Taylor [cre, aut], Ben Letham [aut]
Initial release: 2021-03-08
