iter_evaluate

Evaluates the performance of a forecaster on a time series dataset and yields results.

This does exactly the same thing as evaluate.progressive_val_score. The only difference is that this function returns an iterator, yielding results at every step. This is useful if you want control over what happens with the results, for instance to plot them as they arrive.

Parameters

  • dataset

    Type: base.typing.Dataset

    A sequential time series.

  • model

    Type: time_series.base.Forecaster

    A forecaster.

  • metric

    Type: metrics.base.RegressionMetric

    A regression metric.

  • horizon

    Type: int

    The number of steps ahead to forecast at each time step.

  • agg_func

    Type: typing.Callable[[list[float]], float] | None

    Default: None

  • grace_period

    Type: int | None

    Default: None

    Initial period during which the metric is not updated. This allows a fair evaluation of models that need a warm-up period before they start producing meaningful forecasts. By default, this is equal to the horizon.
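To make the yielding behaviour concrete, here is a minimal, self-contained sketch of the idea; it is not the library's actual implementation, and all class and function names are made up for illustration. A naive last-value forecaster is evaluated progressively: each forecast is scored once its ground truth arrives, forecasts made during the grace period are skipped, and (to keep the sketch short) only the final step of each horizon-length forecast is compared against the observed value.

```python
from collections import deque

class MAE:
    """Toy running mean absolute error, standing in for a regression metric."""
    def __init__(self):
        self.total = 0.0
        self.n = 0
    def update(self, y_true, y_pred):
        self.total += abs(y_true - y_pred)
        self.n += 1
    def get(self):
        return self.total / self.n if self.n else 0.0

class LastValue:
    """Naive forecaster: repeats the last observed value."""
    def __init__(self):
        self.last = 0.0
    def learn_one(self, y):
        self.last = y
    def forecast(self, horizon):
        return [self.last] * horizon

def iter_progressive_val_score(dataset, model, metric, horizon, grace_period=None):
    """Learn from each observation, forecast ahead, and yield a result as soon
    as the ground truth for a pending forecast arrives. Forecasts made during
    the grace period (defaulting to the horizon) are not scored."""
    grace = horizon if grace_period is None else grace_period
    pending = deque()  # (time the forecast was made, prediction for t + horizon)
    for t, y in enumerate(dataset):
        # Settle the forecast whose final target step is the current one.
        while pending and pending[0][0] + horizon == t:
            made_at, y_pred = pending.popleft()
            if made_at >= grace:  # skip forecasts made during warm-up
                metric.update(y, y_pred)
                yield t, y, y_pred, metric.get()
        model.learn_one(y)
        # Keep only the prediction for the last step of the horizon.
        pending.append((t, model.forecast(horizon)[-1]))

for step in iter_progressive_val_score([1, 2, 3, 4, 5], LastValue(), MAE(), horizon=2):
    print(step)  # (t, y_true, y_pred, running MAE) → (4, 5, 3, 2.0)
```

Because each result is yielded as soon as it is available, the caller can consume it immediately, for example by appending it to a list for plotting; that is the extra control the iterator form offers over the all-at-once scoring function.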