TargetLagger

Uses lagged values of the target as features.

Parameters

  • lags (Union[int, Tuple[int]])

    Indicates which lags of the target to compute. This may be specified as a single number or as a tuple of numbers.

  • by (Union[str, List[str], NoneType]) – defaults to None

    An optional feature, or list of features, by which to group the lagged values.

  • drop_nones – defaults to True

    Whether or not to drop lag features whose value is None because not enough target values have been seen yet. If set to False, these features are kept with a None value.

  • target_name – defaults to y

    The name used for the target in the resulting feature names.
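
Putting these together, the following sketch, based solely on the parameter descriptions above, configures a lagger that keeps None placeholders and names its features after a 'sales' target. The exact feature names are an assumption, extrapolated from the naming convention visible in the examples below.

>>> from river.feature_extraction import TargetLagger

>>> lagger = TargetLagger(
...     lags=(1, 2),           # compute the two most recent lags of the target
...     by='country',          # keep a separate history per country
...     drop_nones=False,      # keep lag features as None while history is short
...     target_name='sales',   # features presumably named e.g. 'sales-1_by_country'
... )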

Examples

Consider the following dataset, where the second element of each pair is the target:

>>> dataset = [
...     ({'country': 'France'}, 42),
...     ({'country': 'Sweden'}, 16),
...     ({'country': 'France'}, 24),
...     ({'country': 'Sweden'}, 58),
...     ({'country': 'Sweden'}, 20),
...     ({'country': 'France'}, 50),
...     ({'country': 'France'}, 10),
...     ({'country': 'Sweden'}, 80)
... ]

Let's extract the last two values of the target at each time step. Note that each lag is offset by one step: the target of the current observation is not yet known when transform_one is called, so lag 1 refers to the penultimate target that has been learned.

>>> from river.feature_extraction import TargetLagger

>>> lagger = TargetLagger(lags=[1, 2])
>>> for x, y in dataset:
...     print(lagger.transform_one(x))
...     lagger = lagger.learn_one(x, y)
{}
{}
{'y-1': 42}
{'y-1': 16, 'y-2': 42}
{'y-1': 24, 'y-2': 16}
{'y-1': 58, 'y-2': 24}
{'y-1': 20, 'y-2': 58}
{'y-1': 50, 'y-2': 20}
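
With drop_nones=False, the lag features should instead be present from the first step, holding a None value until enough targets have been seen. This is a sketch inferred from the drop_nones description above; the placeholder output is an assumption, not verified:

>>> lagger = TargetLagger(lags=[1, 2], drop_nones=False)
>>> out = lagger.transform_one({'country': 'France'})  # presumably {'y-1': None, 'y-2': None}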

We can also compute the lags separately within each group:

>>> lagger = TargetLagger(lags=[1, 2], by=['country'])
>>> for x, y in dataset:
...     print(lagger.transform_one(x))
...     lagger = lagger.learn_one(x, y)
{}
{}
{}
{}
{'y-1_by_country': 16}
{'y-1_by_country': 42}
{'y-1_by_country': 24, 'y-2_by_country': 42}
{'y-1_by_country': 58, 'y-2_by_country': 16}
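
In a real workflow, these lagged features would typically be piped into a downstream model. The following is a hedged sketch using River's pipeline operator and linear_model.LinearRegression; it is not part of this page's examples, and it assumes the pipeline forwards the target to the lagger during learning:

>>> from river import linear_model

>>> model = (
...     TargetLagger(lags=[1, 2], by=['country'])
...     | linear_model.LinearRegression()
... )

>>> for x, y in dataset:
...     y_pred = model.predict_one(x)  # prediction from the lag features alone
...     model = model.learn_one(x, y)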

Methods

clone

Return a fresh estimator with the same parameters.

The clone has the same parameters, but has not been updated with any data. This works by looking at the parameters from the class signature. Each parameter is either:

  • recursively cloned if it is a River class;
  • deep-copied via copy.deepcopy if not.

If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
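
For instance, a short sketch continuing from the examples above: the clone keeps the lags parameter but starts with an empty history, so its first transform yields no features.

>>> lagger = TargetLagger(lags=[1, 2])
>>> for x, y in dataset:
...     lagger = lagger.learn_one(x, y)

>>> fresh = lagger.clone()  # same parameters, no accumulated targets
>>> fresh.transform_one({'country': 'France'})
{}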

learn_one

Update with a set of features x and a target y.

Parameters

  • x (dict)
  • y (Union[bool, str, int, numbers.Number])

Returns

SupervisedTransformer: self

transform_one

Transform a set of features x.

Parameters

  • x (dict)

Returns

dict: The transformed values.