AdaBound

AdaBound optimizer. AdaBound is a variant of Adam that clips the effective per-weight step size between a lower and an upper bound, both of which converge towards final_lr, so that the optimizer behaves like Adam early in training and gradually transitions towards SGD.

Parameters

  • lr – defaults to 0.001

    The learning rate.

  • beta_1 – defaults to 0.9

    Exponential decay rate for the first moment (momentum) estimate.

  • beta_2 – defaults to 0.999

    Exponential decay rate for the second moment estimate.

  • eps – defaults to 1e-08

    Small constant added to the denominator for numerical stability.

  • gamma – defaults to 0.001

    Controls how quickly the lower and upper bounds converge towards final_lr.

  • final_lr – defaults to 0.1

    The learning rate towards which both bounds converge.
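
To make the roles of gamma and final_lr concrete: following the AdaBound paper, at step t the per-weight step size is clipped to the interval [final_lr * (1 - 1/(gamma*t + 1)), final_lr * (1 + 1/(gamma*t))], which tightens around final_lr as t grows. A minimal sketch of that schedule (the function name is illustrative, not part of the river API):

```python
def adabound_bounds(t, gamma=0.001, final_lr=0.1):
    """Lower and upper clipping bounds on the step size at iteration t."""
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    return lower, upper

# Early on the interval is wide (Adam-like behaviour); late in training
# both bounds are close to final_lr (SGD-like behaviour).
for t in (1, 1_000, 1_000_000):
    print(t, adabound_bounds(t))
```

A larger gamma shrinks the interval faster, i.e. the optimizer commits to an SGD-like step size earlier.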

Attributes

  • m (collections.defaultdict)

    Running estimates of the first moment of the gradients.

  • s (collections.defaultdict)

    Running estimates of the second moment (squared gradients).

Examples

>>> from river import datasets
>>> from river import evaluate
>>> from river import linear_model
>>> from river import metrics
>>> from river import optim
>>> from river import preprocessing

>>> dataset = datasets.Phishing()
>>> optimizer = optim.AdaBound()
>>> model = (
...     preprocessing.StandardScaler() |
...     linear_model.LogisticRegression(optimizer)
... )
>>> metric = metrics.F1()

>>> evaluate.progressive_val_score(dataset, model, metric)
F1: 87.90%

Methods

look_ahead

Updates a weight vector before a prediction is made.

Parameters

  • w ('dict')

    A dictionary of weight parameters. The weights are modified in-place.

Returns

The updated weights.

step

Updates a weight vector given a gradient.

Parameters

  • w ('dict | VectorLike')

    The current weights.

  • g ('dict | VectorLike')

    The gradient.

Returns

dict | VectorLike: The updated weights.
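
As a rough illustration of what a single call to step computes, here is a pure-Python sketch of the AdaBound update rule from the paper. This is not river's actual implementation; the function name and the exact bias-correction details are assumptions:

```python
import math

def adabound_step(w, g, m, s, t, lr=0.001, beta_1=0.9, beta_2=0.999,
                  eps=1e-8, gamma=0.001, final_lr=0.1):
    """One illustrative AdaBound update over a dict of weights."""
    for i, gi in g.items():
        # Exponentially decayed first and second moment estimates
        m[i] = beta_1 * m.get(i, 0.0) + (1 - beta_1) * gi
        s[i] = beta_2 * s.get(i, 0.0) + (1 - beta_2) * gi ** 2
        # Bias-corrected base step size
        base = lr * math.sqrt(1 - beta_2 ** t) / (1 - beta_1 ** t)
        # Clip the per-weight step size to the dynamic bounds
        lower = final_lr * (1 - 1 / (gamma * t + 1))
        upper = final_lr * (1 + 1 / (gamma * t))
        eta = min(max(base / (math.sqrt(s[i]) + eps), lower), upper)
        w[i] = w.get(i, 0.0) - eta * m[i]
    return w

w = adabound_step({'x': 1.0}, {'x': 0.5}, m={}, s={}, t=1)
```

With a positive gradient the weight moves down, as expected for gradient descent.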

References