Adam¶
Adam optimizer.
Parameters¶
- lr – learning rate; defaults to
0.1
- beta_1 – decay rate for the first-moment (mean) estimate; defaults to
0.9
- beta_2 – decay rate for the second-moment (uncentered variance) estimate; defaults to
0.999
- eps – small constant added for numerical stability; defaults to
1e-08
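Under the hood, these hyperparameters drive the standard Adam update rule. Here is a minimal plain-Python sketch of one update over a dict of weights (an illustrative reimplementation, not River's internal code; names follow the parameter list above):

```python
import math

def adam_step(w, g, m, v, t, lr=0.1, beta_1=0.9, beta_2=0.999, eps=1e-08):
    """Apply one Adam update to the weights in w, given gradients g."""
    for i, grad in g.items():
        # Exponentially decayed first and second moment estimates.
        m[i] = beta_1 * m.get(i, 0.0) + (1 - beta_1) * grad
        v[i] = beta_2 * v.get(i, 0.0) + (1 - beta_2) * grad ** 2
        # Bias correction: compensates for the zero initialisation of m and v.
        m_hat = m[i] / (1 - beta_1 ** t)
        v_hat = v[i] / (1 - beta_2 ** t)
        # Scale the step by the inverse root of the second moment.
        w[i] = w.get(i, 0.0) - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

w = {"x": 1.0}
g = {"x": 0.5}
w = adam_step(w, g, m={}, v={}, t=1)  # w["x"] is now close to 0.9
```

The bias-corrected estimates matter most during the first few steps, when the raw moment estimates are still dominated by their zero initial values.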
Attributes¶
- m (collections.defaultdict) – first-moment (mean) estimates, keyed by feature
- v (collections.defaultdict) – second-moment (uncentered variance) estimates, keyed by feature
Examples¶
>>> from river import datasets
>>> from river import evaluate
>>> from river import linear_model
>>> from river import metrics
>>> from river import optim
>>> from river import preprocessing
>>> dataset = datasets.Phishing()
>>> optimizer = optim.Adam()
>>> model = (
... preprocessing.StandardScaler() |
... linear_model.LogisticRegression(optimizer)
... )
>>> metric = metrics.F1()
>>> evaluate.progressive_val_score(dataset, model, metric)
F1: 86.50%
Methods¶
look_ahead
Updates a weight vector before a prediction is made.
Parameters
- w ('dict') – A dictionary of weight parameters. The weights are modified in-place.
Returns
The updated weights.
step
Updates a weight vector given a gradient.
Parameters
- w ('dict | VectorLike') – A dictionary of weight parameters. The weights are modified in-place.
- g ('dict | VectorLike') – A dictionary of gradients.
Returns
dict | VectorLike: The updated weights.