LogLoss¶
Binary logarithmic loss.
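For a single observation with binary label y ∈ {0, 1} and predicted probability p for the positive class, the loss is the standard binary cross-entropy term; the values printed in the example below are the running means of these per-observation losses.

$$
\ell(y, p) = -\bigl(y \log(p) + (1 - y) \log(1 - p)\bigr)
$$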
Attributes¶
- bigger_is_better
  Indicates whether a higher value is better than a lower one.
- requires_labels
  Indicates whether labels are required, rather than probabilities.
- sample_correction
- works_with_weights
  Indicates whether the metric takes sample weights into account.
Examples¶
>>> from river import metrics
>>> y_true = [True, False, False, True]
>>> y_pred = [0.9, 0.1, 0.2, 0.65]
>>> metric = metrics.LogLoss()
>>> for yt, yp in zip(y_true, y_pred):
...     metric = metric.update(yt, yp)
...     print(metric.get())
0.105360
0.105360
0.144621
0.216161
>>> metric
LogLoss: 0.216162
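As a sanity check (not part of the original example), the reported value can be reproduced with plain Python by averaging the per-observation losses, reusing the y_true, y_pred and metric objects from above:

>>> import math
>>> # probability assigned to the true class, then its negative log
>>> losses = [-math.log(yp if yt else 1 - yp) for yt, yp in zip(y_true, y_pred)]
>>> abs(sum(losses) / len(losses) - metric.get()) < 1e-12
True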
Methods¶
clone
Return a fresh estimator with the same parameters.
The clone has the same parameters but has not been updated with any data. This works by looking at the parameters from the class signature. Each parameter is either:
- recursively cloned if it is a River class;
- deep-copied via copy.deepcopy if not.
If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
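A small illustrative check of this behaviour, continuing from the example above; it assumes a freshly cloned metric reports 0.0 before any updates:

>>> fresh = metric.clone()
>>> fresh.get()  # same parameters, but no accumulated data
0.0
>>> metric.get() > 0  # the original keeps its state
True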
get
Return the current value of the metric.
revert
Revert the metric.
Parameters
- y_true (bool)
- y_pred (Union[bool, float, Dict[bool, float]])
- sample_weight – defaults to 1.0
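A hedged sketch of reverting an update, continuing from the example above; it assumes revert takes the same arguments as update and removes that observation's contribution from the running value:

>>> before = metric.get()
>>> metric = metric.update(True, 0.5)   # add one more observation
>>> metric = metric.revert(True, 0.5)   # undo it
>>> abs(metric.get() - before) < 1e-12
True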
update
Update the metric.
Parameters
- y_true (bool)
- y_pred (Union[bool, float, Dict[bool, float]])
- sample_weight – defaults to 1.0
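The sample_weight argument scales an observation's contribution. The sketch below assumes that a weight of 2.0 behaves like seeing the same observation twice, which is the usual convention for weighted running means rather than something stated on this page:

>>> weighted = metrics.LogLoss()
>>> weighted = weighted.update(True, 0.9, sample_weight=2.0)
>>> weighted = weighted.update(True, 0.5)
>>> duplicated = metrics.LogLoss()
>>> for yt, yp in [(True, 0.9), (True, 0.9), (True, 0.5)]:
...     duplicated = duplicated.update(yt, yp)
>>> abs(weighted.get() - duplicated.get()) < 1e-12
True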
works_with
Indicates whether or not a metric can work with a given model.
Parameters
- model
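For instance, a binary log loss should be compatible with a binary classifier. The check below is a sketch: it assumes works_with returns a boolean and that linear_model.LogisticRegression qualifies as such a model:

>>> from river import linear_model
>>> metrics.LogLoss().works_with(linear_model.LogisticRegression())
True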