ConfusionMatrix

Confusion Matrix for binary and multi-class classification.

Parameters

  • classes

    Default: None

    The initial set of classes. This is optional and is only used for display purposes.

Attributes

  • bigger_is_better

    Indicates whether a higher value is better than a lower one.

  • classes

  • requires_labels

    Indicates if labels are required, rather than probabilities.

  • total_false_negatives

  • total_false_positives

  • total_true_negatives

  • total_true_positives

  • works_with_weights

    Indicates whether the metric takes sample weights into account.

Examples

from river import metrics

y_true = ['cat', 'ant', 'cat', 'cat', 'ant', 'bird']
y_pred = ['ant', 'ant', 'cat', 'cat', 'ant', 'cat']

cm = metrics.ConfusionMatrix()

for yt, yp in zip(y_true, y_pred):
    cm.update(yt, yp)

cm
       ant  bird   cat
 ant     2     0     0
bird     0     0     1
 cat     1     0     2

cm['bird']['cat']
1.0
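The row/column bookkeeping behind this example can be mimicked with a plain dict of counters. This is an illustrative sketch, not river's actual implementation:

```python
from collections import Counter, defaultdict

y_true = ['cat', 'ant', 'cat', 'cat', 'ant', 'bird']
y_pred = ['ant', 'ant', 'cat', 'cat', 'ant', 'cat']

# Rows are keyed by the true label, columns by the predicted label.
cm = defaultdict(Counter)
for yt, yp in zip(y_true, y_pred):
    cm[yt][yp] += 1

print(cm['bird']['cat'])  # 1: the single bird observation, predicted as cat
```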

Methods

false_negatives
false_positives
get

Return the current value of the metric.

is_better_than

Indicates whether the current metric value is better than that of another metric.

Parameters

  • other

revert

Revert the metric.

Parameters

  • y_true
  • y_pred
  • w — defaults to 1.0
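A hedged sketch of what reverting means for a confusion matrix (assumed semantics, not river's code): update adds the weight w to the (y_true, y_pred) cell, and revert subtracts it, so a reverted update leaves the matrix as it was.

```python
from collections import Counter, defaultdict

cm = defaultdict(Counter)

def update(y_true, y_pred, w=1.0):
    cm[y_true][y_pred] += w

def revert(y_true, y_pred, w=1.0):
    # Undo a previous update by subtracting the same weight.
    cm[y_true][y_pred] -= w

update('cat', 'ant')
revert('cat', 'ant')
print(cm['cat']['ant'])  # 0.0 (back to its pre-update state)
```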

support
true_negatives
true_positives
update

Update the metric.

Parameters

  • y_true
  • y_pred
  • w — defaults to 1.0
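The w parameter weights each observation. A sketch of the assumed semantics, mirroring the documented default of 1.0 (again not river's own code):

```python
from collections import Counter, defaultdict

cm = defaultdict(Counter)

def update(y_true, y_pred, w=1.0):
    # A weighted observation contributes w instead of 1 to its cell.
    cm[y_true][y_pred] += w

update('cat', 'cat')           # default weight 1.0
update('cat', 'cat', w=2.5)    # this observation counts 2.5 times
print(cm['cat']['cat'])  # 3.5
```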

works_with

Indicates whether the metric can work with a given model.

Parameters

  • model

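The per-class counts that true_positives, false_positives, false_negatives and true_negatives expose can be read straight off the matrix. The helper functions below illustrate the arithmetic on the matrix from the example above; their names and signatures are mine, not river's API:

```python
# The matrix from the Examples section, as nested dicts (row = true label).
matrix = {
    'ant':  {'ant': 2, 'bird': 0, 'cat': 0},
    'bird': {'ant': 0, 'bird': 0, 'cat': 1},
    'cat':  {'ant': 1, 'bird': 0, 'cat': 2},
}

def true_positives(m, label):
    return m[label][label]

def false_positives(m, label):
    # Predicted as `label` but actually something else: the column, minus the diagonal.
    return sum(row[label] for yt, row in m.items() if yt != label)

def false_negatives(m, label):
    # Actually `label` but predicted as something else: the row, minus the diagonal.
    return sum(n for yp, n in m[label].items() if yp != label)

def true_negatives(m, label):
    total = sum(sum(row.values()) for row in m.values())
    return total - true_positives(m, label) - false_positives(m, label) - false_negatives(m, label)

print(true_positives(matrix, 'cat'))   # 2
print(false_positives(matrix, 'cat'))  # 1 (the bird predicted as cat)
print(false_negatives(matrix, 'cat'))  # 1 (the cat predicted as ant)
```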
Notes

This confusion matrix is a 2D matrix of shape (n_classes, n_classes), corresponding to a single-target (binary and multi-class) classification task.

Each row represents the true (actual) class labels, while each column corresponds to the predicted class labels. For example, an entry at position [1, 2] means that the true class label is 1 and the predicted class label is 2 (an incorrect prediction).

This structure is used to maintain running statistics about a single-output classifier's performance and to compute multiple evaluation metrics from them.
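As the note says, the matrix alone is enough to derive other metrics. For instance, accuracy is the diagonal sum over the total count; an illustrative computation on the example matrix (not river's code):

```python
# The matrix from the Examples section, as nested dicts (row = true label).
matrix = {
    'ant':  {'ant': 2, 'bird': 0, 'cat': 0},
    'bird': {'ant': 0, 'bird': 0, 'cat': 1},
    'cat':  {'ant': 1, 'bird': 0, 'cat': 2},
}

correct = sum(matrix[label][label] for label in matrix)    # diagonal sum: 4
total = sum(sum(row.values()) for row in matrix.values())  # all entries: 6
accuracy = correct / total                                 # 4/6
print(accuracy)
```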