AdaBoostClassifier

Boosting for classification.

For each incoming observation, each model's learn_one method is called k times, where k is sampled from a Poisson distribution with parameter lambda. The lambda parameter is updated as the weak learners successively fit the same observation, so that later learners focus on the observations the earlier ones misclassify.
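To make the weighting scheme concrete, here is a minimal sketch in the spirit of Oza and Russell's online AdaBoost, which the description above follows. The class name, the correct_weight and wrong_weight tallies, and the _poisson helper are illustrative assumptions, not River's internal implementation.

import math
import random


class OnlineAdaBoostSketch:
    # Illustrative sketch of Poisson-based online boosting (not River's code).

    def __init__(self, models, seed=None):
        self.models = models
        # Running tallies of correctly and incorrectly weighted observations
        # for each weak learner, used to derive its lambda update.
        self.correct_weight = [0.0] * len(models)
        self.wrong_weight = [0.0] * len(models)
        self._rng = random.Random(seed)

    def learn_one(self, x, y):
        lambda_ = 1.0  # Weight of the incoming observation.
        for i, model in enumerate(self.models):
            # Each weak learner sees the observation k times, with k ~ Poisson(lambda).
            k = self._poisson(lambda_)
            for _ in range(k):
                model.learn_one(x, y)
            # Rescale lambda using the learner's running error estimate: it
            # shrinks after a correct prediction and grows after a mistake
            # (for a weak learner with error below 1/2), so later learners
            # concentrate on the harder observations.
            if model.predict_one(x) == y:
                self.correct_weight[i] += lambda_
                lambda_ *= (self.correct_weight[i] + self.wrong_weight[i]) / (2 * self.correct_weight[i])
            else:
                self.wrong_weight[i] += lambda_
                lambda_ *= (self.correct_weight[i] + self.wrong_weight[i]) / (2 * self.wrong_weight[i])

    def _poisson(self, rate):
        # Knuth's method for sampling from a Poisson distribution.
        threshold, k, p = math.exp(-rate), 0, 1.0
        while p > threshold:
            k += 1
            p *= self._rng.random()
        return k - 1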

Parameters

  • model

    Type: base.Classifier

    The classifier to boost.

  • n_models

    Default: 10

    The number of models in the ensemble.

  • seed

    Type: int | None

    Default: None

    Random number generator seed for reproducibility.

Attributes

  • models

Examples

In the following example, five tree classifiers are boosted together. The performance is slightly better than that of a single tree (a baseline sketch follows the example).

from river import datasets
from river import ensemble
from river import evaluate
from river import metrics
from river import tree

dataset = datasets.Phishing()

metric = metrics.LogLoss()

model = ensemble.AdaBoostClassifier(
    model=(
        tree.HoeffdingTreeClassifier(
            split_criterion='gini',
            delta=1e-5,
            grace_period=2000
        )
    ),
    n_models=5,
    seed=42
)

evaluate.progressive_val_score(dataset, model, metric)
LogLoss: 0.370805

print(model)
AdaBoostClassifier(HoeffdingTreeClassifier)
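To back up the comparison with a single tree, the same weak learner can be evaluated on its own with the same progressive validation setup. This is a minimal baseline sketch; the score it produces is not reproduced here.

from river import datasets
from river import evaluate
from river import metrics
from river import tree

dataset = datasets.Phishing()
metric = metrics.LogLoss()

# The same Hoeffding tree configuration, without boosting.
single_tree = tree.HoeffdingTreeClassifier(
    split_criterion='gini',
    delta=1e-5,
    grace_period=2000
)

evaluate.progressive_val_score(dataset, single_tree, metric)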

Methods

learn_one

Update the model with a set of features x and a label y.

Parameters

  • x
  • y
  • kwargs

Returns

self

predict_one

Predict the label of a set of features x.

Parameters

  • x (dict)
  • kwargs

Returns

base.typing.ClfTarget | None: The predicted label.

predict_proba_one

Predict the probability of each label for a dictionary of features x.

Parameters

  • x
  • kwargs

Returns

A dictionary that associates a probability with each label.
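
As a quick illustration of these three methods, the fitted model from the example above can be applied to a single observation. This is a minimal sketch; the exact probabilities depend on what the model has already seen.

from river import datasets

# Take one observation from the stream used in the example above.
x, y = next(iter(datasets.Phishing()))

model.predict_proba_one(x)  # e.g. {False: ..., True: ...}
model.predict_one(x)        # the most likely label
model.learn_one(x, y)       # update the ensemble with the true label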