ADWINBoostingClassifier¶
ADWIN Boosting classifier.
ADWIN Boosting [1] is the online boosting method of Oza and Russell [2] with the addition of the ADWIN algorithm as a change detector. If concept drift is detected, the worst member of the ensemble (based on the error estimation by ADWIN) is replaced by a new (empty) classifier.
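The change-detection mechanism can be looked at in isolation with River's drift.ADWIN detector. The sketch below is a minimal illustration of that component only (the synthetic stream with a mean shift is an assumption made for the example, and the drift_detected property assumes a recent River version); it is not the ensemble's internal bookkeeping.
import random
from river import drift
rng = random.Random(42)
adwin = drift.ADWIN()
# A stream whose mean shifts halfway through, which ADWIN should flag.
stream = [rng.gauss(0, 1) for _ in range(500)] + [rng.gauss(3, 1) for _ in range(500)]
for i, x in enumerate(stream):
    adwin.update(x)
    if adwin.drift_detected:
        print(f"Change detected at index {i}")
        break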
Parameters¶
- model
  Type → base.Classifier
  The classifier to boost.
- n_models
  Default → 10
  The number of models in the ensemble.
- seed
  Type → int | None
  Default → None
  Random number generator seed for reproducibility.
Attributes¶
- models
Examples¶
from river import datasets
from river import ensemble
from river import evaluate
from river import linear_model
from river import metrics
from river import preprocessing
dataset = datasets.Phishing()
model = ensemble.ADWINBoostingClassifier(
model=(
preprocessing.StandardScaler() |
linear_model.LogisticRegression()
),
n_models=3,
seed=42
)
metric = metrics.F1()
evaluate.progressive_val_score(dataset, model, metric)
F1: 87.61%
Methods¶
learn_one
Update the model with a set of features x and a label y.
Parameters
- x
- y
- kwargs
Returns
self
predict_one
Predict the label of a set of features x.
Parameters
- x → 'dict'
- kwargs
Returns
base.typing.ClfTarget | None: The predicted label.
predict_proba_one
Predict the probability of each label for a dictionary of features x.
Parameters
- x
- kwargs
Returns
A dictionary that associates a probability with each label.
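As a hedged per-sample illustration of the three methods above, the snippet below reuses the pipeline from the Examples section; the choice of 100 samples is arbitrary for the sketch, and the predictions depend on what the model has seen so far.
from river import datasets, ensemble, linear_model, preprocessing
model = ensemble.ADWINBoostingClassifier(
    model=preprocessing.StandardScaler() | linear_model.LogisticRegression(),
    n_models=3,
    seed=42,
)
for x, y in datasets.Phishing().take(100):
    y_proba = model.predict_proba_one(x)   # dict mapping each label to a probability
    y_pred = model.predict_one(x)          # the most likely label
    model.learn_one(x, y)                  # update the ensemble with the true label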
1. Albert Bifet, Geoff Holmes, Bernhard Pfahringer, Richard Kirkby, and Ricard Gavaldà. "New ensemble methods for evolving data streams." In: 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2009.
2. Oza, N., Russell, S. "Online bagging and boosting." In: Artificial Intelligence and Statistics 2001, pp. 105–112. Morgan Kaufmann, 2001.