Numeric attribute observer for classification tasks that is based on a Binary Search Tree.
This splitter cannot perform probability density estimations, so it does not work well when coupled with tree leaves that use Naive Bayes models.
Determine whether or not the splitter works with numerical features.
Check which kind of learning task the splitter is designed for. If True, the splitter works with classification trees; otherwise, it is designed for regression trees.
Get the best split suggestion given a criterion and the target's statistics.
- criterion (river.tree._split_criterion.base_split_criterion.SplitCriterion)
- pre_split_dist (Union[List, Dict])
- att_idx (Hashable)
- binary_only (bool)
Returns: SplitSuggestion, the suggestion of the best attribute split.
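To make the exhaustive search concrete, here is a simplified illustrative sketch (plain Python, not River's actual implementation) of the underlying idea: every midpoint between distinct observed values is evaluated as a candidate threshold, and the one minimizing weighted class impurity (Gini here, standing in for the `criterion` argument) is returned. All names are hypothetical:

```python
from collections import Counter

def gini(counts):
    """Gini impurity of a class-count mapping."""
    n = sum(counts.values())
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_threshold(pairs):
    """Exhaustively evaluate every midpoint between distinct sorted values
    (which a BST traversal yields implicitly) and return the
    (threshold, weighted_impurity) pair with the lowest impurity."""
    pairs = sorted(pairs)  # (value, label) pairs
    total = Counter(label for _, label in pairs)
    left = Counter()
    n = len(pairs)
    best = (None, float("inf"))
    for i in range(n - 1):
        value, label = pairs[i]
        left[label] += 1
        if pairs[i + 1][0] == value:
            continue  # only split between distinct values
        right = total - left
        n_left = i + 1
        w_imp = (n_left / n) * gini(left) + ((n - n_left) / n) * gini(right)
        if w_imp < best[1]:
            threshold = (value + pairs[i + 1][0]) / 2
            best = (threshold, w_imp)
    return best

obs = [(1.0, "a"), (1.5, "a"), (2.0, "a"), (4.0, "b"), (4.5, "b")]
print(best_threshold(obs))  # → (3.0, 0.0): threshold 3.0 separates the classes perfectly
```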
Return a fresh estimator with the same parameters.
The clone has the same parameters but has not been updated with any data. This works by looking at the parameters from the class signature. Each parameter is either:

- recursively cloned if it is a River class;
- deep-copied via copy.deepcopy if not.

If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
The underlying data structure used to monitor the input does not allow probability density estimations. Hence, it always returns zero for any given input.
- target_val (Union[bool, str, int])
Update the statistics of this observer given an attribute value, its target value, and the weight of the observed instance.
- att_val
- target_val (Union[bool, str, int, numbers.Number])
- sample_weight (float)
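A hypothetical minimal sketch of this update step (again plain Python, not River's code): each observed (value, target, weight) triple is routed down a binary search tree whose nodes accumulate weighted class counts for the instances falling at or below, and above, each node's value:

```python
from collections import defaultdict

class BSTNode:
    """One node of an illustrative numeric-attribute observer tree."""
    def __init__(self, value):
        self.value = value
        # Weighted class counts for instances with att_val <= value / > value.
        self.counts_le = defaultdict(float)
        self.counts_gt = defaultdict(float)
        self.left = None
        self.right = None

    def update(self, att_val, target_val, sample_weight):
        if att_val <= self.value:
            self.counts_le[target_val] += sample_weight
            if att_val != self.value:
                if self.left is None:
                    self.left = BSTNode(att_val)
                self.left.update(att_val, target_val, sample_weight)
        else:
            self.counts_gt[target_val] += sample_weight
            if self.right is None:
                self.right = BSTNode(att_val)
            self.right.update(att_val, target_val, sample_weight)

root = BSTNode(2.0)
for v, y, w in [(1.0, "a", 1.0), (2.0, "a", 1.0), (3.5, "b", 2.0)]:
    root.update(v, y, w)
print(dict(root.counts_le), dict(root.counts_gt))  # → {'a': 2.0} {'b': 2.0}
```

Note that each update only touches the nodes on one root-to-leaf path, which is what keeps per-instance maintenance cheap while still recording every distinct value for the exhaustive split search.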
Domingos, P. and Hulten, G., 2000, August. Mining high-speed data streams. In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 71-80).
Pfahringer, B., Holmes, G. and Kirkby, R., 2008, May. Handling numeric attributes in Hoeffding trees. In Pacific-Asia Conference on Knowledge Discovery and Data Mining (pp. 296-307). Springer, Berlin, Heidelberg.