DynamicQuantizer

Adapted version of the Quantizer Observer (QO) [1] that is applied to Stochastic Gradient Trees (SGT).

This feature quantizer initially partitions the inputs using the user-supplied radius value. As more splits are created in the SGTs, each newly instantiated feature quantizer uses std * std_prop as its quantization radius. In this expression, std is the standard deviation of the input data, computed incrementally.
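As a rough illustration of the idea (not the library's internal bookkeeping), the radius of a subsequently created quantizer can be derived from an incrementally maintained variance estimate, for instance with river.stats.Var:

```python
from river import stats

# Incrementally track the variance of the values seen so far for one feature
# (illustrative only; the statistics kept inside the SGT may differ).
var = stats.Var()
for x in [1.2, 0.7, 3.4, 2.1, 0.9]:
    var.update(x)

std = var.get() ** 0.5       # incremental standard deviation
std_prop = 0.25              # default value of the std_prop parameter
new_radius = std * std_prop  # radius used by newly created quantizers
print(new_radius)
```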

Parameters

  • radius (float) – defaults to 0.5

    The initial quantization radius.

  • std_prop (float) – defaults to 0.25

    The proportion of the standard deviation that is going to be used to define the radius value for new quantizer instances following the initial one.
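As a usage sketch showing how these two parameters are typically supplied (assuming the quantizer lives under river.tree.splitter and that SGTRegressor exposes a feature_quantizer parameter, as in recent River releases):

```python
from river import datasets, metrics, tree

# Illustrative end-to-end setup: an SGT regressor whose numeric features
# are discretized with a DynamicQuantizer.
model = tree.SGTRegressor(
    feature_quantizer=tree.splitter.DynamicQuantizer(radius=0.5, std_prop=0.25),
)
metric = metrics.MAE()

for x, y in datasets.TrumpApproval():
    y_pred = model.predict_one(x)  # predict before learning (prequential evaluation)
    metric.update(y, y_pred)
    model.learn_one(x, y)

print(metric)
```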

Methods

clone

Return a fresh estimator with the same parameters.

The clone has the same parameters but has not been updated with any data.

This works by looking at the parameters from the class signature. Each parameter is either

  • recursively cloned if it is a River class.
  • deep-copied via copy.deepcopy if not.

If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
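A minimal sketch of what clone guarantees (constructor parameters carried over, no learned state), assuming the import path river.tree.splitter and that the parameters are stored as attributes of the same name, as is the River convention:

```python
from river import tree

quantizer = tree.splitter.DynamicQuantizer(radius=1.0, std_prop=0.3)
fresh = quantizer.clone()

# The clone carries the same constructor parameters...
assert fresh.radius == quantizer.radius
assert fresh.std_prop == quantizer.std_prop
# ...but is a distinct, unfitted object.
assert fresh is not quantizer
```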

update

References


  1. Mastelini, S.M. and de Leon Ferreira, A.C.P., 2021. Using dynamical quantization to perform split attempts in online tree regressors. Pattern Recognition Letters.