StaticQuantizer

Quantization strategy originally used in Stochastic Gradient Trees (SGT) [1].

First, a buffer of warm_start observations is stored. The buffered data is then used to split the input feature's observed range into n_bins intervals. These intervals are replicated to every new quantizer. Feature values lying outside the limits defined by the initial buffer are mapped to the head or tail of the list of intervals.
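
The following is a minimal, hypothetical sketch of this strategy for a single feature, assuming equal-width intervals over the range observed during the warm-up phase. The class and method names are made up for illustration and do not match River's internal implementation.

```python
class SimpleStaticQuantizer:
    """Hypothetical sketch of a warm-start, fixed-interval quantizer."""

    def __init__(self, n_bins=64, warm_start=100):
        self.n_bins = n_bins
        self.warm_start = warm_start
        self._buffer = []
        self._origin = None  # lower limit seen during warm-up
        self._width = None   # width of each interval, fixed after warm-up

    def update(self, x):
        # While warming up, just store the incoming values.
        if self._width is None:
            self._buffer.append(x)
            if len(self._buffer) >= self.warm_start:
                lo, hi = min(self._buffer), max(self._buffer)
                self._origin = lo
                # Divide the observed range into n_bins equal-width intervals.
                self._width = (hi - lo) / self.n_bins or 1.0
                self._buffer = []  # the buffer is no longer needed

    def bin_index(self, x):
        if self._width is None:
            return None  # intervals not defined yet
        idx = int((x - self._origin) / self._width)
        # Values outside the warm-up range map to the head or tail interval.
        return min(max(idx, 0), self.n_bins - 1)
```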

Parameters

  • n_bins (int) – defaults to 64

    The number of bins (intervals) to divide the input feature.

  • warm_start (int) – defaults to 100

    The number of observations used to initialize the quantization intervals.

  • buckets (List) – defaults to None

This parameter is only used internally by the quantizer, so it must not be set manually. Once the intervals are defined, new instances of this quantizer receive the quantization information via this parameter.
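
As a usage sketch, the quantizer is typically passed to an SGT model rather than used directly. The snippet below assumes River's tree.SGTClassifier accepts a feature_quantizer parameter and that datasets.Phishing is available; the model clones and updates the quantizer internally for each numeric feature.

```python
from river import datasets, tree

model = tree.SGTClassifier(
    feature_quantizer=tree.splitter.StaticQuantizer(n_bins=32, warm_start=100),
)

for x, y in datasets.Phishing():
    model.predict_one(x)  # test-then-train evaluation
    model.learn_one(x, y)
```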

Methods

clone

Return a fresh estimator with the same parameters.

The clone has the same parameters but has not been updated with any data. This works by inspecting the parameters from the class signature. Each parameter is either recursively cloned, if it is a River class, or deep-copied via copy.deepcopy otherwise. If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
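
A short usage note, assuming River's standard clone() behaviour on estimators:

```python
from river import tree

q = tree.splitter.StaticQuantizer(n_bins=32, warm_start=50)
q_new = q.clone()

# The clone shares the input parameters but none of the observed data.
assert q_new.n_bins == q.n_bins and q_new.warm_start == q.warm_start
```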

update

Update the quantizer with a new observation.

References


  1. Gouk, H., Pfahringer, B., & Frank, E. (2019, October). Stochastic Gradient Trees. In Asian Conference on Machine Learning (pp. 1094-1109).