MinMaxScaler¶
Scales the data to a fixed range, from 0 to 1. Under the hood, a running minimum and a running peak-to-peak (max - min) are maintained for each feature.
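The mechanism can be sketched in plain Python. This is an illustrative toy, not the river implementation (which delegates to `stats.Min` and `stats.Max`): each feature keeps a running min and max, and a value is scaled as `(x - min) / (max - min)`.

```python
class SimpleMinMaxScaler:
    """Toy online min-max scaler (illustration only, not river's code)."""

    def __init__(self):
        self.min = {}
        self.max = {}

    def learn_one(self, x):
        # Update the running min and max of each feature.
        for feature, value in x.items():
            self.min[feature] = min(self.min.get(feature, value), value)
            self.max[feature] = max(self.max.get(feature, value), value)
        return self

    def transform_one(self, x):
        out = {}
        for feature, value in x.items():
            ptp = self.max[feature] - self.min[feature]  # peak-to-peak
            # Guard against a zero range, which occurs until two
            # distinct values have been seen.
            out[feature] = (value - self.min[feature]) / ptp if ptp > 0 else 0.0
        return out

scaler = SimpleMinMaxScaler()
scaler.learn_one({'x': 8.0}).learn_one({'x': 12.0})
print(scaler.transform_one({'x': 9.0}))  # {'x': 0.25}
```

Because the min and max are updated online, early outputs can shift meaning as new extremes arrive, which is why the first few scaled values in the example below sit at the boundaries 0 and 1.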
Attributes¶
- min (dict)
  Mapping between features and instances of stats.Min.
- max (dict)
  Mapping between features and instances of stats.Max.
Examples¶
>>> import random
>>> from river import preprocessing
>>> random.seed(42)
>>> X = [{'x': random.uniform(8, 12)} for _ in range(5)]
>>> for x in X:
... print(x)
{'x': 10.557707}
{'x': 8.100043}
{'x': 9.100117}
{'x': 8.892842}
{'x': 10.945884}
>>> scaler = preprocessing.MinMaxScaler()
>>> for x in X:
... print(scaler.learn_one(x).transform_one(x))
{'x': 0.0}
{'x': 0.0}
{'x': 0.406920}
{'x': 0.322582}
{'x': 1.0}
Methods¶
learn_one
Update with a set of features x.
Many transformers are stateless and don't need to do anything during the learn_one step, so the default behavior of this method is to do nothing. Transformers that do learn something during learn_one override this method.
Parameters
- x (dict)
Returns
Transformer: self
transform_one
Transform a set of features x.
Parameters
- x (dict)
Returns
dict: The transformed values.