SoftmaxRegression

Softmax regression is a generalization of logistic regression to multiple classes.

Softmax regression is also known as "multinomial logistic regression". There is one set of weights per class, hence the weights attribute is a nested collections.defaultdict. The main advantage of using this instead of one-vs-all logistic regression is that the probabilities are calibrated. Moreover, softmax regression is more robust to outliers.
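
As a sketch of the underlying model (the notation is illustrative, not the library's internal naming): given a feature vector x and one weight vector w_k per class k, the predicted probability of class k takes the usual softmax form

p(y = k \mid x) = \frac{\exp(w_k \cdot x)}{\sum_j \exp(w_j \cdot x)}

so the scores of all classes are normalized jointly, which is what makes the resulting probabilities calibrated.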

Parameters

  • optimizer

    Type: optim.base.Optimizer | None

    Default: None

    The sequential optimizer used to tune the weights.

  • loss

    Type: optim.losses.MultiClassLoss | None

    Default: None

    The loss function to optimize for.

  • l2

    Default: 0

    Amount of L2 regularization used to push weights towards 0.
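
All three parameters can also be passed explicitly. A minimal construction sketch, assuming river's optim.SGD optimizer and optim.losses.CrossEntropy loss; the learning rate and l2 strength below are arbitrary illustration values, not recommendations:

from river import linear_model
from river import optim

model = linear_model.SoftmaxRegression(
    optimizer=optim.SGD(0.01),         # sequential optimizer for the weights
    loss=optim.losses.CrossEntropy(),  # a MultiClassLoss
    l2=0.001,                          # pushes weights towards 0
)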

Attributes

  • weights (collections.defaultdict) — one set of feature weights per class, stored as a nested defaultdict.

Examples

from river import datasets
from river import evaluate
from river import linear_model
from river import metrics
from river import optim
from river import preprocessing

dataset = datasets.ImageSegments()

model = preprocessing.StandardScaler()
model |= linear_model.SoftmaxRegression()

metric = metrics.MacroF1()

evaluate.progressive_val_score(dataset, model, metric)
MacroF1: 81.88%
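
Since progressive_val_score trains the model in place, the pipeline above can then be queried on a single observation. A sketch, reusing the first sample of the same dataset:

x, y = next(iter(dataset))
model.predict_one(x)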

Methods

learn_one

Update the model with a set of features x and a label y.

Parameters

  • x ('dict')
  • y ('base.typing.ClfTarget')

Returns

Classifier: self
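
A minimal online-learning sketch, assuming a stream of (x, y) pairs such as the dataset in the example above (the scaling step is skipped for brevity):

clf = linear_model.SoftmaxRegression()
for x, y in dataset:
    clf.learn_one(x, y)  # one incremental update per observation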

predict_one

Predict the label of a set of features x.

Parameters

  • x ('dict')
  • kwargs

Returns

base.typing.ClfTarget | None: The predicted label.

predict_proba_one

Predict the probability of each label for a dictionary of features x.

Parameters

  • x ('dict')

Returns

dict[base.typing.ClfTarget, float]: A dictionary that associates a probability with each label.
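
Continuing the sketch from the examples section (the exact numbers depend on what the model has seen so far):

probas = model.predict_proba_one(x)
sum(probas.values())  # the calibrated probabilities sum to 1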