FunkMF¶
Funk Matrix Factorization for recommender systems.
The model equation is defined as:

\[\hat{y}(x) = \langle \mathbf{v}_u, \mathbf{v}_i \rangle = \sum_{f=1}^{k} \mathbf{v}_{u, f} \cdot \mathbf{v}_{i, f}\]

where \(k\) is the number of latent factors.
This model expects a dict input with a user and an item entry, without any type constraint on their values (i.e. they can be strings or numbers). Other entries are ignored.
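As a rough illustration (this is not River's internal code), the predicted rating is simply the dot product of the user's and the item's latent vectors:

>>> # Hypothetical helper, for illustration only: the FunkMF score of a
>>> # (user, item) pair is the dot product of their k-dimensional latent vectors.
>>> def funk_mf_score(user_latents, item_latents):
...     return sum(u_f * i_f for u_f, i_f in zip(user_latents, item_latents))

>>> round(funk_mf_score([0.1, -0.2, 0.3], [0.5, 0.4, -0.1]), 2)  # k = 3 factors
-0.06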
Parameters¶
- n_factors – defaults to 10
  Dimensionality of the factorization or number of latent factors.
- optimizer (optim.base.Optimizer) – defaults to None
  The sequential optimizer used for updating the latent factors.
- loss (optim.base.Loss) – defaults to None
  The loss function to optimize for.
- l2 – defaults to 0.0
  Amount of L2 regularization used to push weights towards 0.
- initializer (optim.base.Initializer) – defaults to None
  Latent factor initialization scheme.
- clip_gradient – defaults to 1e+12
  Clips the absolute value of each gradient value.
- seed – defaults to None
  Random number generation seed. Set this for reproducibility.
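For instance, these parameters can be set explicitly when constructing the model. The values below are arbitrary, and optim.losses.Squared is assumed to be the squared loss available in river.optim:

>>> from river import optim
>>> from river import reco

>>> model = reco.FunkMF(
...     n_factors=20,
...     optimizer=optim.SGD(0.05),
...     loss=optim.losses.Squared(),
...     l2=0.01,
...     seed=42,
... )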
Attributes¶
- u_latents (collections.defaultdict)
  The user latent vectors, randomly initialized.
- i_latents (collections.defaultdict)
  The item latent vectors, randomly initialized.
- u_optimizer (optim.base.Optimizer)
  The sequential optimizer used for updating the user latent weights.
- i_optimizer (optim.base.Optimizer)
  The sequential optimizer used for updating the item latent weights.
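Once the model has been trained (see the Examples section below), the learned factors can be looked up through these attributes. This is a sketch only; each stored vector is assumed to hold n_factors values:

>>> # Hedged sketch: retrieve the learned latent vectors for a seen user and item.
>>> alice_factors = model.u_latents['Alice']        # user latent vector
>>> superman_factors = model.i_latents['Superman']  # item latent vector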
Examples¶
>>> from river import optim
>>> from river import reco
>>> dataset = (
... ({'user': 'Alice', 'item': 'Superman'}, 8),
... ({'user': 'Alice', 'item': 'Terminator'}, 9),
... ({'user': 'Alice', 'item': 'Star Wars'}, 8),
... ({'user': 'Alice', 'item': 'Notting Hill'}, 2),
... ({'user': 'Alice', 'item': 'Harry Potter'}, 5),
... ({'user': 'Bob', 'item': 'Superman'}, 8),
... ({'user': 'Bob', 'item': 'Terminator'}, 9),
... ({'user': 'Bob', 'item': 'Star Wars'}, 8),
... ({'user': 'Bob', 'item': 'Notting Hill'}, 2)
... )
>>> model = reco.FunkMF(
... n_factors=10,
... optimizer=optim.SGD(0.1),
... initializer=optim.initializers.Normal(mu=0., sigma=0.1, seed=11),
... )
>>> for x, y in dataset:
... _ = model.learn_one(**x, y=y)
>>> model.predict_one(user='Bob', item='Harry Potter')
1.866272
Methods¶
clone
Return a fresh estimator with the same parameters.
The clone has the same parameters but has not been updated with any data. This works by looking at the parameters from the class signature. Each parameter is either recursively cloned if it is a River class, or deep-copied via copy.deepcopy otherwise. If the calling object is stochastic (i.e. it accepts a seed parameter) and has not been seeded, then the clone will not be idempotent. Indeed, this method's purpose is simply to return a new instance with the same input parameters.
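For example, cloning the model fitted in the Examples section above yields an untrained copy configured with the same parameters:

>>> # Hedged sketch: the clone keeps the parameters but none of the learned latents.
>>> fresh = model.clone()
>>> fresh.n_factors
10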
learn_one
Fits a user-item pair and a real-valued target y.
Parameters
- user (Union[str, int])
- item (Union[str, int])
- y (Union[numbers.Number, bool])
- x (dict) – defaults to None
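Continuing the example above, a single incremental update can be made with explicit keyword arguments:

>>> # Hedged sketch: feed one additional (user, item, rating) observation.
>>> _ = model.learn_one(user='Alice', item='Gladiator', y=7)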
predict_one
Predicts the target value of a set of features x.
Parameters
- user (Union[str, int])
- item (Union[str, int])
- x (dict) – defaults to None
Returns
typing.Union[numbers.Number, bool]: The predicted preference from the user for the item.
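Continuing the example above, a prediction can be requested for any user-item pair:

>>> # Hedged sketch: the exact score depends on the training above.
>>> score = model.predict_one(user='Alice', item='Superman')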
rank
Rank items by decreasing order of preference for a given user.
Parameters
- user (Union[str, int])
- items (Set[Union[str, int]])
- x (dict) – defaults to None
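Continuing the example above, a set of candidate items can be ranked for a user, most preferred first:

>>> # Hedged sketch: the resulting order depends on the learned factors.
>>> ranking = model.rank(user='Bob', items={'Superman', 'Harry Potter', 'Notting Hill'})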