lenskit.als#
LensKit ALS implementations.
- class lenskit.als.ALSBase(config=None, **kwargs)#
Bases: UsesTrainer, Component[ItemList, …], ABC

Base class for ALS models.
- Stability:
- Caller (see Stability Levels).
- Parameters:
  - config (ALSConfig)
  - kwargs (Any)
- finalize_scores(user_num, items, user_bias)#
Perform any final transformation of scores prior to returning them.
- class lenskit.als.ALSConfig(*, embedding_size=50, epochs=10, regularization=0.1, user_embeddings=True)#
Bases: EmbeddingSizeMixin, BaseModel

Configuration for ALS scorers.
- Parameters:
- model_config: ClassVar[ConfigDict] = {}#
Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- embedding_size: PositiveInt#
The dimension of user and item embeddings (number of latent features to learn).
- epochs: PositiveInt#
The number of epochs to train.
- user_embeddings: bool | Literal['prefer']#
Whether to retain user embeddings after training. If True, they are retained, but are ignored if the query has historical items; if False, they are not. If set to "prefer", then the user embeddings from training time are used even if the query has a user history. This makes inference faster when histories only consist of the user’s items from the training set.
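A minimal configuration sketch (illustrative values only; the option names come from the signature above, and in practice you would usually construct a concrete subclass such as BiasedMFConfig or ImplicitMFConfig):

```python
from lenskit.als import ALSConfig

# Values mirror the defaults in the signature above, except user_embeddings,
# which is set to "prefer" so that training-time embeddings are used even
# when the query carries a user history.
cfg = ALSConfig(
    embedding_size=50,
    epochs=10,
    regularization=0.1,
    user_embeddings="prefer",
)
```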
- class lenskit.als.BiasedMFScorer(config=None, **kwargs)#
Bases: ALSBase

Biased matrix factorization trained with alternating least squares [ZWSP08]. This is a prediction-oriented algorithm suitable for explicit feedback data, using the alternating least squares approach to compute \(P\) and \(Q\) to minimize the regularized squared reconstruction error of the ratings matrix.
See the base class ALSBase for documentation on the estimated parameters you can extract from a trained model. See BiasedMFConfig and ALSConfig for the configuration options for this component.
- Stability:
- Caller (see Stability Levels).
- Parameters:
  - config (BiasedMFConfig)
  - kwargs (Any)
- create_trainer(data, options)#
Create a model trainer to train this model.
- finalize_scores(user_num, items, user_bias)#
Perform any final transformation of scores prior to returning them.
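As a usage sketch (hypothetical values; only option names documented on this page are used), the scorer can be constructed from a BiasedMFConfig:

```python
from lenskit.als import BiasedMFConfig, BiasedMFScorer

# Option names come from the BiasedMFConfig / ALSConfig signatures on this
# page; the specific values are illustrative only.
config = BiasedMFConfig(embedding_size=64, epochs=20, regularization=0.1, damping=5.0)
scorer = BiasedMFScorer(config)
```

The scorer is then typically wired into a LensKit pipeline and trained on a dataset before it is asked to score items for a query.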
- class lenskit.als.BiasedMFConfig(*, embedding_size=50, epochs=10, regularization=0.1, user_embeddings=True, damping=5.0)#
Bases: ALSConfig
- Parameters:
- model_config: ClassVar[ConfigDict] = {}#
Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- damping: Damping#
Damping for the bias model.
- class lenskit.als.ImplicitMFScorer(config=None, **kwargs)#
Bases: ALSBase

Implicit matrix factorization trained with alternating least squares [HKV08]. This algorithm outputs ‘predictions’, but they are not on a meaningful scale. If its input data contains rating values, these will be used as the ‘confidence’ values; otherwise, confidence will be 1 for every rated item.

With weight \(w\), this function decomposes the matrix \(\mathbb{1}^* + Rw\), where \(\mathbb{1}^*\) is an \(m \times n\) matrix of all 1s.
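Read entrywise (an interpretation of the formula above, following the confidence weighting of [HKV08]), each user–item pair receives a confidence value of

\[
  c_{ui} = 1 + w\, r_{ui},
\]

so unobserved pairs have confidence 1 and observed pairs have their confidence increased in proportion to the rating or interaction value.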
See the base class ALSBase for documentation on the estimated parameters you can extract from a trained model. See ImplicitMFConfig and ALSConfig for the configuration options for this component.

Changed in version 2025.1: ImplicitMFScorer no longer supports multiple training methods. It always uses Cholesky decomposition now.

Changed in version 0.14: By default, ImplicitMF ignores a rating column if one is present in the training data. This can be changed through the use_ratings option.

Changed in version 0.13: In versions prior to 0.13, ImplicitMF used the rating column if it was present. In 0.13, we added an option to control whether or not the rating column is used; it initially defaulted to True, but with a warning. In 0.14 it defaults to False.

- Stability:
- Caller (see Stability Levels).
- Parameters:
  - config (ImplicitMFConfig)
  - kwargs (Any)
- create_trainer(data, options)#
Create a model trainer to train this model.
- class lenskit.als.ImplicitMFConfig(*, embedding_size=50, epochs=10, regularization=0.1, user_embeddings=True, weight=40, use_ratings=False)#
Bases: ALSConfig
- Parameters:
- model_config: ClassVar[ConfigDict] = {}#
Configuration for the model; it should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
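For the implicit-feedback model, a similar construction sketch (illustrative values; weight and use_ratings appear in the ImplicitMFConfig signature above):

```python
from lenskit.als import ImplicitMFConfig, ImplicitMFScorer

# use_ratings=True treats an available rating column as confidence values
# (see the version notes under ImplicitMFScorer); weight is the w in the
# decomposition described above.
config = ImplicitMFConfig(embedding_size=50, epochs=10, weight=40, use_ratings=True)
scorer = ImplicitMFScorer(config)
```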