lenskit.knn.slim#

Sparse LInear Methods for Recommendation [NK11].

Classes#

SLIMConfig

Hyperparameters for SLIM/fsSLIM (SLIMScorer).

SLIMScorer

Item scorer using Sparse LInear Methods (SLIM).

Module Contents#

class lenskit.knn.slim.SLIMConfig#

Bases: pydantic.BaseModel

Hyperparameters for SLIM/fsSLIM (SLIMScorer).

l1_reg: pydantic.PositiveFloat = 1.0#

L₁ regularization strength for SLIM.

l2_reg: pydantic.PositiveFloat = 1.0#

L₂ regularization strength for SLIM.

max_iters: pydantic.PositiveInt = 100#

Maximum coordinate descent iterations per column.

max_nbrs: pydantic.PositiveInt | None = None#

Maximum neighbors (features) for each item. Setting this to a positive integer enables fsSLIM (with item cosine selection).
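
As a sketch of how these hyperparameters fit together (using only the fields documented above), a plain SLIM configuration differs from an fsSLIM one only in whether max_nbrs is set:

```python
from lenskit.knn.slim import SLIMConfig

# Plain SLIM with stronger L1 pressure for sparser weights:
slim_cfg = SLIMConfig(l1_reg=2.0, l2_reg=1.0, max_iters=100)

# fsSLIM: restrict each item's regression to its 50 most
# cosine-similar items before fitting (max_nbrs enables this).
fsslim_cfg = SLIMConfig(l1_reg=2.0, l2_reg=1.0, max_nbrs=50)
```

The specific values here (2.0, 50) are illustrative, not recommendations; regularization strengths generally need tuning per dataset.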

class lenskit.knn.slim.SLIMScorer(config=None, **kwargs)#

Bases: lenskit.pipeline.components.Component, lenskit.training.Trainable

Item scorer using Sparse LInear Methods (SLIM). SLIM was described for recommendation by Ning and Karypis [NK11]. This scorer supports both SLIM and fsSLIM variants (by setting SLIMConfig.max_nbrs).

SLIM operates by learning an elastic-net regression for each item, predicting whether the user will interact with that item based on the other items they have interacted with.

This implementation closely follows the paper, with some reference to libslim for computational details. It uses coordinate descent with soft thresholding [FHT10] to estimate the SLIM weight matrix.
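
The per-item update can be sketched in plain NumPy. This is an illustrative reimplementation, not the library's code: it fits one column of the weight matrix by cyclic coordinate descent with the soft-thresholding update of Friedman et al., plus SLIM's zero-diagonal and nonnegativity constraints, on a small dense interaction matrix (and, like this scorer, without centering the predictors):

```python
import numpy as np

def soft_threshold(z, l1):
    """Soft-thresholding operator S(z, l1) = sign(z) * max(|z| - l1, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - l1, 0.0)

def slim_column(A, j, l1=1.0, l2=1.0, max_iters=100, tol=1e-6):
    """Estimate column j of a SLIM weight matrix by coordinate descent.

    A is a dense (n_users, n_items) interaction matrix; a real
    implementation works on sparse data. The self-weight w[j] is
    pinned to zero and all weights are kept nonnegative, as in SLIM.
    """
    n_users, n_items = A.shape
    w = np.zeros(n_items)
    y = A[:, j]
    resid = y - A @ w           # full residual; equals y initially
    col_sq = (A ** 2).sum(axis=0)
    for _ in range(max_iters):
        max_delta = 0.0
        for k in range(n_items):
            if k == j:
                continue        # zero-diagonal constraint
            # correlation with the partial residual excluding coordinate k
            rho = A[:, k] @ resid + col_sq[k] * w[k]
            # elastic-net coordinate update with soft thresholding,
            # clamped to enforce SLIM's nonnegativity
            w_new = max(soft_threshold(rho, l1) / (col_sq[k] + l2), 0.0)
            delta = w_new - w[k]
            if delta != 0.0:
                resid -= delta * A[:, k]
                w[k] = w_new
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            break
    return w
```

The inner update is the standard elastic-net coordinate step: the L₁ penalty enters through the soft-threshold, the L₂ penalty through the denominator, matching the roles of l1_reg and l2_reg above.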

Note

The coordinate descent method of Friedman et al. [FHT10] operates on standardized predictor variables, with a hint that standardization can be inlined to the optimization logic for sparse problems. However, they do not provide details for how to do that inlining, and the original SLIM implementation from the Karypis lab does not appear to center the predictors. Therefore, this implementation also does not center them.

Parameters:
  • config (object | None)

  • kwargs (Any)

config: SLIMConfig#

The component configuration object. Component classes that support configuration must redefine this attribute with their specific configuration class type, which can be a Python dataclass or a Pydantic model class.

weights: scipy.sparse.csr_array#

The learned sparse item-item weight matrix.

items: lenskit.data.Vocabulary#

The vocabulary of items seen during training.

is_trained()#

Query if this component has already been trained.

Return type:

bool

train(data, options)#

Train the model to learn its parameters from a training dataset.

Parameters:
  • data (lenskit.data.Dataset)

  • options (lenskit.training.TrainingOptions)

__call__(query, items)#

Run the pipeline’s operation and produce a result. This is the key method for components to implement.

Parameters:
  • query (lenskit.data.RecQuery)

  • items (lenskit.data.ItemList)

Return type:

lenskit.data.ItemList
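
Once the weights are learned, scoring reduces to a sparse product: a user's score for item j is the sum of the learned weights of the items they have interacted with. A minimal sketch with SciPy (the weight layout mirrors the csr_array attribute documented above; score_items itself is an illustrative helper, not part of the API):

```python
import numpy as np
from scipy.sparse import csr_array

def score_items(user_items, weights, candidates):
    """Score candidate items for a user under a SLIM-style weight matrix.

    user_items: indices of items the user has interacted with
    weights: (n_items, n_items) sparse weight matrix
    candidates: item indices to score
    """
    n_items = weights.shape[0]
    r = np.zeros(n_items)
    r[list(user_items)] = 1.0       # binary interaction vector
    scores = weights.T @ r          # scores[j] = sum_k weights[k, j] * r[k]
    return scores[candidates]

# toy weight matrix: interacting with item 0 strongly predicts item 1
W = csr_array(np.array([
    [0.0, 0.8, 0.1],
    [0.7, 0.0, 0.0],
    [0.0, 0.2, 0.0],
]))
scores = score_items({0, 2}, W, [1, 2])
# item 1 scores 0.8 + 0.2 = 1.0; item 2 scores 0.1
```

In the actual component, the candidate items and user history come from the query and items arguments, and the result is returned as an ItemList of scores.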

Exported Aliases#

class lenskit.knn.slim.Dataset#

Re-exported alias for lenskit.data.Dataset.

class lenskit.knn.slim.ItemList#

Re-exported alias for lenskit.data.ItemList.

class lenskit.knn.slim.RecQuery#

Re-exported alias for lenskit.data.RecQuery.

class lenskit.knn.slim.Vocabulary#

Re-exported alias for lenskit.data.Vocabulary.

class lenskit.knn.slim.SparseRowArray#

Re-exported alias for lenskit.data.matrix.SparseRowArray.

exception lenskit.knn.slim.DataWarning#

Re-exported alias for lenskit.diagnostics.DataWarning.

lenskit.knn.slim.get_logger()#

Re-exported alias for lenskit.logging.get_logger().

lenskit.knn.slim.item_progress()#

Re-exported alias for lenskit.logging.item_progress().

lenskit.knn.slim.ensure_parallel_init()#

Re-exported alias for lenskit.parallel.config.ensure_parallel_init().

class lenskit.knn.slim.Component#

Re-exported alias for lenskit.pipeline.components.Component.

class lenskit.knn.slim.Trainable#

Re-exported alias for lenskit.training.Trainable.

class lenskit.knn.slim.TrainingOptions#

Re-exported alias for lenskit.training.TrainingOptions.