lenskit.stochastic#
Components for generating stochastic outputs in LensKit pipelines.
- class lenskit.stochastic.StochasticTopNConfig(n=None, rng=None, transform='softmax', scale=1.0)#
Bases: object
Configuration for StochasticTopNRanker.
- Parameters:
- rng: int | Sequence[int] | None | Literal['user'] | tuple[int | Sequence[int] | None, Literal['user']] = None#
Random number generator configuration.
- transform: Literal['softmax', 'linear'] | None = 'softmax'#
Transformation to convert scores into ranking probabilities; a sketch comparing the options follows this list.
- softmax
Use the softmax of the item scores as the selection probabilities.
- linear
Linearly re-scale item scores to be selection probabilities. This is equivalent to min-max scaling the scores, then re-scaling them to sum to 1.
- None
No transformation, except that negative scores are clamped to (almost) zero. Not recommended unless your item scorer emits multinomial probabilities.
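To make the two transforms concrete, here is a minimal NumPy sketch of how each option turns raw scores into selection probabilities. The function names and example scores are illustrative only; this is not the LensKit implementation.

```python
import numpy as np

def softmax_probs(scores: np.ndarray) -> np.ndarray:
    """Softmax of the scores (shifted by the max for numerical stability)."""
    exps = np.exp(scores - scores.max())
    return exps / exps.sum()

def linear_probs(scores: np.ndarray) -> np.ndarray:
    """Min-max scale the scores to [0, 1], then re-scale them to sum to 1.

    Assumes the scores are not all equal (otherwise the min-max step divides by zero).
    """
    scaled = (scores - scores.min()) / (scores.max() - scores.min())
    return scaled / scaled.sum()

scores = np.array([2.0, 0.5, -1.0])
print(softmax_probs(scores))  # approximately [0.786, 0.175, 0.039]
print(linear_probs(scores))   # [0.667, 0.333, 0.0] (rounded)
```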
- class lenskit.stochastic.StochasticTopNRanker(config=None, **kwargs)#
Stochastic top-N ranking with optional weight transformation.
This uses the exponential sampling method, a more efficient approximation of Plackett-Luce sampling than even the Gumbel trick, as documented by Tim Vieira. It expects a scored list of input items and samples n items, with selection probabilities proportional to their scores. Scores can optionally be rescaled (inverse temperature) and transformed (e.g. softmax). A sketch of the sampling method follows the note below, and a usage sketch closes this entry.
Note
When no transformation is used, negative scores are still clamped to (approximately) zero.
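For intuition, the exponential sampling trick can be sketched in a few lines of NumPy: draw an independent Exponential(1) variate for each item, divide it by that item's selection weight, and keep the n items with the smallest resulting keys. This is a sketch of the general technique, with made-up names and weights; it is not LensKit's internal code.

```python
import numpy as np

def exponential_sample(weights: np.ndarray, n: int, rng: np.random.Generator) -> np.ndarray:
    """Sample n item indices without replacement, with Plackett-Luce
    probabilities proportional to the non-negative weights."""
    keys = rng.exponential(size=len(weights)) / weights
    # The n smallest keys, in ascending order, form the sampled ranking.
    return np.argsort(keys)[:n]

rng = np.random.default_rng(42)
weights = np.array([0.6, 0.3, 0.1])
print(exponential_sample(weights, 2, rng))  # indices of the sampled top-2 ranking
```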
- Stability:
- Caller (see Stability Levels).
- Parameters:
config (StochasticTopNConfig)
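Putting the two classes together, the following sketch configures and instantiates the ranker. The constructor arguments follow the signatures shown above; the final call assumes the component is invoked on a scored ItemList (from lenskit.data), as the description indicates, so the exact call signature may differ in your LensKit version.

```python
from lenskit.data import ItemList
from lenskit.stochastic import StochasticTopNConfig, StochasticTopNRanker

# Softmax-based stochastic ranking with a fixed seed for reproducibility.
config = StochasticTopNConfig(n=10, rng=42, transform="softmax", scale=1.0)
ranker = StochasticTopNRanker(config)

# Assumed usage: rank a scored item list (item IDs and scores are illustrative).
items = ItemList(item_ids=[101, 102, 103], scores=[2.0, 0.5, -1.0])
ranked = ranker(items=items)
```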