lenskit.metrics.ranking.Precision
=================================

.. py:class:: lenskit.metrics.ranking.Precision(n = None, *, k = None)
   :canonical: lenskit.metrics.ranking._pr.Precision

   Bases: :py:obj:`lenskit.metrics.ranking._base.ListMetric`, :py:obj:`lenskit.metrics.ranking._base.RankingMetricBase`

   Compute recommendation precision.  This is computed as:

   .. math::
      \frac{|L \cap I_u^{\mathrm{test}}|}{|L|}

   In the uncommon case that ``k`` is specified and ``len(recs) < k``, this
   metric uses ``len(recs)`` as the denominator.

   :Stability: Caller

   .. py:property:: label

      The metric's default label in output.

      The base implementation returns the class name by default.

   .. py:method:: measure_list(recs, test)

      Compute measurements for a single list.

      :returns: - A float for simple metrics
                - Intermediate data for decomposed metrics
                - A dict mapping metric names to values for multi-metric classes
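The formula above can be illustrated with a small standalone sketch. This is *not* the LensKit implementation (which operates on LensKit's own list and test-data types); it is a plain-Python illustration of the precision computation, including the short-list denominator behavior described above. The function name and signature here are hypothetical.

```python
def precision(recs, test, k=None):
    """Fraction of recommended items that appear in the user's test items.

    Illustrative sketch of |L ∩ I_u^test| / |L|.  If ``k`` is given, the
    list is truncated to its first k items; when the list is shorter than
    k, ``len(recs)`` is used as the denominator, matching the behavior
    documented above.
    """
    if k is not None:
        recs = recs[:k]
    if not recs:
        return 0.0  # assumption: an empty list scores 0
    test_items = set(test)
    hits = sum(1 for item in recs if item in test_items)
    return hits / len(recs)
```

For example, recommending ``[1, 2, 3, 4]`` when the test items are ``{2, 4, 6}`` gives a precision of 2/4 = 0.5.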