lenskit.tuning#

Tune parameters using Ray Tune.

Submodules#

lenskit.tuning.spec

Classes#

PipelineTuner

Set up and run a hyperparameter tuning job for a pipeline.

Package Contents#

class lenskit.tuning.PipelineTuner(spec, out_dir=None, rng=None)#

Set up and run a hyperparameter tuning job for a pipeline.

Stability:

Experimental

settings: lenskit.config.TuneSettings#
spec: lenskit.tuning.spec.TuningSpec#
out_dir: pathlib.Path#
pipe_name: str | None#
random_seed: numpy.random.SeedSequence#
iterative: bool#
data: lenskit.splitting.TTSplit[lenskit.data.GenericKey]#
harness: Any#
tuner: ray.tune.Tuner#

The Ray tuner used to run the search. Not available until setup() has been called.

results: ray.tune.ResultGrid#

Ray tuning results. Only available after run() has been called.

log#
property mode#
set_data(train, test, *, name=None)#

Set the data to be used for tuning.

Parameters:

train – the training data.

test – the test data used for validation measurement.

name (str | None) – an optional name for the pipeline.
setup()#

Set up the tuning job. After this method completes, the tuner is ready to run.

run()#

Run the tuning job.

Saves the results in the results attribute and also returns them.

Return type:

ray.tune.ResultGrid
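The methods above imply a fixed lifecycle: provide data with set_data(), build the tuner with setup(), then launch trials with run(), which both stores and returns the results. The ordering can be sketched with a stdlib-only stand-in — this toy class and all of its names are illustrative assumptions, not the real PipelineTuner implementation:

```python
# Illustrative stand-in for the tuner lifecycle; NOT the real PipelineTuner.
# All names and behavior here are assumptions for demonstration only.
class ToyTuner:
    def __init__(self, spec):
        self.spec = spec
        self.data = None
        self.tuner = None    # stands in for the ray.tune.Tuner
        self.results = None  # stands in for the ray.tune.ResultGrid

    def set_data(self, train, test, *, name=None):
        self.data = (train, test)

    def setup(self):
        if self.data is None:
            raise RuntimeError("call set_data() before setup()")
        self.tuner = object()  # real code would build a ray.tune.Tuner here

    def run(self):
        if self.tuner is None:
            raise RuntimeError("call setup() before run()")
        # real code would launch Ray trials; here we fake one result per point
        self.results = [{"config": c, "score": i} for i, c in enumerate(self.spec)]
        return self.results  # stored in self.results *and* returned


toy = ToyTuner(spec=[{"reg": 0.1}, {"reg": 1.0}])
toy.set_data(train=["a"], test=["b"])
toy.setup()
grid = toy.run()
```

Calling run() before setup(), or setup() before set_data(), has no defined meaning, so the stand-in raises rather than guessing.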

best_result(*, scope='all')#

Get the best configuration and its validation metrics.

Parameters:

scope (str) – The metric search scope for iterative training. Set to "last" to use the last iteration instead of the best iteration. See ray.tune.ResultGrid.get_best_result() for details.

Return type:

dict[str, pydantic.JsonValue]
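With iterative training, each trial reports the validation metric once per iteration, so "best" can mean best-ever iteration or last iteration. The difference between the two scopes can be illustrated with a stdlib-only toy — the trial histories and the helper function here are hypothetical, not Ray's actual implementation:

```python
# Hypothetical per-trial metric histories (higher is better, e.g. NDCG),
# one value per training iteration.
trials = {
    "trial-a": [0.10, 0.30, 0.25],  # peaked at iteration 1, then degraded
    "trial-b": [0.12, 0.20, 0.28],  # still improving at the final iteration
}

def best_trial(trials, scope="all"):
    """Pick the best trial: by best-ever iteration, or by last iteration only."""
    if scope == "last":
        key = lambda name: trials[name][-1]
    else:  # "all": consider every reported iteration
        key = lambda name: max(trials[name])
    return max(trials, key=key)

print(best_trial(trials, scope="all"))   # trial-a: best-ever 0.30 beats 0.28
print(best_trial(trials, scope="last"))  # trial-b: final 0.28 beats final 0.25
```

Under scope="all" a trial that overfit after its peak can still win; scope="last" rewards only where each trial ended up.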

best_pipeline()#

Get the (full) configuration for the best pipeline.

Return type:

lenskit.pipeline.PipelineConfig

search_space()#

Get the Ray search space.
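Ray expresses a search space as a mapping from parameter names to samplers such as tune.randint or tune.loguniform. A stdlib-only stand-in can show the shape of such a space — the parameter names are made up for illustration, and plain callables replace Ray's sampler objects:

```python
import math
import random

rng = random.Random(42)

# Stand-in samplers; Ray itself would use tune.randint(...) / tune.loguniform(...).
def randint(lo, hi):
    return lambda: rng.randrange(lo, hi)

def loguniform(lo, hi):
    # sample uniformly in log space, then exponentiate
    return lambda: math.exp(rng.uniform(math.log(lo), math.log(hi)))

# Hypothetical search space for an embedding-style model.
space = {
    "embedding_size": randint(8, 128),
    "regularization": loguniform(1e-6, 1e-1),
}

# Draw one candidate configuration from the space.
config = {name: sample() for name, sample in space.items()}
```

Log-uniform sampling is the usual choice for scale-like parameters such as regularization strength, so that each order of magnitude is equally likely to be tried.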

setup_harness()#
property metric#
create_tuner()#

Create a Ray tuner for the search.

Return type:

ray.tune.Tuner

Exported Aliases#

class lenskit.tuning.TuningSpec#

Re-exported alias for lenskit.tuning.spec.TuningSpec.