Reproducibility is an essential component of training ML models.
The Trainer class features a way to enable determinism across runs, if so desired. On this page, you will learn how to configure the Trainer to ensure reproducibility.
The Trainer supports configuring reproducibility by piggybacking off of torch seed settings. While it is possible to manually set the torch seed outside of the Trainer class, it is strongly recommended to use the seed argument of the Trainer class to handle that for you. The following example shows how you can set the seed to 1234:
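As a minimal Python sketch, assuming a Trainer constructor that takes the model and dataloader along with a seed keyword argument (the import path and parameter names below are illustrative, not the library's actual API):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

from your_library import Trainer  # hypothetical import path

model = nn.Linear(16, 2)
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,))),
    batch_size=8,
)

# seed=1234 lets the Trainer seed torch (and any other RNGs it manages)
# instead of calling torch.manual_seed(1234) yourself before training.
trainer = Trainer(
    model=model,
    train_dataloader=train_loader,  # illustrative parameter name
    seed=1234,
)
trainer.fit()
```

The YAML form of the configuration presumably exposes the same setting as a seed field (for example, seed: 1234), though the exact schema depends on the library.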
Rather than passing an instantiated model to the Trainer class, you should pass a callable that returns a torch Module. The Trainer will set the seed before invoking the callback, thus ensuring reproducibility. This is in line with deferred weight initialization, as described in Defer Weight Initialization.
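A sketch of that pattern, under the same illustrative assumptions about the Trainer's signature: the factory runs only after the Trainer has applied the seed, so the randomly initialized weights are identical across runs.

```python
from torch import nn

from your_library import Trainer  # hypothetical import path

def build_model() -> nn.Module:
    # Called by the Trainer after it has set the seed, so the random
    # initialization of these layers is reproducible across runs.
    return nn.Sequential(
        nn.Linear(16, 32),
        nn.ReLU(),
        nn.Linear(32, 2),
    )

trainer = Trainer(
    model=build_model,  # pass the callable itself, not build_model()
    seed=1234,
)
```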
By using the seed argument in the Trainer class, you can achieve deterministic behavior across runs. This guide has provided step-by-step instructions on configuring the Trainer for reproducibility using both YAML and Python.
To learn more about the Trainer class, you can check out: