Learn how to write a custom learning rate scheduler.
To write a custom learning rate scheduler, create a subclass of `cerebras.pytorch.optim.lr_scheduler.LRScheduler`.

For example:
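The following sketch assumes the base class accepts the optimizer and a `total_iters` argument in its constructor and exposes the current step as `self.last_epoch`; the class name and its hyperparameters are illustrative, while the `_get_closed_form` override is the method named on this page:

```python
import torch

import cerebras.pytorch as cstorch


class MyExponentialDecayLR(cstorch.optim.lr_scheduler.LRScheduler):
    """Illustrative scheduler: lr = initial_lr * gamma ** step."""

    def __init__(self, optimizer, initial_lr: float, gamma: float, total_iters: int):
        self.initial_lr = initial_lr
        self.gamma = gamma
        # Assumed base signature: (optimizer, total_iters, ...).
        super().__init__(optimizer, total_iters=total_iters)

    def _get_closed_form(self) -> torch.Tensor:
        # `self.last_epoch` is assumed to hold the current step count. The
        # schedule is returned as a single tensor expression so it can be
        # traced without Python-level control flow.
        step = torch.as_tensor(self.last_epoch, dtype=torch.float32)
        return self.initial_lr * torch.pow(torch.tensor(self.gamma), step)
```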
As shown in the sketch above, the `LRScheduler` base class expects three arguments: the optimizer whose learning rate is being scheduled, the total number of iterations the schedule is defined over (`total_iters`), and the last epoch to start on (`last_epoch`).
In addition, the `_get_closed_form` abstract method must be overridden. This method is where the full schedule is defined in closed form. Because the schedule is traced rather than executed eagerly, Python-level conditions and loops cannot be used to define dynamic control flow; only torch operations (such as `torch.where`) may be used. The method is expected to return a `torch.Tensor` that represents the full learning rate schedule as a computed tensor.
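To make the control-flow constraint concrete, here is an illustrative piecewise schedule written as a pure tensor expression; the function name and arguments are hypothetical:

```python
import torch


def piecewise_lr(step: torch.Tensor, base_lr: float, warmup_steps: int) -> torch.Tensor:
    """Linear warmup to base_lr, then constant, with no Python branching."""
    # A Python `if step < warmup_steps:` would be evaluated once at trace time
    # and baked into the graph. torch.where keeps both branches in the graph
    # and selects between them at every step.
    warmup_lr = base_lr * step.float() / warmup_steps
    return torch.where(step < warmup_steps, warmup_lr, torch.full_like(warmup_lr, base_lr))
```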
See the existing LR scheduler implementations for examples of how to correctly define the schedule.
Once you’ve written your custom LR scheduler, as long as it’s available in the global scope, you can use it in ModelZoo by setting the scheduler in your params YAML file to the name of your custom LR scheduler class.
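For illustration, the relevant portion of a params file might look like the snippet below. The enclosing YAML structure varies by ModelZoo version, and the extra keys are hypothetical constructor parameters mirroring the example class above, assuming they are forwarded to your scheduler's `__init__`:

```yaml
optimizer:
  learning_rate:
    - scheduler: MyExponentialDecayLR  # name of your custom scheduler class
      initial_lr: 0.001                # hypothetical constructor parameters
      gamma: 0.9999
      total_iters: 10000
```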