composer.algorithms.functional.scale_scheduler
composer.algorithms.functional.scale_scheduler(scheduler: torch.optim.lr_scheduler._LRScheduler, ssr: float, orig_max_epochs: Optional[int] = None)
Makes a learning rate schedule take a different number of epochs.
See ScaleSchedule for more information.

Parameters
- scheduler – A learning rate schedule object. Must be one of:
  - torch.optim.lr_scheduler.CosineAnnealingLR
  - torch.optim.lr_scheduler.CosineAnnealingWarmRestarts
  - torch.optim.lr_scheduler.ExponentialLR
  - torch.optim.lr_scheduler.MultiStepLR
  - torch.optim.lr_scheduler.StepLR
- ssr – The factor by which to scale the duration of the schedule. E.g., 0.5 makes the schedule take half as many epochs and 2.0 makes it take twice as many epochs.
- orig_max_epochs – The current number of epochs spanned by scheduler. Used along with ssr to determine the new number of epochs scheduler should span.
Raises
- ValueError – If scheduler is not an instance of one of the above types.
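
A minimal usage sketch, assuming scale_scheduler mutates the scheduler in place (the signature above documents no return value); the optimizer, milestones, and epoch counts are illustrative, not taken from the Composer docs.

import torch
from composer.algorithms.functional import scale_scheduler

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Decay the LR by 10x at epochs 30 and 60 over an intended 90-epoch run.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[30, 60], gamma=0.1
)

# Halve the schedule's duration (ssr=0.5): the decay points should then
# fall at epochs 15 and 30, matching a 45-epoch run.
scale_scheduler(scheduler, ssr=0.5, orig_max_epochs=90)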