composer.optim
Modules
- composer.optim.decoupled_weight_decay
- composer.optim.optimizer_hparams
- composer.optim.scheduler
Classes
- Adam optimizer with the weight decay term decoupled from the learning rate.
- SGD optimizer with the weight decay term decoupled from the learning rate.
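These two classes follow the decoupled weight-decay scheme of Loshchilov & Hutter (SGDW/AdamW). A minimal pure-Python sketch of the SGD case, contrasting coupled and decoupled decay (the function names and the exact decay scaling are illustrative, not Composer's implementation):

```python
def sgd_step(w: float, grad: float, lr: float, wd: float) -> float:
    """Coupled decay: the L2 penalty is folded into the gradient, so the
    decay term is scaled by the learning rate (and, in adaptive optimizers
    like Adam, divided by the per-parameter denominator as well)."""
    return w - lr * (grad + wd * w)


def sgdw_step(w: float, grad: float, lr: float, wd: float) -> float:
    """Decoupled decay: the weights shrink by wd independently of the
    gradient step. Real implementations typically scale wd by the
    learning-rate schedule rather than applying it raw, as done here."""
    return w - lr * grad - wd * w
```

For plain SGD the two forms differ only in how the decay is scaled; the distinction matters most for adaptive optimizers, where coupled decay is also rescaled by the adaptive denominator.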
Hparams
These classes are used with yahp for YAML-based configuration.
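The pattern these classes follow can be sketched with a plain dataclass standing in for a yahp `Hparams` subclass (the class name, field set, and `create` helper below are hypothetical, chosen to mirror `torch.optim.SGD`'s arguments; yahp's real API differs):

```python
from dataclasses import dataclass, fields


@dataclass
class SGDHparams:  # hypothetical stand-in for a yahp Hparams subclass
    lr: float                  # required hyperparameter
    momentum: float = 0.0      # optional hyperparameters with defaults
    weight_decay: float = 0.0

    @classmethod
    def create(cls, data: dict) -> "SGDHparams":
        # Reject keys that do not correspond to declared fields,
        # as a YAML-driven config system would.
        unknown = set(data) - {f.name for f in fields(cls)}
        if unknown:
            raise ValueError(f"unknown hyperparameters: {sorted(unknown)}")
        return cls(**data)


# e.g. from a parsed YAML mapping:
hparams = SGDHparams.create({"lr": 0.1, "momentum": 0.9})
```

Each hparams class validates a configuration mapping and then constructs the corresponding optimizer from the validated fields.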
- Abstract base class for optimizer hyperparameter classes.
- Hyperparameters for the [RMSProp optimizer](https://pytorch.org/docs/stable/generated/torch.optim.RMSprop.html#torch.optim.RMSprop).
- Hyperparameters for the SGD optimizer.
- Hyperparameters for each of the remaining supported optimizers.
- composer.optim.scheduler.SchedulerHparams