composer.optim.optimizer_hparams#
Functions

| Function | Description |
| --- | --- |
| `abstractmethod` | A decorator indicating abstract methods. |
| `asdict` | Return the fields of a dataclass instance as a new dictionary mapping field names to field values. |
| `dataclass` | Returns the same class as was passed in, with dunder methods added based on the fields defined in the class. |
| `get_optimizer` | Get the optimizer specified by the given hyperparameters. |
Classes

| Class | Description |
| --- | --- |
| `ABC` | Helper class that provides a standard way to create an ABC using inheritance. |
| `DecoupledAdamW` | Adam optimizer with the weight decay term decoupled from the learning rate. |
| `DecoupledSGDW` | SGD optimizer with the weight decay term decoupled from the learning rate. |
| `Optimizer` | Base class for all optimizers. |
Hparams

These classes are used with yahp for YAML-based configuration.

| Class | Description |
| --- | --- |
| `AdamHparams` | Hyperparameters for the Adam optimizer. |
| `AdamWHparams` | Hyperparameters for the torch.optim.AdamW optimizer. |
| `DecoupledAdamWHparams` | Hyperparameters for the DecoupledAdamW optimizer. |
| `DecoupledSGDWHparams` | Hyperparameters for the DecoupledSGDW optimizer. |
| `OptimizerHparams` | Abstract base class for optimizer hyperparameter classes. |
| `RAdamHparams` | Hyperparameters for the [RMSProp optimizer](https://pytorch.org/docs/stable/generated/torch.optim.RMSprop.html#torch.optim.RMSprop). |
| `RMSPropHparams` | Hyperparameters for the [RMSProp optimizer](https://pytorch.org/docs/stable/generated/torch.optim.RMSprop.html#torch.optim.RMSprop). |
| `SGDHparams` | Hyperparameters for the SGD optimizer. |
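A minimal usage sketch, assuming yahp's `create` classmethod for loading an `Hparams` subclass from a file; the file name `adamw.yaml` and its contents are hypothetical:

```python
# Hypothetical sketch: load AdamWHparams from a YAML file via yahp's
# `create` classmethod. "adamw.yaml" is a made-up file name; the field
# values shown are just this class's documented defaults.
#
# adamw.yaml:
#   lr: 0.001
#   eps: 1.0e-08
#   weight_decay: 0.01
#   amsgrad: false
from composer.optim.optimizer_hparams import AdamWHparams

hparams = AdamWHparams.create(f="adamw.yaml", cli_args=False)
print(hparams.lr)  # 0.001
```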
Attributes
List
ModelParameters
Type
- class composer.optim.optimizer_hparams.AdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the Adam optimizer.
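The `<factory>` default for `betas` indicates these hparams classes are dataclasses, so the `asdict` helper listed under Functions above flattens an instance into a plain dictionary. A short sketch, using only the documented defaults:

```python
# Sketch: the hparams classes are dataclasses, so dataclasses.asdict
# recovers their fields as a dictionary.
from dataclasses import asdict

from composer.optim.optimizer_hparams import AdamHparams

hparams = AdamHparams(lr=0.001, weight_decay=0.0)
print(asdict(hparams))  # {'lr': 0.001, 'betas': [...], 'eps': 1e-08, ...}
```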
- class composer.optim.optimizer_hparams.AdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the torch.optim.AdamW optimizer.
- class composer.optim.optimizer_hparams.DecoupledAdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the DecoupledAdamW optimizer.
- class composer.optim.optimizer_hparams.DecoupledSGDWHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the DecoupledSGDW optimizer.
- class composer.optim.optimizer_hparams.OptimizerHparams[source]#
Bases: yahp.hparams.Hparams, abc.ABC
Abstract base class for optimizer hyperparameter classes.
- class composer.optim.optimizer_hparams.RAdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the RAdam optimizer.
- class composer.optim.optimizer_hparams.RMSPropHparams(lr, alpha=0.99, eps=1e-08, momentum=0.0, weight_decay=0.0, centered=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the [RMSProp optimizer](https://pytorch.org/docs/stable/generated/torch.optim.RMSprop.html#torch.optim.RMSprop).
- class composer.optim.optimizer_hparams.SGDHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the SGD optimizer.
- composer.optim.optimizer_hparams.get_optimizer(param_groups, hparams)[source]#
Get the optimizer specified by the given hyperparameters.
- Parameters
param_groups (ModelParameters) – List of model parameters to optimize.
hparams (OptimizerHparams) – Instance of an optimizer's hyperparameters.
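A minimal end-to-end sketch of this function, matching the signature above; the model is a placeholder:

```python
# Sketch: build hyperparameters in Python and pass them, together with
# the model's parameters, to get_optimizer. torch.nn.Linear stands in
# for a real model.
import torch

from composer.optim.optimizer_hparams import SGDHparams, get_optimizer

model = torch.nn.Linear(10, 2)
hparams = SGDHparams(lr=0.1, momentum=0.9, weight_decay=1e-4)
optimizer = get_optimizer(model.parameters(), hparams)
```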