composer.optim.optimizer_hparams#
Hyperparameters for optimizers.
These Hparams classes are used with yahp for YAML-based configuration.

- AdamHparams: Hyperparameters for the Adam optimizer.
- AdamWHparams: Hyperparameters for the AdamW optimizer.
- DecoupledAdamWHparams: Hyperparameters for the DecoupledAdamW optimizer.
- DecoupledSGDWHparams: Hyperparameters for the DecoupledSGDW optimizer.
- OptimizerHparams: Base class for optimizer hyperparameter classes.
- RAdamHparams: Hyperparameters for the RAdam optimizer.
- RMSpropHparams: Hyperparameters for the RMSprop optimizer.
- SGDHparams: Hyperparameters for the SGD optimizer.
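The module is designed for yahp's YAML-based configuration, so an optimizer is typically selected and tuned from a config file. A hypothetical YAML fragment is sketched below; the top-level key names (`optimizer`, `adamw`) are assumptions for illustration, not Composer's verified TrainerHparams schema:

```yaml
# Illustrative only: exact keys are defined by TrainerHparams / yahp registries.
optimizer:
  adamw:
    lr: 0.001
    eps: 1.0e-8
    weight_decay: 0.01
    amsgrad: false
```

Each field under the optimizer entry maps onto a constructor argument of the corresponding Hparams dataclass.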
- class composer.optim.optimizer_hparams.AdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0, amsgrad=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the Adam optimizer. See Adam for documentation.
- class composer.optim.optimizer_hparams.AdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the AdamW optimizer. See AdamW for documentation.
- class composer.optim.optimizer_hparams.DecoupledAdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the DecoupledAdamW optimizer. See DecoupledAdamW for documentation.
  - Parameters
    - lr (float, optional): See DecoupledAdamW.
    - betas (float, optional): See DecoupledAdamW.
    - eps (float, optional): See DecoupledAdamW.
    - weight_decay (float, optional): See DecoupledAdamW.
    - amsgrad (bool, optional): See DecoupledAdamW.
- class composer.optim.optimizer_hparams.DecoupledSGDWHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the DecoupledSGDW optimizer. See DecoupledSGDW for documentation.
  - Parameters
    - lr (float): See DecoupledSGDW.
    - momentum (float, optional): See DecoupledSGDW.
    - weight_decay (float, optional): See DecoupledSGDW.
    - dampening (float, optional): See DecoupledSGDW.
    - nesterov (bool, optional): See DecoupledSGDW.
- class composer.optim.optimizer_hparams.OptimizerHparams[source]#
  Bases: yahp.hparams.Hparams, abc.ABC
  Base class for optimizer hyperparameter classes.
  Optimizer parameters that are added to TrainerHparams (e.g. via YAML or the CLI) are initialized in the training loop.
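The base-class pattern described above, a dataclass of hyperparameters that later builds the real optimizer inside the training loop, can be sketched with the standard library alone. This is a minimal illustration of the design, not Composer's actual code: the real classes derive from yahp.hparams.Hparams and construct torch optimizers, and the `initialize_object` body here is a stand-in.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field, asdict

class OptimizerHparamsSketch(ABC):
    # Mirrors the role of OptimizerHparams: subclasses hold config values
    # and know how to turn them into a concrete optimizer later.
    @abstractmethod
    def initialize_object(self, params):
        """Construct the optimizer for the given model parameters."""

@dataclass
class AdamHparamsSketch(OptimizerHparamsSketch):
    # Defaults copied from the documented AdamHparams signature; betas uses
    # a default_factory because mutable-looking defaults need one (the
    # (0.9, 0.999) value is the common Adam default, an assumption here).
    lr: float = 0.001
    betas: tuple = field(default_factory=lambda: (0.9, 0.999))
    eps: float = 1e-08
    weight_decay: float = 0.0
    amsgrad: bool = False

    def initialize_object(self, params):
        # Stand-in for something like torch.optim.Adam(params, **asdict(self));
        # here we just return the arguments that would be forwarded.
        return {"params": params, **asdict(self)}

hparams = AdamHparamsSketch(lr=3e-4)
optimizer_args = hparams.initialize_object(params=["w1", "w2"])
```

Deferring construction to `initialize_object` is what lets the hyperparameters be parsed from YAML or the CLI first and only bound to model parameters once the training loop has them.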
- class composer.optim.optimizer_hparams.RAdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the RAdam optimizer. See RAdam for documentation.
- class composer.optim.optimizer_hparams.RMSpropHparams(lr, alpha=0.99, eps=1e-08, momentum=0.0, weight_decay=0.0, centered=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the RMSprop optimizer. See RMSprop for documentation.
- class composer.optim.optimizer_hparams.SGDHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
  Bases: composer.optim.optimizer_hparams.OptimizerHparams
  Hyperparameters for the SGD optimizer. See SGD for documentation.
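Because each class above is a plain dataclass of fields, YAML-style configuration reduces to matching mapping keys to constructor arguments. The sketch below shows that mechanism with a stand-in for SGDHparams; the parsed-config dict is illustrative, and yahp's actual loading does more (registries, nested hparams, CLI overrides).

```python
from dataclasses import dataclass

@dataclass
class SGDHparamsSketch:
    # Fields copied from the documented SGDHparams signature: lr is
    # required, everything else has a default.
    lr: float
    momentum: float = 0.0
    weight_decay: float = 0.0
    dampening: float = 0.0
    nesterov: bool = False

# Values as they might arrive from a parsed YAML document; keys that are
# omitted simply fall back to the dataclass defaults.
yaml_config = {"lr": 0.1, "momentum": 0.9, "nesterov": True}
hparams = SGDHparamsSketch(**yaml_config)
```

A misspelled or unknown key would raise a TypeError at construction time, which is one reason the dataclass-per-optimizer layout catches config errors early.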