composer.optim.optimizer_hparams_registry#
Hyperparameters for optimizers.
Hparams
These classes are used with yahp for YAML-based configuration.
AdamHparams – Hyperparameters for the Adam optimizer.
AdamWHparams – Hyperparameters for the AdamW optimizer.
DecoupledAdamWHparams – Hyperparameters for the DecoupledAdamW optimizer.
DecoupledSGDWHparams – Hyperparameters for the DecoupledSGDW optimizer.
OptimizerHparams – Base class for optimizer hyperparameter classes.
RAdamHparams – Hyperparameters for the RAdam optimizer.
RMSpropHparams – Hyperparameters for the RMSprop optimizer.
SGDHparams – Hyperparameters for the SGD optimizer.
- class composer.optim.optimizer_hparams_registry.AdamHparams(lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the Adam optimizer. See Adam for documentation.
- Parameters
lr (float, optional) – See Adam.
betas (float, optional) – See Adam.
eps (float, optional) – See Adam.
weight_decay (float, optional) – See Adam.
amsgrad (bool, optional) – See Adam.
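Each of these hparams classes is essentially a typed record of keyword arguments for one optimizer. As an illustration only (a plain stdlib dataclass, not the real yahp-backed class), the AdamHparams defaults and the keyword-bundle role can be sketched like this:

```python
from dataclasses import dataclass, asdict

@dataclass
class AdamHparamsSketch:
    # Field defaults mirror the AdamHparams signature above.
    lr: float = 0.001
    betas: tuple = (0.9, 0.999)
    eps: float = 1e-08
    weight_decay: float = 0.0
    amsgrad: bool = False

hp = AdamHparamsSketch(lr=3e-4)
# The fields become keyword arguments for the optimizer constructor,
# e.g. torch.optim.Adam(params, **kwargs).
kwargs = asdict(hp)
```

Overriding only `lr` leaves every other field at the documented default.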
- class composer.optim.optimizer_hparams_registry.AdamWHparams(lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the AdamW optimizer. See AdamW for documentation.
- Parameters
lr (float, optional) – See AdamW.
betas (float, optional) – See AdamW.
eps (float, optional) – See AdamW.
weight_decay (float, optional) – See AdamW.
amsgrad (bool, optional) – See AdamW.
- class composer.optim.optimizer_hparams_registry.DecoupledAdamWHparams(lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the DecoupledAdamW optimizer. See DecoupledAdamW for documentation.
- Parameters
lr (float, optional) – See DecoupledAdamW.
betas (float, optional) – See DecoupledAdamW.
eps (float, optional) – See DecoupledAdamW.
weight_decay (float, optional) – See DecoupledAdamW.
amsgrad (bool, optional) – See DecoupledAdamW.
- optimizer_cls[source]#
alias of composer.optim.decoupled_weight_decay.DecoupledAdamW
- class composer.optim.optimizer_hparams_registry.DecoupledSGDWHparams(lr=<required parameter>, momentum=0, weight_decay=0, dampening=0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the DecoupledSGDW optimizer. See DecoupledSGDW for documentation.
- Parameters
lr (float) – See DecoupledSGDW.
momentum (float, optional) – See DecoupledSGDW.
weight_decay (float, optional) – See DecoupledSGDW.
dampening (float, optional) – See DecoupledSGDW.
nesterov (bool, optional) – See DecoupledSGDW.
- optimizer_cls[source]#
alias of composer.optim.decoupled_weight_decay.DecoupledSGDW
- class composer.optim.optimizer_hparams_registry.OptimizerHparams[source]#
Bases: yahp.hparams.Hparams, abc.ABC
Base class for optimizer hyperparameter classes.
Optimizer parameters that are added to TrainerHparams (e.g. via YAML or the CLI) are initialized in the training loop.
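The overall pattern (an abstract hparams base class whose concrete subclasses name an `optimizer_cls` and are only instantiated later, inside the training loop) can be sketched with the stdlib alone. Everything here is a hypothetical stand-in: `StubOptimizer`, `StubSGD`, and `initialize_object` are illustrative names, not the composer/yahp API.

```python
import abc
from dataclasses import dataclass, asdict

class StubOptimizer:
    """Hypothetical stand-in for a torch.optim-style optimizer."""
    def __init__(self, params, **defaults):
        self.params = list(params)
        self.defaults = defaults

class StubSGD(StubOptimizer):
    def __init__(self, params, lr, momentum=0.0):
        super().__init__(params, lr=lr, momentum=momentum)

class OptimizerHparamsBase(abc.ABC):
    """Subclasses declare which optimizer class they build."""

    @property
    @abc.abstractmethod
    def optimizer_cls(self):
        ...

    def initialize_object(self, param_group):
        # The optimizer is constructed lazily, in the training loop,
        # with the dataclass fields passed as keyword arguments.
        return self.optimizer_cls(param_group, **asdict(self))

@dataclass
class SGDHparamsSketch(OptimizerHparamsBase):
    lr: float            # required: no default
    momentum: float = 0.0

    @property
    def optimizer_cls(self):
        return StubSGD

hparams = SGDHparamsSketch(lr=0.1)
opt = hparams.initialize_object(param_group=[1.0, 2.0])
```

Keeping construction inside `initialize_object` means the hparams object can be built from YAML or the CLI long before the model parameters exist, which is why the docstring above says initialization happens in the training loop.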
- class composer.optim.optimizer_hparams_registry.RAdamHparams(lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the RAdam optimizer. See RAdam for documentation.
- Parameters
lr (float, optional) – See RAdam.
betas (float, optional) – See RAdam.
eps (float, optional) – See RAdam.
weight_decay (float, optional) – See RAdam.
- class composer.optim.optimizer_hparams_registry.RMSpropHparams(lr=0.01, alpha=0.99, eps=1e-08, momentum=0, weight_decay=0, centered=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the RMSprop optimizer. See RMSprop for documentation.
- Parameters
lr (float, optional) – See RMSprop.
alpha (float, optional) – See RMSprop.
eps (float, optional) – See RMSprop.
momentum (float, optional) – See RMSprop.
weight_decay (float, optional) – See RMSprop.
centered (bool, optional) – See RMSprop.
- class composer.optim.optimizer_hparams_registry.SGDHparams(lr=<required parameter>, momentum=0, weight_decay=0, dampening=0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams_registry.OptimizerHparams
Hyperparameters for the SGD optimizer. See SGD for documentation.
- Parameters
lr (float) – See SGD.
momentum (float, optional) – See SGD.
weight_decay (float, optional) – See SGD.
dampening (float, optional) – See SGD.
nesterov (bool, optional) – See SGD.
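Since these classes exist for yahp's YAML-based configuration, a trainer config might carry an optimizer section like the following. This is a hypothetical fragment: the exact top-level key and the registry name for each optimizer depend on how TrainerHparams registers this field.

```yaml
# Hypothetical trainer YAML fragment selecting SGD hyperparameters.
optimizers:
  sgd:
    lr: 0.1            # required: SGDHparams has no default for lr
    momentum: 0.9
    weight_decay: 0.0005
    dampening: 0
    nesterov: true
```

Fields omitted from the YAML fall back to the defaults shown in the class signatures above.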