composer.algorithms.functional.freeze_layers

composer.algorithms.functional.freeze_layers(model, optimizers, current_epoch, max_epochs, freeze_start, freeze_level, logger)[source]

Implements the layer freezing algorithm, which progressively freezes layers of the network during training, starting with the earliest layers, so that their parameters are no longer updated.

Parameters
  • model (torch.nn.modules.module.Module) – An instance of the model being trained.

  • optimizers (Union[torch.optim.optimizer.Optimizer, Tuple[torch.optim.optimizer.Optimizer, ...]]) – The optimizers used during training.

  • current_epoch (int) – The current epoch of training.

  • max_epochs (int) – The total number of epochs that training will run for.

  • freeze_start (float) – The fraction of epochs to run before freezing begins.

  • freeze_level (float) – The maximum fraction of layers to freeze.

  • logger (composer.core.logging.logger.Logger) – A logger instance used to record the progression of layer freezing during training.
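To illustrate how the `current_epoch`, `max_epochs`, `freeze_start`, and `freeze_level` parameters could interact, the sketch below computes the fraction of layers frozen at a given epoch, assuming a linear ramp from `freeze_start` to the end of training. The helper `freeze_fraction` is hypothetical and not part of the library; it only models one plausible schedule, not Composer's exact implementation.

```python
def freeze_fraction(current_epoch: int, max_epochs: int,
                    freeze_start: float, freeze_level: float) -> float:
    """Hypothetical helper: fraction of layers to freeze at this epoch,
    assuming freezing ramps up linearly once the ``freeze_start``
    fraction of training has elapsed, capped at ``freeze_level``."""
    progress = current_epoch / max_epochs
    if progress <= freeze_start:
        return 0.0  # freezing has not started yet
    # Linear ramp over the remaining (1 - freeze_start) fraction of training.
    ramp = (progress - freeze_start) / (1.0 - freeze_start)
    return min(freeze_level, freeze_level * ramp)
```

For example, with `freeze_start=0.5` and `freeze_level=1.0`, no layers are frozen during the first half of training, 60% are frozen at epoch 8 of 10, and all layers are frozen by the final epoch.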