composer.algorithms.functional.freeze_layers

composer.algorithms.functional.freeze_layers(model: torch.nn.modules.module.Module, optimizers: Union[torch.optim.optimizer.Optimizer, Tuple[torch.optim.optimizer.Optimizer, ...]], current_epoch: int, max_epochs: int, freeze_start: float, freeze_level: float, logger: Optional[composer.core.logging.logger.Logger] = None) → torch.nn.modules.module.Module

Progressively freeze the layers of the network during training, starting with the earlier layers.

Parameters
  • model – an instance of the model being trained

  • optimizers – the optimizers used during training

  • current_epoch – integer specifying the current training epoch

  • max_epochs – the maximum number of epochs that training will run for

  • freeze_start – the fraction of epochs to run before freezing begins

  • freeze_level – the maximum fraction of layers to freeze
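
A minimal usage sketch based on the signature above. The toy network, optimizer settings, and fraction values are illustrative assumptions, not part of the API; only freeze_layers and its parameters come from this reference:

    import torch
    from composer.algorithms.functional import freeze_layers

    # A small stand-in network; any torch.nn.Module works here.
    model = torch.nn.Sequential(
        torch.nn.Linear(32, 64),
        torch.nn.ReLU(),
        torch.nn.Linear(64, 10),
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    max_epochs = 10

    for epoch in range(max_epochs):
        # Apply progressive freezing at the start of each epoch.
        # freeze_start=0.5: no layers are frozen during the first half
        # of training. freeze_level=0.5: at most half of the layers are
        # frozen by the final epoch, starting from the earliest layers.
        model = freeze_layers(
            model=model,
            optimizers=optimizer,
            current_epoch=epoch,
            max_epochs=max_epochs,
            freeze_start=0.5,
            freeze_level=0.5,
        )
        # ... run one epoch of forward/backward/step with `model` ...

Calling the function once per epoch lets the freezing schedule advance as training progresses.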