composer.algorithms.ghost_batchnorm.ghost_batchnorm

Functions

apply_ghost_batchnorm

Replace batch normalization modules with ghost batch normalization modules.

Classes

Algorithm

Base class for algorithms.

Event

Enum to represent events in the training loop.

GhostBatchNorm

Replaces batch normalization modules with Ghost Batch Normalization modules that simulate the effect of using a smaller batch size.

GhostBatchNorm1d

Ghost batch normalization for 1d inputs.

GhostBatchNorm2d

Ghost batch normalization for 2d inputs.

GhostBatchNorm3d

Ghost batch normalization for 3d inputs.

Logger

Logger routes metrics to the LoggerCallback.

State

The state of the trainer.


class composer.algorithms.ghost_batchnorm.ghost_batchnorm.GhostBatchNorm(ghost_batch_size=32)

Bases: composer.core.algorithm.Algorithm

Replaces batch normalization modules with Ghost Batch Normalization modules that simulate the effect of using a smaller batch size.

Works by splitting the input into chunks of ghost_batch_size samples and running batch normalization on each chunk separately. Dim 0 is assumed to be the sample axis.
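To make the chunking concrete, below is a minimal sketch of the per-forward computation. ghost_bn_forward is a hypothetical helper written for illustration only; it is not the module's actual implementation and omits details such as running-statistic updates.

import torch
from torch import nn

def ghost_bn_forward(x, bn, ghost_batch_size=32):
    # Dim 0 is the sample axis: split the batch into sub-batches of
    # at most ghost_batch_size samples...
    chunks = x.split(ghost_batch_size, dim=0)
    # ...then normalize each sub-batch independently with the same module.
    return torch.cat([bn(chunk) for chunk in chunks], dim=0)

bn = nn.BatchNorm2d(num_features=8)
x = torch.randn(64, 8, 4, 4)  # batch of 64 samples
out = ghost_bn_forward(x, bn, ghost_batch_size=16)  # four sub-batches of 16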

Runs on INIT.

Parameters

ghost_batch_size (int, optional) – size of sub-batches to normalize over. Default: 32.

apply(event, state, logger=None)

Applies GhostBatchNorm by wrapping existing BatchNorm modules.

match(event, state)

Runs on INIT.

Parameters
  • event (Event) – The current event.

  • state (State) – The current state.

Returns

bool – True if this algorithm should run.
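As a usage sketch, the algorithm form is handed to the Composer Trainer, which calls match() and apply() on it at INIT. The toy model and dataset below are illustrative assumptions, and the ComposerClassifier signature may differ across Composer versions.

import torch
from torch.utils.data import DataLoader, TensorDataset
from composer import Trainer
from composer.algorithms import GhostBatchNorm
from composer.models import ComposerClassifier
from torchvision import models

# Toy data: 64 random RGB images with 10 class labels (illustration only).
dataset = TensorDataset(torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,)))

trainer = Trainer(
    model=ComposerClassifier(models.resnet18()),
    train_dataloader=DataLoader(dataset, batch_size=16),
    max_duration='1ep',
    algorithms=[GhostBatchNorm(ghost_batch_size=8)],  # applied at INIT
)
trainer.fit()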

composer.algorithms.ghost_batchnorm.ghost_batchnorm.apply_ghost_batchnorm(model, ghost_batch_size=32, optimizers=None)

Replace batch normalization modules with ghost batch normalization modules.

Ghost batch normalization modules split their input into chunks of ghost_batch_size samples and run batch normalization on each chunk separately. Dim 0 is assumed to be the sample axis.

Parameters
  • model (Module) – the model to modify in-place

  • ghost_batch_size (int, optional) – size of sub-batches to normalize over. Default: 32.

  • optimizers (Optimizers, optional) –

    Existing optimizers bound to model.parameters(). All optimizers that have already been constructed with model.parameters() must be specified here so they will optimize the correct parameters.

    If the optimizer(s) are constructed after calling this function, it is safe to omit this parameter; those optimizers will see the correct model parameters. Both orderings are sketched after the example below.

Returns

The modified model

Example

import composer.functional as cf
from torchvision import models

model = models.resnet50()
cf.apply_ghost_batchnorm(model)  # BatchNorm layers are replaced in-place
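The optimizers caveat matters because the surgery replaces modules and their parameters. Here is a sketch of both safe orderings; the SGD settings are arbitrary.

import torch
import composer.functional as cf
from torchvision import models

# Safe: build the optimizer after the surgery so it sees the new parameters.
model = models.resnet50()
cf.apply_ghost_batchnorm(model, ghost_batch_size=16)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Also safe: an optimizer built beforehand is passed in so its parameter
# references can be updated to the replacement modules.
model2 = models.resnet50()
opt2 = torch.optim.SGD(model2.parameters(), lr=0.1)
cf.apply_ghost_batchnorm(model2, ghost_batch_size=16, optimizers=opt2)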