composer.algorithms.functional.apply_se

composer.algorithms.functional.apply_se(model, latent_channels, min_channels)[source]

Adds `Squeeze-and-Excitation <https://arxiv.org/abs/1709.01507>`_ (SE) blocks after the Conv2d layers of a neural network.

Parameters
  • model (torch.nn.modules.module.Module) – A module containing one or more torch.nn.Conv2d modules.

  • latent_channels (float) – The dimensionality of the hidden layer within the added MLP.

  • min_channels (int) – An SE block is added after a Conv2d module conv only if min(conv.in_channels, conv.out_channels) >= min_channels. For models that reduce spatial size while increasing channel count with depth, this parameter restricts SE blocks to the deeper layers, where their inputs have smaller spatial size and therefore incur less overhead.
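To illustrate the behavior described above, here is a minimal sketch of an SE block and of the min_channels gating rule. This is an illustration only, not Composer's actual implementation; the SqueezeExcite class and should_add_se helper are hypothetical names introduced here, and the SE block follows the standard squeeze (global average pool), excite (two-layer MLP with sigmoid), and rescale pattern from the paper linked above.

```python
import torch
import torch.nn as nn


class SqueezeExcite(nn.Module):
    """Minimal SE block sketch (not Composer's implementation)."""

    def __init__(self, channels: int, latent_channels: int):
        super().__init__()
        # Squeeze: global average pool collapses each channel to one value.
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Excite: a small MLP with a hidden layer of size latent_channels
        # produces a per-channel weight in (0, 1).
        self.fc = nn.Sequential(
            nn.Linear(channels, latent_channels),
            nn.ReLU(inplace=True),
            nn.Linear(latent_channels, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(n, c)).view(n, c, 1, 1)
        # Rescale: reweight each channel of the input.
        return x * w


def should_add_se(conv: nn.Conv2d, min_channels: int) -> bool:
    # The gating rule from the min_channels parameter: an SE block is added
    # only when both channel counts reach min_channels.
    return min(conv.in_channels, conv.out_channels) >= min_channels


# Early layer (few channels): skipped. Deep layer (many channels): included.
print(should_add_se(nn.Conv2d(16, 32, 3), min_channels=64))    # False
print(should_add_se(nn.Conv2d(128, 256, 3), min_channels=64))  # True
```

Because SE blocks operate on a pooled per-channel summary followed by a channel-wise rescale, their cost scales with channel count rather than spatial size, which is why restricting them to deeper, spatially smaller layers keeps the added overhead low.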