💾 Installation#

Composer is available with pip:

pip install mosaicml

as well as with Anaconda:

conda install -c mosaicml mosaicml

To include non-core dependencies that are required by some algorithms, callbacks, datasets, or models, the following installation targets are available:

  • pip install mosaicml[dev]: Installs development dependencies, which are required for running tests and building documentation.

  • pip install mosaicml[deepspeed]: Installs Composer with support for deepspeed.

  • pip install mosaicml[nlp]: Installs Composer with support for NLP models and algorithms.

  • pip install mosaicml[unet]: Installs Composer with support for Unet.

  • pip install mosaicml[timm]: Installs Composer with support for timm.

  • pip install mosaicml[wandb]: Installs Composer with support for wandb.

  • pip install mosaicml[all]: Installs all optional dependencies.
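As a quick sanity check after any of these installs, you can import the package and print its version (the package exposes it as composer.__version__):

python -c "import composer; print(composer.__version__)"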

For a developer install, clone directly:

git clone https://github.com/mosaicml/composer.git
cd composer
pip install -e .[all]

Note

For fast loading of image data, we highly recommend installing Pillow-SIMD. To install it, the vanilla Pillow package must first be uninstalled:

pip uninstall pillow && pip install pillow-simd

Pillow-SIMD is not supported on Apple M1 Macs.

Docker#

To simplify the environment setup for Composer, we provide a set of convenient Docker images:

| Linux Distro | PyTorch Version | CUDA Version | Python Version | Docker Tag |
|--------------|-----------------|--------------|----------------|------------|
| ubuntu:20.04 | 1.10.0 | 11.3.1 | 3.9 | latest, mosaicml/pytorch:1.10.0_cu113-python3.9-ubuntu20.04 |
| ubuntu:20.04 | 1.10.0 | cpu | 3.9 | mosaicml/pytorch:1.10.0_cpu-python3.9-ubuntu20.04 |
| ubuntu:18.04 | 1.9.1 | 11.1.1 | 3.8 | mosaicml/pytorch:1.9.1_cu111-python3.8-ubuntu18.04 |
| ubuntu:18.04 | 1.9.1 | cpu | 3.8 | mosaicml/pytorch:1.9.1_cpu-python3.8-ubuntu18.04 |
| ubuntu:20.04 | 1.9.1 | 11.1.1 | 3.8 | mosaicml/pytorch:1.9.1_cu111-python3.8-ubuntu20.04 |
| ubuntu:20.04 | 1.9.1 | cpu | 3.8 | mosaicml/pytorch:1.9.1_cpu-python3.8-ubuntu20.04 |
| ubuntu:20.04 | 1.9.1 | 11.1.1 | 3.7 | mosaicml/pytorch:1.9.1_cu111-python3.7-ubuntu20.04 |
| ubuntu:20.04 | 1.9.1 | cpu | 3.7 | mosaicml/pytorch:1.9.1_cpu-python3.7-ubuntu20.04 |

Our latest image ships Ubuntu 20.04, Python 3.9, PyTorch 1.10, and CUDA 11.3, and has been tested to work with GPU-based instances on AWS, GCP, and Azure. Pillow-SIMD is installed by default in all images.

Note

These images do not come with Composer preinstalled. To install Composer, run pip install mosaicml once inside the image.

Pre-built images can be pulled from MosaicML's DockerHub Repository:

docker pull mosaicml/pytorch
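For example, to start an interactive container from that image and install Composer inside it (the --gpus flag assumes a GPU host with the NVIDIA Container Toolkit; drop it for CPU-only use):

docker run -it --gpus all mosaicml/pytorch bash
# inside the container:
pip install mosaicml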

Building images locally#

# Build the default image
make

# Build composer with Python 3.8
PYTHON_VERSION=3.8 make

Note

Docker must be installed on your local machine.

🚀 Quick Start#

Access our library of speedup methods with the ƒ() Functional API:

import logging
from composer import functional as CF
import torchvision.models as models

logging.basicConfig(level=logging.INFO)
model = models.resnet50()

CF.apply_blurpool(model)

This creates a ResNet50 model and replaces several pooling and convolution layers with BlurPool variants (Zhang et al., 2019). For more information, see 🏊 BlurPool. The method should log:

Applied BlurPool to model ResNet. Model now has 1 BlurMaxPool2d and 6 BlurConv2D layers.

These methods are easy to integrate into your own training loop code with just a few lines.
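As a concrete illustration, here is a minimal sketch of a plain PyTorch loop that uses the functional API; the one-time apply_blurpool call is the only Composer-specific line, and the dataloader is a placeholder:

import torch
import torch.nn.functional as F
import torchvision.models as models
from composer import functional as CF

model = models.resnet50()
CF.apply_blurpool(model)  # one-time, in-place model surgery before training

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
for X, y in train_dataloader:  # placeholder: any torch.utils.data.DataLoader
    optimizer.zero_grad()
    loss = F.cross_entropy(model(X), y)
    loss.backward()
    optimizer.step()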

For an overview of the algorithms, see 🤖 Algorithms.

We make composing recipes together even easier with our (optional) Trainer. Here, we train an MNIST classifier with a recipe of methods:

from torchvision import datasets, transforms
from torch.utils.data import DataLoader

from composer import Trainer
from composer.models import MNIST_Classifier
from composer.algorithms import LabelSmoothing, CutMix, ChannelsLast

transform = transforms.Compose([transforms.ToTensor()])
dataset = datasets.MNIST("data", train=True, download=True, transform=transform)
train_dataloader = DataLoader(dataset, batch_size=128)

trainer = Trainer(
    model=MNIST_Classifier(num_classes=10),
    train_dataloader=train_dataloader,
    max_duration="2ep",
    algorithms=[
        LabelSmoothing(smoothing=0.1),
        CutMix(num_classes=10),
        ChannelsLast(),
    ]
)
trainer.fit()

We handle inserting and running the logic during the training so that any algorithms you specify "just work."

Besides easily running our built-in algorithms, Composer also features:

  • An interface to flexibly add algorithms to the training loop (see the sketch after this list)

  • An engine that manages the ordering of algorithms for composition

  • A trainer to handle boilerplate around numerics, distributed training, and others

  • Integration with popular model libraries such as TIMM and HuggingFace Transformers
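To make the first bullet concrete, here is a sketch of what a custom algorithm can look like. ScaleLoss and its 0.5 factor are invented for illustration; the match/apply interface and the Event enum are Composer's, though exact signatures can vary by version:

from composer.core import Algorithm, Event

class ScaleLoss(Algorithm):
    """Hypothetical algorithm that halves the loss after it is computed."""

    def match(self, event, state):
        # Only act at the AFTER_LOSS event of each training batch.
        return event == Event.AFTER_LOSS

    def apply(self, event, state, logger):
        state.loss = 0.5 * state.loss  # illustrative transformation

At every event in the training loop, the engine calls match() on each algorithm and runs apply() for those that return True; the built-in algorithms above compose through this same mechanism.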

Next steps#