broadcast

composer.utils.dist.broadcast(tensor, src)

Broadcasts the tensor to the whole group.

tensor must have the same number of elements in all processes participating in the collective. See torch.distributed.broadcast().

Parameters
  • tensor (Tensor) – Data to be sent if src is the rank of the current process, and the tensor used to save received data otherwise.

  • src (int) – Source rank
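
A minimal usage sketch, assuming the distributed process group has already been initialized (for example via the composer launcher or torchrun); dist.get_global_rank() is used here to identify the source process:

```python
import torch

from composer.utils import dist

# Every rank must allocate a tensor of the same shape and dtype.
tensor = torch.zeros(4)

# Only the source rank fills in the data to be sent.
if dist.get_global_rank() == 0:
    tensor += torch.arange(4)

# In-place collective: after the call, `tensor` on every rank
# holds the values from rank 0, i.e. [0., 1., 2., 3.].
dist.broadcast(tensor, src=0)
```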