composer.models.bert.model

Implements a BERT wrapper around a ComposerTransformer.

Classes

BERTModel

BERT model based on 🤗 Transformers.

class composer.models.bert.model.BERTModel(module, config, tokenizer=None)

Bases: composer.models.transformer_shared.ComposerTransformer

BERT model based on 🤗 Transformers.

For more information, see the 🤗 Transformers documentation.

Parameters
  • module (BertModel) – An instance of BertModel that implements the forward pass.

  • config (BertConfig) – The BertConfig object that stores the model hyperparameters.

  • tokenizer (BertTokenizer) – An instance of BertTokenizer, required to process model inputs.

To create a BERT model for language model pretraining:

from composer.models import BERTModel
import transformers

# Build a BERT config and a Hugging Face model with a language modeling head.
config = transformers.BertConfig()
hf_model = transformers.BertLMHeadModel(config=config)
# The tokenizer converts raw text into the tensor inputs the model expects.
tokenizer = transformers.BertTokenizer.from_pretrained("bert-base-uncased")
# Wrap the Hugging Face model so it can be trained with Composer.
model = BERTModel(module=hf_model, config=config, tokenizer=tokenizer)
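A minimal sketch of preparing a batch for this model, assuming the standard 🤗 Transformers tokenizer call (the sample text, padding, and sequence length below are illustrative, not part of the Composer API):

# Hypothetical example: tokenize raw text into the Dict[str, Tensor]
# batch format that the wrapped model consumes.
inputs = tokenizer(
    "The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",   # return PyTorch tensors
    padding="max_length",
    max_length=128,
    truncation=True,
)
# For language model pretraining, the labels are typically derived
# from the input ids.
inputs["labels"] = inputs["input_ids"].clone()
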
validate(batch)

Runs the validation step.

Parameters

batch (Dict) – A dictionary of inputs (Dict[str, Tensor]) that the model expects, as found in ComposerTransformer.get_model_inputs().

Returns

tuple (Tensor, Tensor) – The output from the forward pass and the correct labels. This is fed directly into ComposerModel.metrics().
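
A minimal usage sketch, assuming the `model` and `inputs` built in the examples above (these names are illustrative):

# Run one validation step. The batch must match the dictionary format
# from ComposerTransformer.get_model_inputs() and include labels.
outputs, labels = model.validate(inputs)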