gpt2

Modules

The GPT-2 model family is a set of transformer-based networks for autoregressive language modeling at various scales. This family was originally proposed by OpenAI and is trained on the OpenWebText dataset. It is useful for downstream language generation tasks, such as summarization, translation, and dialog.

See the Model Card for more details.

Functions

create_gpt2

Implements HuggingFaceModel to wrap Hugging Face GPT-2 transformers. Logs training and validation perplexity.
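
A minimal usage sketch follows. The import path and the use_pretrained keyword are assumptions based on typical Composer usage, not a verified signature:

```python
# Sketch: wrap a Hugging Face GPT-2 checkpoint as a ComposerModel.
# The import path and keyword argument below are assumptions and may
# differ across Composer versions.
from composer.models import create_gpt2

# Load pretrained GPT-2 weights rather than random initialization.
model = create_gpt2(use_pretrained=True)
```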

Hparams

These classes are used with yahp for YAML-based configuration.

GPT2Hparams

YAHP interface for GPT2Model.
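
As a sketch of the yahp workflow, hyperparameters can be loaded from a YAML file and then used to construct the model. The create and initialize_object method names and the import path are assumptions based on typical yahp usage and may vary by version:

```python
# Sketch: build a GPT2Model from a YAML config via yahp.
# GPT2Hparams.create() and initialize_object() are assumed method
# names from typical yahp-based workflows.
from composer.models import GPT2Hparams

# Parse hyperparameters from a YAML file on disk.
hparams = GPT2Hparams.create(f="gpt2.yaml")

# Construct the actual model object from the parsed hyperparameters.
model = hparams.initialize_object()
```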