Preprint 2019

Generating Long Sequences with Sparse Transformers

Code: openai/sparse_attention

Transformers are powerful sequence models, but they require time and memory that grow quadratically with the sequence length.
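The quadratic cost comes from the attention score matrix, which has one entry per pair of positions. A minimal NumPy sketch (not the paper's implementation; the single-head, identity-projection setup is an assumption for illustration) makes the n x n cost visible:

```python
import numpy as np

def dense_attention(x):
    """Single-head dot-product attention over a sequence x of shape (n, d).

    The score matrix has shape (n, n), so time and memory grow
    quadratically with the sequence length n.
    """
    n, d = x.shape
    q, k, v = x, x, x  # illustrative: identity query/key/value projections
    scores = q @ k.T / np.sqrt(d)  # shape (n, n) -- the quadratic term
    # Numerically stable softmax over each row of scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, scores.shape

out, score_shape = dense_attention(np.random.randn(8, 4))
```

Doubling the sequence length quadruples the size of `scores`; sparse attention patterns reduce this by computing only a subset of the entries.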

Applications: audio generation, image generation, language modelling