Transformers

Transformer Decoder

Introduced by Liu et al. in Generating Wikipedia by Summarizing Long Sequences

Transformer-Decoder is a modification of the Transformer encoder-decoder architecture for long sequences: it drops the encoder module, concatenates the input and output sequences into a single "sentence", and is trained as a standard language model over that combined sequence. It is used in GPT and its successors.

Source: Generating Wikipedia by Summarizing Long Sequences
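The core mechanism can be sketched as causal (left-to-right) self-attention over the concatenated source-and-target sequence. Below is a minimal NumPy sketch under simplifying assumptions: tokens are already embedded as vectors, the learned query/key/value projections are omitted, and the separator embedding is hypothetical.

```python
import numpy as np

def causal_self_attention(x):
    """Single-head self-attention with a causal mask, the core of a
    decoder-only Transformer (learned projections omitted for brevity)."""
    seq_len, d = x.shape
    scores = x @ x.T / np.sqrt(d)  # query/key dot products
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores[mask] = -np.inf         # position i may not attend to j > i
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x             # weighted sum of value vectors

# Input and output combined into one "sentence" with a separator token,
# then modeled left-to-right like any language model.
rng = np.random.default_rng(0)
source = rng.standard_normal((3, 8))  # e.g. embedded article tokens
sep = np.zeros((1, 8))                # hypothetical separator embedding
target = rng.standard_normal((2, 8))  # e.g. embedded summary tokens
tokens = np.vstack([source, sep, target])

out = causal_self_attention(tokens)
print(out.shape)  # (6, 8): one contextual vector per position
```

Because of the causal mask, the first position can attend only to itself, and each target position sees the whole source plus the target tokens before it, which is what lets a single language model perform conditional generation.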

Tasks


Task Papers Share
Document Summarization 1 25.00%
Extractive Summarization 1 25.00%
Multi-Document Summarization 1 25.00%
Sentence 1 25.00%
