Transformer-Decoder is a modification of the Transformer-Encoder-Decoder architecture for long sequences that drops the encoder module, combines the input and output sequences into a single "sentence", and is trained as a standard language model. It is used in GPT and its later revisions.
Source: Generating Wikipedia by Summarizing Long Sequences
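Below is a minimal PyTorch sketch (not the paper's code) of this setup: a decoder-only stack of causally masked self-attention blocks, with no encoder and no cross-attention, trained to predict the next token of the concatenated input-output sequence. The module structure, the `sep_id` separator token, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TransformerDecoderLM(nn.Module):
    """Decoder-only Transformer language model (illustrative sketch)."""

    def __init__(self, vocab_size, d_model=512, n_heads=8, n_layers=6, max_len=1024):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        # No encoder and no cross-attention: causally masked self-attention
        # layers form the entire "decoder-only" stack.
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # Causal mask so each position only attends to earlier positions.
        causal_mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(tokens.device)
        x = self.blocks(x, mask=causal_mask)
        return self.lm_head(x)  # next-token logits

# Training treats "input <sep> output" as one sequence and maximizes the
# likelihood of every next token (standard language-model objective).
vocab_size, sep_id = 32000, 3                       # illustrative values
model = TransformerDecoderLM(vocab_size)
source = torch.randint(4, vocab_size, (2, 100))     # e.g. source documents
target = torch.randint(4, vocab_size, (2, 20))      # e.g. target summaries
sep = torch.full((2, 1), sep_id, dtype=torch.long)
tokens = torch.cat([source, sep, target], dim=1)    # single "sentence"
logits = model(tokens[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), tokens[:, 1:].reshape(-1)
)
loss.backward()
```

At inference time the same model is fed only the input sequence plus the separator and asked to continue generating, so no separate decoder or encoder pass is needed.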
| Task | Papers | Share |
| --- | --- | --- |
| Document Summarization | 1 | 25.00% |
| Extractive Summarization | 1 | 25.00% |
| Multi-Document Summarization | 1 | 25.00% |
| Sentence | 1 | 25.00% |