ProphetNet is a sequence-to-sequence pre-training model that introduces a novel self-supervised objective, future n-gram prediction, together with a proposed n-stream self-attention mechanism. Instead of optimizing the one-step-ahead prediction of traditional sequence-to-sequence models, ProphetNet is optimized by $n$-step-ahead prediction: at each time step it predicts the next $n$ tokens simultaneously, conditioned on the previous context tokens. The future n-gram prediction objective explicitly encourages the model to plan for future tokens and helps it predict multiple future tokens.
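The future n-gram objective can be illustrated with a minimal sketch: $n$ prediction streams each produce logits, where the $k$-th stream at position $t$ predicts the token $k$ steps ahead, and the per-stream cross-entropy losses are combined with weights. This is an illustrative NumPy sketch, not the authors' implementation; the function and weight names (`future_ngram_loss`, `alphas`) are assumptions.

```python
import numpy as np

def log_softmax(x):
    # Numerically stable log-softmax over the vocabulary axis.
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def future_ngram_loss(stream_logits, targets, alphas):
    """Sketch of a future n-gram prediction loss.

    stream_logits: list of n arrays, each (T, V); stream k predicts
        the token k steps ahead of each position.
    targets: (T,) int array of gold token ids.
    alphas: per-stream weights for combining the n losses.
    """
    total = 0.0
    for k, (logits, alpha) in enumerate(zip(stream_logits, alphas)):
        logp = log_softmax(logits)
        valid = len(targets) - k          # positions with a target k steps ahead
        gold = targets[k:]                # shifted gold tokens for stream k
        ce = -logp[np.arange(valid), gold].mean()
        total += alpha * ce
    return total
```

With a single stream and weight 1.0, this reduces to the standard next-token cross-entropy of a conventional sequence-to-sequence model.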
Source: ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training
| Task | Papers | Share |
|---|---|---|
| Abstractive Text Summarization | 4 | 18.18% |
| Text Summarization | 4 | 18.18% |
| Question Generation | 4 | 18.18% |
| Text Generation | 2 | 9.09% |
| Event-Driven Trading | 1 | 4.55% |
| Future Prediction | 1 | 4.55% |
| Time Series Analysis | 1 | 4.55% |
| Code Generation | 1 | 4.55% |
| Open-Domain Dialog | 1 | 4.55% |