Autoregressive Transformers
Natural Language Processing • Transformers • 13 methods
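All the methods collected on this page share one generation scheme: each output token is predicted conditioned on the tokens produced so far, and the sequence grows one step at a time. A minimal sketch of that loop, using a toy stand-in for the model (the `toy_model` function is an assumption for illustration, not any listed paper's architecture; real autoregressive Transformers replace it with masked self-attention over the prefix):

```python
def toy_model(prefix):
    """Stand-in for a Transformer decoder: returns one score per
    vocabulary id, given the prefix generated so far. This toy version
    simply favours (last_token + 1) mod vocab_size."""
    vocab_size = 5
    scores = [0.0] * vocab_size
    scores[(prefix[-1] + 1) % vocab_size] = 1.0
    return scores

def generate(model, prompt, max_new_tokens):
    """Greedy autoregressive generation: feed the growing sequence
    back into the model at every step."""
    seq = list(prompt)
    for _ in range(max_new_tokens):
        scores = model(seq)          # condition on the full prefix
        next_token = max(range(len(scores)), key=scores.__getitem__)
        seq.append(next_token)       # extend the context for the next step
    return seq

print(generate(toy_model, [0], 4))  # -> [0, 1, 2, 3, 4]
```

The differences between the entries below lie in how the model scores the prefix (e.g. Linformer and Routing Transformer make the attention step cheaper; Transformer-XL extends the usable context), not in this outer loop.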
Methods

| Method | Paper | Year | Papers |
| --- | --- | --- | --- |
| Transformer | Attention Is All You Need | 2017 | 13964 |
| GPT | Improving Language Understanding by Generative Pre-Training | 2018 | 1206 |
| GPT-2 | Language Models are Unsupervised Multitask Learners | 2019 | 767 |
| Transformer-XL | Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context | 2019 | 64 |
| Universal Transformer | Universal Transformers | 2018 | 18 |
| Linformer | Linformer: Self-Attention with Linear Complexity | 2020 | 17 |
| Primer | Primer: Searching for Efficient Transformers for Language Modeling | 2021 | 14 |
| Levenshtein Transformer | Levenshtein Transformer | 2019 | 12 |
| Routing Transformer | Efficient Content-Based Sparse Attention with Routing Transformers | 2020 | 3 |
| Feedback Transformer | Addressing Some Limitations of Transformers with Feedback Memory | 2020 | 2 |
| Sandwich Transformer | Improving Transformer Models by Reordering their Sublayers | 2019 | 2 |
| Sinkhorn Transformer | Sparse Sinkhorn Attention | 2020 | 1 |
| DeLighT | DeLighT: Deep and Light-weight Transformer | 2020 | 1 |