Synthesized Attention Mechanisms
Natural Language Processing • Attention Mechanisms • 4 methods
Methods
| Method | Paper | Year | Papers |
|---|---|---|---|
| Dense Synthesized Attention | Synthesizer: Rethinking Self-Attention in Transformer Models | 2020 | 1 |
| Random Synthesized Attention | Synthesizer: Rethinking Self-Attention in Transformer Models | 2020 | 1 |
| Factorized Random Synthesized Attention | Synthesizer: Rethinking Self-Attention in Transformer Models | 2020 | 1 |
| Factorized Dense Synthesized Attention | Synthesizer: Rethinking Self-Attention in Transformer Models | 2020 | 1 |
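The variants above all come from the Synthesizer paper (Tay et al., 2020), which replaces the query-key dot product of standard self-attention with directly synthesized attention weights. The sketch below is a minimal NumPy illustration of three of the listed variants, with toy sizes and random weights standing in for learned parameters; all array names and dimensions are illustrative assumptions, not the authors' reference implementation:

```python
import numpy as np

def softmax(x):
    # numerically stable row-wise softmax
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n, d, k = 4, 8, 2  # toy sequence length, model dim, factorization rank
X = rng.standard_normal((n, d))
Wv = rng.standard_normal((d, d))  # value projection, as in vanilla attention
V = X @ Wv

# Dense Synthesized Attention: a per-token 2-layer MLP emits the n
# attention logits directly, so no token-token dot products are computed.
W1, W2 = rng.standard_normal((d, d)), rng.standard_normal((d, n))
B_dense = softmax(np.maximum(X @ W1, 0.0) @ W2)

# Random Synthesized Attention: the n x n logit matrix is itself a
# learned parameter, independent of the input tokens.
R = rng.standard_normal((n, n))
B_random = softmax(R)

# Factorized Random Synthesized Attention: the logit matrix is
# constrained to rank k via R = R1 @ R2.T, saving parameters.
R1, R2 = rng.standard_normal((n, k)), rng.standard_normal((n, k))
B_fact = softmax(R1 @ R2.T)

for B in (B_dense, B_random, B_fact):
    Y = B @ V  # mix value vectors with the synthesized weights
    assert Y.shape == (n, d)
    assert np.allclose(B.sum(axis=-1), 1.0)  # each row is a distribution
```

One practical consequence visible in the dense variant: because the MLP must output exactly n logits per token, the maximum sequence length is fixed at training time, unlike dot-product attention.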