LSH Attention, or Locality Sensitive Hashing Attention, replaces dot-product attention with an attention mechanism based on locality-sensitive hashing, changing its complexity from $O(L^2)$ to $O(L \log L)$, where $L$ is the length of the sequence. LSH refers to a family of hash functions (an LSH family) that maps data points into buckets so that points near each other land in the same bucket with high probability, while points far from each other are likely to fall into different buckets. LSH Attention was proposed as part of the Reformer architecture.
Source: Reformer: The Efficient Transformer
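As an illustration, below is a minimal NumPy sketch of the random-rotation (angular) hashing step described in the Reformer paper: vectors are projected with a shared random matrix and the bucket id is the argmax over the concatenation of the projection and its negation. The function name `lsh_buckets`, the fixed bucket count, and the toy input are assumptions made for this example, not part of the original page or the official implementation.

```python
# A minimal sketch (not the official Reformer code) of the angular LSH scheme:
# hash(x) = argmax([xR; -xR]) with a random rotation R shared across positions.
import numpy as np

def lsh_buckets(x, n_buckets, rng=None):
    """Assign each vector in x (shape [seq_len, d_model]) to one of n_buckets.

    Vectors with small angular distance land in the same bucket with high
    probability. n_buckets must be even.
    """
    assert n_buckets % 2 == 0
    rng = np.random.default_rng() if rng is None else rng
    d_model = x.shape[-1]
    r = rng.standard_normal((d_model, n_buckets // 2))  # shared random rotation
    rotated = x @ r                                     # [seq_len, n_buckets // 2]
    h = np.concatenate([rotated, -rotated], axis=-1)    # [seq_len, n_buckets]
    return np.argmax(h, axis=-1)                        # bucket id per position

# Usage: positions whose query/key vectors point in similar directions share a
# bucket; attention is then restricted to (chunks of) each bucket, which is what
# reduces the cost from O(L^2) to O(L log L).
seq = np.random.randn(16, 64)
print(lsh_buckets(seq, n_buckets=8))
```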
Task | Papers | Share |
---|---|---|
Language Modelling | 3 | 7.14% |
Time Series Analysis | 2 | 4.76% |
Time Series Forecasting | 2 | 4.76% |
Sentence | 2 | 4.76% |
Image Generation | 2 | 4.76% |
Survey | 2 | 4.76% |
Deep Learning | 2 | 4.76% |
Reinforcement Learning (RL) | 2 | 4.76% |
Deblurring | 1 | 2.38% |