A Gated Linear Unit, or GLU, computes:
$$ \text{GLU}\left(a, b\right) = a\otimes \sigma\left(b\right) $$
It is used in natural language processing architectures, for example the Gated CNN, where $b$ is the gate that controls what information from $a$ is passed up to the following layer. Intuitively, for a language modeling task, the gating mechanism allows selection of the words or features that are important for predicting the next word. The GLU retains non-linear capabilities, yet provides a linear path for the gradient, which mitigates the vanishing gradient problem.
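As a concrete illustration, below is a minimal PyTorch sketch of a GLU layer; the module and dimension names are illustrative, not from the paper. Note that PyTorch also ships a built-in `torch.nn.GLU`, which splits its input in half along a chosen dimension and applies the same gating.

```python
import torch
import torch.nn as nn

class GLU(nn.Module):
    """Gated Linear Unit: a value path `a` gated elementwise by sigmoid(b)."""

    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        # One linear projection produces both a and b (2 * d_out features).
        self.proj = nn.Linear(d_in, 2 * d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a, b = self.proj(x).chunk(2, dim=-1)  # split into value and gate
        # The gate sigmoid(b) decides how much of a flows to the next layer;
        # the a-path itself stays linear, so gradients have a linear route.
        return a * torch.sigmoid(b)

# Usage: gate a batch of token features.
x = torch.randn(4, 16, 512)   # (batch, sequence, features)
glu = GLU(512, 512)
print(glu(x).shape)           # torch.Size([4, 16, 512])
```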
Source: Language Modeling with Gated Convolutional Networks
| Task | Papers | Share |
|---|---|---|
| Language Modelling | 92 | 8.07% |
| Language Modeling | 74 | 6.49% |
| Question Answering | 47 | 4.12% |
| Decoder | 42 | 3.68% |
| Sentence | 39 | 3.42% |
| Text Generation | 33 | 2.89% |
| Translation | 27 | 2.37% |
| Retrieval | 27 | 2.37% |
| Machine Translation | 24 | 2.11% |