GloVe embeddings are a type of word embedding that encodes the co-occurrence probability ratio between two words as vector differences. GloVe uses a weighted least squares objective $J$ that minimizes the difference between the dot product of the vectors of two words and the logarithm of their number of co-occurrences:
$$ J=\sum_{i, j=1}^{V} f\left(X_{ij}\right)\left(w_{i}^{T}\tilde{w}_{j} + b_{i} + \tilde{b}_{j} - \log X_{ij}\right)^{2} $$
where $w_{i}$ and $b_{i}$ are the word vector and bias of word $i$, $\tilde{w}_{j}$ and $\tilde{b}_{j}$ are the context word vector and bias of word $j$, $X_{ij}$ is the number of times word $j$ occurs in the context of word $i$, and $f$ is a weighting function that assigns lower weight to both rare and very frequent co-occurrences.
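As a rough illustration, the sketch below evaluates $J$ over a dense co-occurrence matrix with NumPy. The weighting function follows the paper's choice, $f(x) = (x/x_{\max})^{\alpha}$ for $x < x_{\max}$ and $1$ otherwise, with $x_{\max} = 100$ and $\alpha = 0.75$; the function names and the toy data are illustrative, not part of any reference implementation.

```python
import numpy as np

def glove_weight(x, x_max=100.0, alpha=0.75):
    """Weighting function f: down-weights rare co-occurrences and caps frequent ones."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(W, W_tilde, b, b_tilde, X):
    """Weighted least squares objective J over a dense co-occurrence matrix X (V x V).

    W, W_tilde: (V, d) word and context embedding matrices
    b, b_tilde: (V,) word and context bias vectors
    """
    mask = X > 0  # log X_ij is only defined for observed co-occurrences
    log_X = np.log(np.where(mask, X, 1.0))
    diff = W @ W_tilde.T + b[:, None] + b_tilde[None, :] - log_X
    return np.sum(glove_weight(X) * mask * diff ** 2)

# Toy usage with random co-occurrence counts and embeddings (hypothetical data).
V, d = 50, 8
rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(V, V)).astype(float)
W, W_tilde = rng.normal(size=(V, d)), rng.normal(size=(V, d))
b, b_tilde = np.zeros(V), np.zeros(V)
print(glove_loss(W, W_tilde, b, b_tilde, X))
```

In practice the sum is taken only over nonzero entries of $X$ and the parameters are trained with stochastic gradient methods such as AdaGrad, rather than evaluated densely as in this toy example.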
Source: GloVe: Global Vectors for Word Representation
| Task | Papers | Share |
|---|---|---|
| Sentiment Analysis | 36 | 6.08% |
| Sentence | 30 | 5.07% |
| General Classification | 29 | 4.90% |
| Text Classification | 25 | 4.22% |
| Language Modelling | 21 | 3.55% |
| Classification | 19 | 3.21% |
| Machine Translation | 17 | 2.87% |
| Question Answering | 16 | 2.70% |
| Language Modeling | 15 | 2.53% |