GloVe embeddings are a type of word embedding that encodes the ratio of cooccurrence probabilities between two words as vector differences. GloVe uses a weighted least-squares objective $J$ that minimizes the difference between the dot product of the vectors of two words and the logarithm of their number of cooccurrences:
$$ J=\sum_{i, j=1}^{V}f\left(X_{ij}\right)\left(w^{T}_{i}\tilde{w}_{j} + b_{i} + \tilde{b}_{j} - \log{X_{ij}}\right)^{2} $$
where $w_{i}$ and $b_{i}$ are the word vector and bias respectively of word $i$, $\tilde{w}_{j}$ and $\tilde{b}_{j}$ are the context word vector and bias respectively of word $j$, $X_{ij}$ is the number of times word $i$ occurs in the context of word $j$, and $f$ is a weighting function that assigns lower weights to rare and frequent cooccurrences.
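The objective above can be sketched directly in NumPy. This is a minimal illustration, not the reference implementation: the weighting function below uses the form and defaults ($x_{\max}=100$, $\alpha=3/4$) proposed in the GloVe paper, while the toy cooccurrence matrix and random initialization are invented for the example.

```python
import numpy as np

def weight(x, x_max=100.0, alpha=0.75):
    # f(X_ij): down-weights rare pairs and caps the influence of very frequent ones
    return np.minimum((x / x_max) ** alpha, 1.0)

def glove_loss(X, W, W_tilde, b, b_tilde):
    """Weighted least-squares GloVe objective J, summed over nonzero cooccurrences."""
    J = 0.0
    rows, cols = np.nonzero(X)
    for i, j in zip(rows, cols):
        # squared gap between the model's score and the log cooccurrence count
        diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
        J += weight(X[i, j]) * diff ** 2
    return J

# toy setup: vocabulary of V=3 words, d=2 dimensional vectors
rng = np.random.default_rng(0)
V, d = 3, 2
X = np.array([[0.0, 2.0, 5.0],
              [2.0, 0.0, 1.0],
              [5.0, 1.0, 0.0]])   # symmetric toy cooccurrence counts
W = rng.normal(size=(V, d))       # word vectors
W_tilde = rng.normal(size=(V, d)) # context word vectors
b = np.zeros(V)                   # word biases
b_tilde = np.zeros(V)             # context biases
print(glove_loss(X, W, W_tilde, b, b_tilde))
```

In practice $J$ is minimized with AdaGrad over the nonzero entries of $X$ only, which is what makes training tractable on large corpora despite the $V \times V$ matrix.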
Source: GloVe: Global Vectors for Word Representation

| Task | Papers | Share |
|---|---|---|
| Sentiment Analysis | 33 | 7.08% |
| General Classification | 29 | 6.22% |
| Text Classification | 23 | 4.94% |
| Classification | 18 | 3.86% |
| Language Modelling | 18 | 3.86% |
| Machine Translation | 16 | 3.43% |
| Question Answering | 16 | 3.43% |
| Word Similarity | 14 | 3.00% |
| Retrieval | 13 | 2.79% |