# Word Embeddings

970 papers with code • 0 benchmarks • 49 datasets

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network approaches that train on an NLP task such as language modeling or document classification.
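
The core idea can be illustrated with a toy example: each word maps to a dense real-valued vector, and semantic similarity is measured geometrically, typically with cosine similarity. The vectors below are hypothetical stand-ins, not outputs of a trained model.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings (toy values, not from a trained model):
# each vocabulary word is mapped to a dense vector of real numbers.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.75, 0.70, 0.12, 0.21]),
    "apple": np.array([0.10, 0.05, 0.90, 0.70]),
}

def cosine(u, v):
    """Cosine similarity: the standard closeness measure in embedding space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words end up with a higher cosine similarity than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

With real pretrained vectors (e.g. from Word2Vec or GloVe), the same lookup-and-compare pattern applies; only the table is learned from a corpus instead of hand-set.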

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)


# Enriching Word Vectors with Subword Information

A vector representation is associated with each character $n$-gram; words are represented as the sum of these representations.
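
The subword scheme can be sketched as follows: extract the character $n$-grams of a word (with boundary markers, as in FastText), look each one up in an embedding table, and sum. The bucket count, dimension, and hashing below are toy choices for illustration.

```python
import numpy as np

def char_ngrams(word, n_min=3, n_max=6):
    # FastText wraps the word in boundary markers before extracting n-grams,
    # so prefixes and suffixes get distinct representations.
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1) for i in range(len(w) - n + 1)]

# Hypothetical tiny embedding table: each n-gram is hashed into a bucket row.
rng = np.random.default_rng(0)
table = rng.normal(size=(1000, 8))  # 1000 buckets, 8-dim vectors (toy sizes)

def word_vector(word):
    """Word vector = sum of its character n-gram vectors."""
    grams = char_ngrams(word)
    return np.sum([table[hash(g) % len(table)] for g in grams], axis=0)
```

A useful side effect of this construction is that out-of-vocabulary words still get a vector, since their $n$-grams are shared with known words.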

# FastText.zip: Compressing text classification models

12 Dec 2016

We consider the problem of producing compact architectures for text classification, such that the full model fits in a limited amount of memory.

# Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

Many modern NLP systems rely on word embeddings, previously trained in an unsupervised manner on large corpora, as base features.

# Universal Sentence Encoder

29 Mar 2018

For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance.

# Word Translation Without Parallel Data

We finally describe experiments on the English-Esperanto low-resource language pair, for which only a limited amount of parallel data exists, to show the potential impact of our method in fully unsupervised machine translation.

# Evaluation of sentence embeddings in downstream and linguistic probing tasks

16 Jun 2018

Despite the rapid pace at which new sentence embedding methods are developed, comprehensive evaluations comparing these techniques are still hard to find.

# Named Entity Recognition with Bidirectional LSTM-CNNs

Named entity recognition is a challenging task that has traditionally required large amounts of knowledge in the form of feature engineering and lexicons to achieve high performance.

# Topic Modeling in Embedding Spaces

To this end, we develop the Embedded Topic Model (ETM), a generative model of documents that marries traditional topic models with word embeddings.
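
In the ETM, each topic's distribution over the vocabulary is obtained by taking the softmax of inner products between the word embeddings and that topic's embedding. The sketch below shows just this topic-word construction, with random toy embeddings in place of learned ones.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy sizes: a vocabulary of 5 words and 2 topics embedded in 3 dimensions.
rng = np.random.default_rng(1)
rho = rng.normal(size=(5, 3))    # word embeddings, one row per vocabulary word
alpha = rng.normal(size=(2, 3))  # topic embeddings, one row per topic

# Topic-word distributions: softmax over word-topic embedding inner products,
# so each topic is a proper probability distribution over the vocabulary.
beta = np.array([softmax(rho @ a) for a in alpha])  # shape (topics, vocab)
```

Because topics live in the same space as words, a topic's top words are simply the vocabulary items whose embeddings point in a similar direction.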

# Breaking the Softmax Bottleneck: A High-Rank RNN Language Model

We formulate language modeling as a matrix factorization problem, and show that the expressiveness of Softmax-based models (including the majority of neural language models) is limited by a Softmax bottleneck.
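
The factorization view is easy to demonstrate numerically: the logit matrix is a product of an $N \times d$ matrix of context vectors and the transpose of a $V \times d$ output-embedding matrix, so its rank is capped at $d$, regardless of how expressive the true log-probability matrix would need to be. Toy sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, V, d = 50, 30, 5           # contexts, vocabulary size, hidden dim (toy sizes)
H = rng.normal(size=(N, d))   # context / hidden-state vectors
W = rng.normal(size=(V, d))   # output word embeddings

logits = H @ W.T              # the matrix the Softmax factorizes
# Its rank is bounded by the hidden dimension d, even though the true
# log-probability matrix of natural language may require a much higher rank.
print(np.linalg.matrix_rank(logits))
```

This is the "bottleneck": no choice of $H$ and $W$ can produce a logit matrix of rank above $d$, motivating higher-rank formulations such as mixtures of Softmaxes.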

# Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings

Geometrically, gender bias is first shown to be captured by a direction in the word embedding.
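
A minimal sketch of this geometric picture: approximate the gender direction with the difference of a definitional pair, then project other words onto it; debiasing removes the component along that direction. The vectors here are hypothetical toy values, not trained embeddings, and real debiasing averages over several definitional pairs.

```python
import numpy as np

# Hypothetical toy vectors standing in for trained word embeddings.
vec = {
    "he":       np.array([ 1.0, 0.2, 0.1]),
    "she":      np.array([-1.0, 0.2, 0.1]),
    "engineer": np.array([ 0.4, 0.9, 0.3]),
    "nurse":    np.array([-0.5, 0.8, 0.3]),
}

# Approximate the gender direction from a single definitional pair (toy setup).
g = vec["he"] - vec["she"]
g = g / np.linalg.norm(g)

def gender_projection(word):
    """Scalar projection onto the gender direction; the sign shows the lean."""
    return float(np.dot(vec[word], g))

def debias(v):
    """Remove the component along the gender direction from a neutral word."""
    return v - np.dot(v, g) * g
```

After `debias`, a gender-neutral word's vector is orthogonal to the gender direction, so it no longer leans toward either definitional word.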
