About

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
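At its core, the mapping described above is a lookup table from vocabulary words to dense real-valued vectors. A minimal sketch (the vocabulary, dimensionality, and random vectors below are illustrative placeholders, not a trained model; in practice the vectors are learned, e.g. with word2vec or GloVe):

```python
import numpy as np

# Illustrative vocabulary and embedding dimensionality (assumptions,
# not from any specific model).
vocab = ["king", "queen", "apple"]
dim = 4
rng = np.random.default_rng(0)

# A word embedding is a lookup table: word -> vector of real numbers.
# Real systems learn these vectors from text; here they are random.
embeddings = {word: rng.normal(size=dim) for word in vocab}

def embed(word):
    """Map a word to its vector; out-of-vocabulary words get zeros."""
    return embeddings.get(word, np.zeros(dim))

def cosine(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

similarity = cosine(embed("king"), embed("queen"))
```

Downstream NLP models consume these vectors instead of raw strings, which is what makes the representations reusable across tasks.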

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.


Latest papers without code

Evaluation Of Word Embeddings From Large-Scale French Web Content

5 May 2021

Distributed word representations are widely used across natural language processing, and word vectors pre-trained on huge text corpora achieve high performance on many different NLP tasks.

WORD EMBEDDINGS

Large-scale Taxonomy Induction Using Entity and Word Embeddings

4 May 2021

Taxonomies are an important ingredient of knowledge organization, and serve as a backbone for more sophisticated knowledge representations in intelligent systems, such as formal ontologies.

WORD EMBEDDINGS

Discourse Relation Embeddings: Representing the Relations between Discourse Segments in Social Media

4 May 2021

Discourse relations are typically modeled as a discrete class that characterizes the relation between segments of text (e.g. causal explanations, expansions).

RELATION CLASSIFICATION WORD EMBEDDINGS

Impact of Gender Debiased Word Embeddings in Language Modeling

3 May 2021

Gender, race and social biases have recently been detected as evident examples of unfairness in applications of Natural Language Processing.

FAIRNESS LANGUAGE MODELLING WORD EMBEDDINGS

Mitigating Political Bias in Language Models Through Reinforced Calibration

30 Apr 2021

Current large-scale language models can be politically biased as a result of the data they are trained on, potentially causing serious problems when they are deployed in real-world settings.

WORD EMBEDDINGS

Cross-lingual hate speech detection based on multilingual domain-specific word embeddings

30 Apr 2021

Our proposal constitutes, to the best of our knowledge, the first attempt for constructing multilingual specific-task representations.

HATE SPEECH DETECTION TRANSFER LEARNING WORD EMBEDDINGS

A Short Survey of Pre-trained Language Models for Conversational AI - A New Age in NLP

22 Apr 2021

Building a dialogue system that can communicate naturally with humans is a challenging yet interesting problem of agent-based computing.

DECISION MAKING WORD EMBEDDINGS

Deep Clustering with Measure Propagation

18 Apr 2021

For example, deep embedded clustering (DEC) has greatly improved the unsupervised clustering performance, by using stacked autoencoders for representation learning.

DEEP CLUSTERING SHORT TEXT CLUSTERING WORD EMBEDDINGS

Group-Sparse Matrix Factorization for Transfer Learning of Word Embeddings

18 Apr 2021

We propose a novel group-sparse penalty that exploits this sparsity to perform transfer learning when there is very little text data available in the target domain -- e.g., a single article of text.

GENERALIZATION BOUNDS LEARNING WORD EMBEDDINGS TRANSFER LEARNING

Improving Neural Machine Translation with Compact Word Embedding Tables

18 Apr 2021

Embedding matrices are key components in neural natural language processing (NLP) models, responsible for providing numerical representations of input tokens. (In this paper, words and subwords are referred to as "tokens", and the term "embedding" refers only to embeddings of inputs.)

MACHINE TRANSLATION WORD EMBEDDINGS