Learning Word Embeddings
23 papers with code • 0 benchmarks • 0 datasets
Most implemented papers
One Embedder, Any Task: Instruction-Finetuned Text Embeddings
Our analysis suggests that INSTRUCTOR is robust to changes in instructions, and that instruction finetuning mitigates the challenge of training a single model on diverse datasets.
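As a rough illustration of instruction-finetuned embeddings, the sketch below pairs each input with a task instruction before encoding, so the same model produces task-adapted vectors without retraining. The InstructorEmbedding package and the hkunlp/instructor-large checkpoint come from the paper's public release, but the exact names and instructions here should be treated as assumptions.

    # Minimal sketch of instruction-conditioned embeddings
    # (assumes the InstructorEmbedding package from the INSTRUCTOR release).
    from InstructorEmbedding import INSTRUCTOR

    model = INSTRUCTOR('hkunlp/instructor-large')  # checkpoint name is an assumption

    # Each input is an [instruction, text] pair; changing the instruction
    # changes the embedding without retraining the model.
    pairs = [
        ['Represent the science sentence for retrieval:',
         'Word embeddings map tokens to dense vectors.'],
        ['Represent the question for retrieving supporting documents:',
         'How are word embeddings trained?'],
    ]
    embeddings = model.encode(pairs)
    print(embeddings.shape)  # (2, embedding_dim)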
Speech2Vec: A Sequence-to-Sequence Framework for Learning Word Embeddings from Speech
In this paper, we propose Speech2Vec, a novel deep neural network architecture for learning fixed-length vector representations of audio segments excised from a speech corpus. The resulting vectors carry semantic information about the underlying spoken words and lie close together in the embedding space when those words are semantically similar.
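The core mechanism is a sequence-to-sequence encoder whose bottleneck turns a variable-length audio segment into one fixed-length word vector. Below is a minimal PyTorch sketch of the encoder side only; the feature dimension, GRU choice, and sizes are illustrative assumptions, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class SegmentEncoder(nn.Module):
        """Encode a variable-length audio segment (e.g. MFCC frames)
        into a single fixed-length vector. Hyperparameters are illustrative."""
        def __init__(self, n_features=13, embed_dim=50):
            super().__init__()
            self.rnn = nn.GRU(n_features, embed_dim, batch_first=True)

        def forward(self, frames):             # frames: (batch, time, n_features)
            _, last_hidden = self.rnn(frames)
            return last_hidden.squeeze(0)      # (batch, embed_dim) fixed-length vector

    encoder = SegmentEncoder()
    segments = torch.randn(4, 120, 13)         # 4 segments of 120 frames each
    print(encoder(segments).shape)             # torch.Size([4, 50])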
WordRank: Learning Word Embeddings via Robust Ranking
Based on this insight, we propose WordRank, a novel framework that efficiently estimates word representations via robust ranking, in which attention and robustness to noise are achieved through DCG-like ranking losses.
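To make the ranking idea concrete, the sketch below estimates the rank of the true context among sampled negatives from score comparisons and passes it through a concave log transform, so errors near the top of the ranking dominate the loss (a DCG-like weighting). This is a generic illustration of a ranking-based word-context objective, not the exact WordRank formulation.

    import numpy as np

    def ranking_loss(word_vec, true_ctx_vec, neg_ctx_vecs):
        """Generic ranking-style loss for one (word, context) pair:
        estimate the rank of the true context among negatives, then
        apply a concave transform that emphasizes top ranks."""
        true_score = word_vec @ true_ctx_vec
        neg_scores = neg_ctx_vecs @ word_vec
        # Smooth indicator of "negative outranks the true context".
        violations = 1.0 / (1.0 + np.exp(true_score - neg_scores))
        estimated_rank = 1.0 + violations.sum()
        return np.log2(1.0 + estimated_rank)

    rng = np.random.default_rng(0)
    w, c = rng.normal(size=50), rng.normal(size=50)
    negatives = rng.normal(size=(20, 50))
    print(ranking_loss(w, c, negatives))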
Skip-gram word embeddings in hyperbolic space
Recent work has demonstrated that embeddings of tree-like graphs in hyperbolic space surpass their Euclidean counterparts in performance by a large margin.
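Concretely, "hyperbolic" here typically means placing word vectors inside the Poincare ball and replacing Euclidean distance with the hyperbolic distance below. The distance formula is standard; the toy points and the reading of "general vs. specific" words are only an illustration, and the training details of hyperbolic skip-gram are not shown.

    import numpy as np

    def poincare_distance(u, v, eps=1e-9):
        """Hyperbolic distance between two points inside the unit
        Poincare ball (requires ||u|| < 1 and ||v|| < 1)."""
        sq_diff = np.sum((u - v) ** 2)
        denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
        return np.arccosh(1.0 + 2.0 * sq_diff / (denom + eps))

    root = np.array([0.0, 0.0])     # near the origin: broad, general terms
    leaf = np.array([0.0, 0.95])    # near the boundary: specific terms
    print(poincare_distance(root, leaf))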
Neural Graph Embedding Methods for Natural Language Processing
Knowledge graphs are structured representations of facts in a graph, where nodes represent entities and edges represent relationships between them.
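For context, one of the simplest neural knowledge-graph embedding methods, TransE, scores a (head, relation, tail) triple by how close head + relation lands to tail. The sketch below uses random toy vectors purely to show the scoring rule; it is not drawn from the surveyed paper itself.

    import numpy as np

    def transe_score(head, relation, tail):
        """TransE plausibility score for a (head, relation, tail) triple:
        smaller distance between head + relation and tail means the
        fact is judged more plausible (score is the negated distance)."""
        return -np.linalg.norm(head + relation - tail)

    rng = np.random.default_rng(0)
    paris, france, capital_of = (rng.normal(size=32) for _ in range(3))
    print(transe_score(paris, capital_of, france))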
The Mixing method: low-rank coordinate descent for semidefinite programming with diagonal constraints
In this paper, we propose a low-rank coordinate descent approach to structured semidefinite programming with diagonal constraints.
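A minimal sketch of the low-rank coordinate-descent idea for an SDP with diagonal constraints (e.g. the max-cut relaxation) is given below: X is parameterized as V^T V with unit-norm columns, and each column is updated in closed form against the others. The rank, iteration count, and stopping rule here are assumptions made for illustration.

    import numpy as np

    def mixing_method(C, rank=8, iters=100, seed=0):
        """Coordinate descent for  min <C, X>  s.t. diag(X) = 1, X PSD,
        using the factorization X = V.T @ V with unit-norm columns v_i.
        Each column has a closed-form minimizing update given the others."""
        n = C.shape[0]
        rng = np.random.default_rng(seed)
        V = rng.normal(size=(rank, n))
        V /= np.linalg.norm(V, axis=0)              # unit-norm columns
        for _ in range(iters):
            for i in range(n):
                g = V @ C[i] - C[i, i] * V[:, i]    # sum over j != i of C[i, j] * v_j
                norm = np.linalg.norm(g)
                if norm > 1e-12:
                    V[:, i] = -g / norm             # minimizing update keeps ||v_i|| = 1
        return V

    rng = np.random.default_rng(1)
    A = rng.normal(size=(10, 10))
    C = (A + A.T) / 2                               # toy symmetric cost matrix
    V = mixing_method(C)
    print(np.round(np.sum(C * (V.T @ V)), 3))       # objective value <C, X>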
Dict2vec: Learning Word Embeddings using Lexical Dictionaries
Learning word embeddings on large unlabeled corpora has been shown to be successful in improving many natural language tasks.
Grammatical Error Detection Using Error- and Grammaticality-Specific Word Embeddings
In this study, we improve grammatical error detection by learning word embeddings that consider grammaticality and error patterns.