Bilingual Lexicon Induction
34 papers with code • 0 benchmarks • 0 datasets
Translate words from one language to another.
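At its simplest, bilingual lexicon induction retrieves, for each source word, the nearest target word in a shared embedding space. A minimal sketch with hand-made toy vectors (the vocabularies and embedding values below are illustrative assumptions, not from any paper on this page):

```python
import numpy as np

# Toy "already aligned" embeddings for illustration only; real systems
# obtain alignment via a learned cross-lingual mapping.
src_vocab = ["chat", "chien", "maison"]   # French words
tgt_vocab = ["cat", "dog", "house"]       # English words
src_emb = np.array([[1.0, 0.1], [0.1, 1.0], [0.5, 0.5]])
tgt_emb = np.array([[0.9, 0.2], [0.2, 0.9], [0.6, 0.4]])

def induce_lexicon(src_emb, tgt_emb, src_vocab, tgt_vocab):
    """Map each source word to its cosine nearest neighbour in the target space."""
    s = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    t = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sims = s @ t.T                 # cosine similarity matrix
    best = sims.argmax(axis=1)     # nearest target word per source word
    return {src_vocab[i]: tgt_vocab[j] for i, j in enumerate(best)}

lexicon = induce_lexicon(src_emb, tgt_emb, src_vocab, tgt_vocab)
# lexicon == {"chat": "cat", "chien": "dog", "maison": "house"}
```

Plain nearest-neighbour retrieval suffers from hubness in high dimensions; many of the papers below replace it with corrected criteria such as CSLS.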
Latest papers with no code
Bilingual Lexicon Induction for Low-Resource Languages using Graph Matching via Optimal Transport
Bilingual lexicons form a critical component of various NLP applications, including unsupervised and semi-supervised machine translation and cross-lingual information retrieval.
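The title above casts lexicon induction as matching via optimal transport. A minimal illustration of the core tool, entropy-regularized optimal transport (Sinkhorn iterations) over a word-to-word cost matrix; the cost values below are invented for illustration, and this is not the paper's full graph-matching method:

```python
import numpy as np

def sinkhorn(cost, reg=0.05, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations."""
    K = np.exp(-cost / reg)          # Gibbs kernel of the cost matrix
    u = np.ones(cost.shape[0])
    for _ in range(n_iters):
        v = 1.0 / (K.T @ u)          # scale columns to match target marginal
        u = 1.0 / (K @ v)            # scale rows to match source marginal
    return u[:, None] * K * v[None, :]   # the transport plan

# Hypothetical dissimilarities between 3 source and 3 target words;
# lower cost = more plausible translation pair.
cost = np.array([[0.1, 0.9, 0.8],
                 [0.9, 0.2, 0.7],
                 [0.8, 0.9, 0.1]])
plan = sinkhorn(cost)
matches = plan.argmax(axis=1)    # greedy readout of the soft matching
# matches picks the low-cost diagonal: [0, 1, 2]
```

The transport plan is a soft assignment between the two vocabularies; hardening it (e.g. by row-wise argmax, as above) yields the induced lexicon.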
Investigating the Use of BERT Anchors for Bilingual Lexicon Induction with Minimal Supervision
This paper investigates the use of static anchors from transformer architectures for the task of Bilingual Lexicon Induction.
Prix-LM: Pretraining for Multilingual Knowledge Base Construction
Constructing multilingual knowledge bases requires representing multilingual knowledge in a shared, unified space.
Evaluating a Joint Training Approach for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora on Lower-resource Languages
Cross-lingual word embeddings provide a way for information to be transferred between languages.
Beyond Offline Mapping: Learning Cross-lingual Word Embeddings through Context Anchoring
Recent research on cross-lingual word embeddings has been dominated by unsupervised mapping approaches that align monolingual embeddings.
Improving Zero-Shot Cross-lingual Transfer for Multilingual Question Answering over Knowledge Graph
That is, training data is available only in a high-resource language, yet the system must answer multilingual questions without any labeled data in the target languages.
Word Embedding Transformation for Robust Unsupervised Bilingual Lexicon Induction
The embeddings of the two languages are aligned with each other through rotation and scaling.
Bilingual Lexicon Induction via Unsupervised Bitext Construction and Word Alignment
Bilingual lexicons map words in one language to their translations in another, and are typically induced by learning linear projections to align monolingual word embedding spaces.
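The standard linear-projection baseline mentioned above fits an orthogonal map from a seed dictionary via the closed-form Procrustes solution. A self-contained sketch, assuming synthetic data where the target space is an exact rotation of the source space (so the map is recoverable exactly):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: target embeddings are a rotated copy of the source
# embeddings, so an orthogonal projection can align the spaces exactly.
n, d = 100, 4
X = rng.standard_normal((n, d))            # source-language seed embeddings
theta = 0.7
R = np.eye(d)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]
Y = X @ R                                  # target-language seed embeddings

# Orthogonal Procrustes: W = U V^T, where U S V^T = svd(X^T Y), solves
# min_W ||X W - Y||_F subject to W being orthogonal.
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

aligned = X @ W   # source embeddings projected into the target space
```

Once `W` is learned from the seed pairs, translations for the full vocabulary are induced by nearest-neighbour search between `X @ W` and the target embeddings.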
Joint Training for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora
In this paper, we propose a novel method for learning cross-lingual word embeddings that incorporates sub-word information during training and learns high-quality embeddings from modest amounts of monolingual data and a bilingual lexicon.