Bilingual Lexicon Induction

34 papers with code • 0 benchmarks • 0 datasets

Translate words from one language to another.

Latest papers with no code

Bilingual Lexicon Induction for Low-Resource Languages using Graph Matching via Optimal Transport

no code yet • ACL ARR January 2022

Bilingual lexicons form a critical component of various NLP applications, including unsupervised and semi-supervised machine translation and cross-lingual information retrieval.

Investigating the Use of BERT Anchors for Bilingual Lexicon Induction with Minimal Supervision

no code yet • ACL ARR November 2021

This paper investigates the use of static anchors from transformer architectures for the task of Bilingual Lexicon Induction.

Prix-LM: Pretraining for Multilingual Knowledge Base Construction

no code yet • ACL ARR November 2021

To achieve this, it is crucial to represent multilingual knowledge in a shared/unified space.

Beyond Offline Mapping: Learning Cross-lingual Word Embeddings through Context Anchoring

no code yet • ACL 2021

Recent research on cross-lingual word embeddings has been dominated by unsupervised mapping approaches that align monolingual embeddings.

Improving Zero-Shot Cross-lingual Transfer for Multilingual Question Answering over Knowledge Graph

no code yet • NAACL 2021

That is, we can only access training data in a high-resource language, yet need to answer multilingual questions without any labeled data in the target languages.

Word Embedding Transformation for Robust Unsupervised Bilingual Lexicon Induction

no code yet • 26 May 2021

The embeddings of the two languages are aligned with each other through rotation and scaling.
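
The excerpt does not spell out the exact procedure, but a standard way to match two embedding spaces by rotation and scaling is scaled orthogonal Procrustes. The NumPy sketch below illustrates that operation only, assuming (purely for illustration) that the rows of the two matrices are already paired; the toy data and function name are invented, and the paper itself works in an unsupervised setting.

```python
import numpy as np

def align_rotate_scale(X, Y):
    """Scaled orthogonal Procrustes: find a rotation R and a scalar s
    minimising ||s * X @ R - Y||_F for row-paired matrices X, Y."""
    U, S, Vt = np.linalg.svd(X.T @ Y)
    R = U @ Vt                      # optimal rotation
    s = S.sum() / (X ** 2).sum()    # optimal isotropic scale
    return s, R

# Toy check: the "target" space is a noisy scaled rotation of the "source" space.
rng = np.random.default_rng(0)
src = rng.normal(size=(1000, 300))
true_R, _ = np.linalg.qr(rng.normal(size=(300, 300)))
tgt = 0.7 * src @ true_R + 0.01 * rng.normal(size=(1000, 300))

s, R = align_rotate_scale(src, tgt)
print("recovered scale:", round(s, 3))                 # close to 0.7
print("residual:", np.linalg.norm(s * src @ R - tgt))  # roughly the injected noise
```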

Bilingual Lexicon Induction via Unsupervised Bitext Construction and Word Alignment

no code yet • ACL 2021

Bilingual lexicons map words in one language to their translations in another, and are typically induced by learning linear projections to align monolingual word embedding spaces.
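
As a concrete illustration of the linear-projection baseline this sentence describes (not of the paper's own bitext-construction and word-alignment method), here is a minimal NumPy sketch: a projection is fit on a small seed dictionary by least squares, and translations are then induced by cosine nearest neighbour. The words, embeddings, and function name are invented for the example.

```python
import numpy as np

def induce_lexicon(src_emb, tgt_emb, src_words, tgt_words, seed_pairs):
    """Fit a linear projection W on seed pairs (least squares), then map
    every source vector into the target space and take the cosine
    nearest neighbour as its induced translation."""
    s_idx = {w: i for i, w in enumerate(src_words)}
    t_idx = {w: i for i, w in enumerate(tgt_words)}
    X = np.stack([src_emb[s_idx[s]] for s, t in seed_pairs])
    Y = np.stack([tgt_emb[t_idx[t]] for s, t in seed_pairs])
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)

    proj = src_emb @ W
    proj /= np.linalg.norm(proj, axis=1, keepdims=True)
    tgt_n = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    best = np.argmax(proj @ tgt_n.T, axis=1)
    return {w: tgt_words[best[i]] for i, w in enumerate(src_words)}

# Toy usage: the source space is an exact rotation of the target space.
rng = np.random.default_rng(1)
src_words, tgt_words = ["perro", "gato", "casa", "libro"], ["dog", "cat", "house", "book"]
tgt_emb = rng.normal(size=(4, 3))
rot, _ = np.linalg.qr(rng.normal(size=(3, 3)))
src_emb = tgt_emb @ rot
seed = [("perro", "dog"), ("gato", "cat"), ("casa", "house")]
print(induce_lexicon(src_emb, tgt_emb, src_words, tgt_words, seed)["libro"])  # "book"
```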

Joint Training for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora

no code yet • Joint Conference on Lexical and Computational Semantics 2020

In this paper, we propose a novel method for learning cross-lingual word embeddings that incorporates sub-word information during training and is able to learn high-quality embeddings from modest amounts of monolingual data and a bilingual lexicon.
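
The excerpt names two ingredients, sub-word information and a bilingual lexicon used during joint training. The toy NumPy sketch below illustrates just those two pieces, assuming fastText-style character n-grams and a squared-distance alignment term over dictionary pairs; the monolingual training objectives are omitted, and all names and data here (`char_ngrams`, `word_vec`, the toy vocabulary) are invented rather than taken from the paper.

```python
import numpy as np

def char_ngrams(word, n=3):
    """Character n-grams with boundary markers (fastText-style sub-words)."""
    w = f"<{word}>"
    return [w[i:i + n] for i in range(len(w) - n + 1)]

# Hypothetical toy vocabulary (both languages) and a tiny bilingual lexicon.
vocab = ["cat", "cats", "dog", "gato", "gatos", "perro"]
lexicon = [("cat", "gato"), ("dog", "perro")]

# A single embedding table shared by all n-grams of both languages.
ngrams = sorted({g for w in vocab for g in char_ngrams(w)})
idx = {g: i for i, g in enumerate(ngrams)}
E = 0.1 * np.random.default_rng(0).normal(size=(len(ngrams), 50))

def word_vec(word):
    """A word vector is the mean of its n-gram vectors, so inflected forms
    such as 'cat'/'cats' share most of their parameters."""
    return E[[idx[g] for g in char_ngrams(word)]].mean(axis=0)

def lexicon_loss():
    """Alignment term pulling dictionary pairs together; in joint training
    this would be optimised alongside the monolingual objectives."""
    return sum(np.sum((word_vec(s) - word_vec(t)) ** 2) for s, t in lexicon)

print("alignment loss before any training:", round(lexicon_loss(), 3))
```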