
Cross-Lingual Transfer

6 papers with code · Natural Language Processing
Subtask of Cross-Lingual

State-of-the-art leaderboards

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond

26 Dec 2018 facebookresearch/LASER

We introduce an architecture to learn joint multilingual sentence representations for 93 languages, belonging to more than 30 different language families and written in 28 different scripts. Finally, we introduce a new test set of aligned sentences in 122 languages based on the Tatoeba corpus, and show that our sentence embeddings obtain strong results in multilingual similarity search even for low-resource languages.

CROSS-LINGUAL BITEXT MINING CROSS-LINGUAL DOCUMENT CLASSIFICATION CROSS-LINGUAL NATURAL LANGUAGE INFERENCE CROSS-LINGUAL TRANSFER DOCUMENT CLASSIFICATION JOINT MULTILINGUAL SENTENCE REPRESENTATIONS PARALLEL CORPUS MINING
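The multilingual similarity search described in the abstract reduces to nearest-neighbor retrieval by cosine similarity over sentence embeddings. A minimal sketch, assuming embeddings have already been produced by an encoder such as LASER (which outputs 1024-dimensional sentence vectors); the random vectors below are stand-ins for real embeddings:

```python
import numpy as np

def cosine_search(query_vec, corpus_vecs, top_k=3):
    """Return indices of the top_k corpus vectors most similar to the query."""
    # L2-normalize so a dot product equals cosine similarity
    q = query_vec / np.linalg.norm(query_vec)
    c = corpus_vecs / np.linalg.norm(corpus_vecs, axis=1, keepdims=True)
    sims = c @ q
    return np.argsort(-sims)[:top_k]

# Stand-in embeddings: e.g. 100 "sentences" from an aligned test set
rng = np.random.default_rng(0)
corpus = rng.normal(size=(100, 1024))
# A query close to sentence 42 (simulating a translation of it)
query = corpus[42] + 0.01 * rng.normal(size=1024)

print(cosine_search(query, corpus))  # sentence 42 ranks first
```

Because the embedding space is shared across languages, the same retrieval loop works when the query and corpus are in different languages, which is what the Tatoeba evaluation measures.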

Adversarial Propagation and Zero-Shot Cross-Lingual Transfer of Word Vector Specialization

EMNLP 2018 cambridgeltl/adversarial-postspec

While post-processing specialization methods are applicable to arbitrary distributional vectors, they are limited to updating only the vectors of words occurring in external lexicons (i.e., seen words), leaving the vectors of all other words unchanged. Our adversarial post-specialization method propagates the external lexical knowledge to the full distributional space.

CROSS-LINGUAL TRANSFER LEXICAL SIMPLIFICATION

Cross-lingual Argumentation Mining: Machine Translation (and a bit of Projection) is All You Need!

COLING 2018 UKPLab/coling2018-xling_argument_mining

Argumentation mining (AM) requires the identification of complex discourse structures and has lately been applied with success monolingually. In this work, we show that the existing resources are, however, not adequate for assessing cross-lingual AM, due to their heterogeneity or lack of complexity.

CROSS-LINGUAL TRANSFER MACHINE TRANSLATION WORD EMBEDDINGS

Predicting Concreteness and Imageability of Words Within and Across Languages via Word Embeddings

9 Jul 2018 clarinsi/megahr-crossling

In this paper we investigate the predictability of these two concepts via supervised learning, using word embeddings as explanatory variables. We show that the notions of concreteness and imageability are highly predictable both within and across languages, with a moderate loss of up to 20% in correlation when predicting across languages.

CROSS-LINGUAL TRANSFER WORD EMBEDDINGS
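The setup in the abstract above — supervised learning with word embeddings as explanatory variables and correlation as the evaluation metric — can be sketched with a simple regression. This is only an illustration of the general recipe, not the paper's exact model; the embeddings and ratings here are synthetic stand-ins:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Stand-in data: in practice X would be pretrained word embeddings and
# y human concreteness (or imageability) ratings for those words.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))               # 500 "words", 50-dim embeddings
true_w = rng.normal(size=50)
y = X @ true_w + 0.1 * rng.normal(size=500)  # ratings roughly linear in X

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)

# Evaluate as a correlation between predicted and gold ratings
r = np.corrcoef(model.predict(X_te), y_te)[0, 1]
print(round(r, 3))
```

For the cross-lingual case, the same regressor trained on one language's ratings would be applied to another language's embeddings after mapping both into a shared embedding space; the abstract reports that this costs up to 20% in correlation.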