Cross-Lingual Transfer

289 papers with code • 1 benchmark • 16 datasets

Cross-lingual transfer refers to transfer learning that uses data and models from a language with ample resources (e.g., English) to solve tasks in another, typically lower-resource, language.
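The core idea can be sketched with a toy example: if words from two languages live in a shared multilingual embedding space, a classifier trained only on English examples can label examples from the other language zero-shot. The sketch below uses entirely synthetic 2-D "embeddings" and a nearest-centroid classifier; the words, vectors, and task are illustrative assumptions, not real model outputs.

```python
# Toy sketch of zero-shot cross-lingual transfer. All data is synthetic:
# we pretend a multilingual encoder has already mapped words from both
# languages into one shared 2-D embedding space.

# "English" training data for a binary sentiment task (hypothetical vectors).
english_train = {
    "good":  ((0.9, 0.8), "pos"),
    "great": ((0.8, 0.9), "pos"),
    "bad":   ((-0.9, -0.8), "neg"),
    "awful": ((-0.8, -0.9), "neg"),
}

# Target-language words land near their English translations in the shared space.
target_test = {
    "bueno": ((0.85, 0.75), "pos"),    # "good"
    "malo":  ((-0.85, -0.75), "neg"),  # "bad"
}

def centroid(vectors):
    """Mean vector of a list of 2-D points."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(2))

def train(data):
    """Compute one centroid per label from labelled embeddings."""
    by_label = {}
    for vec, label in data.values():
        by_label.setdefault(label, []).append(vec)
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def predict(model, vec):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    return min(model, key=lambda lbl: sum((vec[i] - model[lbl][i]) ** 2
                                          for i in range(2)))

model = train(english_train)  # trained on English only
preds = {word: predict(model, vec) for word, (vec, _) in target_test.items()}
print(preds)  # zero-shot predictions on the target language
```

Because the classifier never sees target-language data, any accuracy it achieves comes entirely from the shared representation, which is the mechanism the papers below study and try to improve.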


Latest papers with no code

Neuron Specialization: Leveraging intrinsic task modularity for multilingual machine translation

no code yet • 17 Apr 2024

Training a unified multilingual model promotes knowledge transfer but inevitably introduces negative interference.

Adapting Mental Health Prediction Tasks for Cross-lingual Learning via Meta-Training and In-context Learning with Large Language Model

no code yet • 13 Apr 2024

The results show that our meta-trained model performs significantly better than standard fine-tuning, outperforming the fine-tuning baselines in macro F1 score by 18% over XLM-R and 0.8% over mBERT.

Event Extraction in Basque: Typologically motivated Cross-Lingual Transfer-Learning Analysis

no code yet • 9 Apr 2024

To perform the experiments we introduce EusIE, an event extraction dataset for Basque, which follows the Multilingual Event Extraction dataset (MEE).

MaiNLP at SemEval-2024 Task 1: Analyzing Source Language Selection in Cross-Lingual Textual Relatedness

no code yet • 3 Apr 2024

This paper presents our system developed for the SemEval-2024 Task 1: Semantic Textual Relatedness (STR), on Track C: Cross-lingual.

Bailong: Bilingual Transfer Learning based on QLoRA and Zip-tie Embedding

no code yet • 1 Apr 2024

However, the majority of existing open-source LLMs are pre-trained primarily on English data, with only a small proportion from other languages.

A Systematic Analysis of Subwords and Cross-Lingual Transfer in Multilingual Translation

no code yet • 29 Mar 2024

Multilingual modelling can improve machine translation for low-resource languages, partly through shared subword representations.

Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning?

no code yet • 25 Mar 2024

We furthermore provide evidence through similarity measures and investigation of parameters that this lack of positive influence is due to output separability -- which we argue is of use for machine translation but detrimental elsewhere.

Towards Knowledge-Grounded Natural Language Understanding and Generation

no code yet • 22 Mar 2024

This thesis investigates how natural language understanding and generation with transformer models can benefit from grounding the models with knowledge representations and addresses the following key research questions: (i) Can knowledge of entities extend its benefits beyond entity-centric tasks, such as entity linking?

Cross-Lingual Transfer for Natural Language Inference via Multilingual Prompt Translator

no code yet • 19 Mar 2024

To efficiently transfer soft prompts, we propose a novel framework, Multilingual Prompt Translator (MPT), which introduces a multilingual prompt translator that processes the crucial knowledge embedded in a prompt by changing its language knowledge while retaining its task knowledge.

Few-Shot Cross-Lingual Transfer for Prompting Large Language Models in Low-Resource Languages

no code yet • 9 Mar 2024

We find that the results are task- and language-dependent, but that the prompting method performs best on average across all tasks and languages.