1 code implementation • 23 Oct 2023 • Negar Foroutan, Mohammadreza Banaei, Karl Aberer, Antoine Bosselut
We evaluate the cross-lingual reasoning abilities of MultiLMs in two schemes: (1) where the context and the question remain in the same language in the newly tested languages (i.e., the reasoning is still monolingual, but the model must transfer the learned reasoning ability across languages), and (2) where the context and the question are in different languages (which we term code-switched reasoning).
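A minimal sketch of how these two schemes could be enumerated, assuming a hypothetical `accuracy` scorer and an illustrative language set (neither is taken from the paper's released code):

```python
from itertools import product

LANGUAGES = ["en", "fr", "de", "sw", "zh"]  # illustrative language codes

def accuracy(model, context_lang, question_lang):
    # Placeholder scorer: in practice, evaluate the model on QA pairs whose
    # context is in `context_lang` and whose question is in `question_lang`.
    return 0.0

def run_schemes(model):
    # Scheme 1: monolingual reasoning transferred across languages
    # (context and question share one language).
    monolingual = {(l, l): accuracy(model, l, l) for l in LANGUAGES}
    # Scheme 2: code-switched reasoning
    # (context language differs from question language).
    code_switched = {(c, q): accuracy(model, c, q)
                     for c, q in product(LANGUAGES, repeat=2) if c != q}
    return monolingual, code_switched
```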
1 code implementation • 8 Feb 2023 • Mohammadreza Banaei, Klaudia Bałazy, Artur Kasymov, Rémi Lebret, Jacek Tabor, Karl Aberer
Recent Transformer language models achieve outstanding results on many natural language processing (NLP) tasks.
1 code implementation • 25 May 2022 • Negar Foroutan, Mohammadreza Banaei, Remi Lebret, Antoine Bosselut, Karl Aberer
Multilingual pre-trained language models transfer remarkably well to cross-lingual downstream tasks.
1 code implementation • 30 Mar 2022 • Tim Poštuvan, Jiaxuan You, Mohammadreza Banaei, Rémi Lebret, Jure Leskovec
To mitigate these limitations, we propose Adaptive Grid Search (AdaGrid), which dynamically adjusts the edge message ratio during training.
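A rough sketch of this idea, where the edge message ratio (the split between message-passing edges and supervision edges) is re-probed periodically during training rather than fixed by offline grid search. The `train_step` and `val_score` interfaces and the probing schedule are assumptions for exposition, not the authors' implementation:

```python
import random

def split_edges(edges, message_ratio):
    # Split edges into message-passing edges and supervision edges.
    edges = list(edges)
    random.shuffle(edges)
    k = int(len(edges) * message_ratio)
    return edges[:k], edges[k:]

def train_with_adaptive_ratio(model, edges, val_score, epochs=100,
                              candidates=(0.2, 0.4, 0.6, 0.8),
                              probe_every=10):
    ratio = 0.5
    for epoch in range(epochs):
        if epoch % probe_every == 0:
            # Periodically probe candidate ratios on validation data and
            # keep the best-performing one.
            ratio = max(candidates, key=lambda r: val_score(model, r))
        msg_edges, sup_edges = split_edges(edges, ratio)
        model.train_step(msg_edges, sup_edges)  # assumed training API
    return model
```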
1 code implementation • ACL (RepL4NLP) 2021 • Klaudia Bałazy, Mohammadreza Banaei, Rémi Lebret, Jacek Tabor, Karl Aberer
The adoption of Transformer-based models in natural language processing (NLP) has led to great success, achieved at the cost of a massive number of parameters.
no code implementations • 5 Jun 2020 • Mohammadreza Banaei, Rémi Lebret, Karl Aberer
This paper presents our approach to SwissText & KONVENS 2020 shared task 2: a multi-stage neural model for Swiss German (GSW) identification on Twitter.
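A multi-stage pipeline of this kind might look as follows; the two stages, scorers, and thresholds are assumptions for exposition, not the system described in the paper:

```python
def identify_gsw(tweets, cheap_score, neural_classifier,
                 filter_threshold=0.0, classify_threshold=0.5):
    # Stage 1: a cheap filter (e.g., a character-level score) discards
    # tweets that are clearly not Swiss German.
    candidates = [t for t in tweets if cheap_score(t) > filter_threshold]
    # Stage 2: a neural classifier is applied only to surviving candidates.
    return [t for t in candidates
            if neural_classifier(t) > classify_threshold]
```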