no code implementations • CL (ACL) 2022 • Rochelle Choenni, Ekaterina Shutova
The results provide insight into the models' information-sharing mechanisms and suggest that these linguistic properties are encoded jointly across typologically similar languages.
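The probing recipe behind findings like this can be sketched as a linear classifier trained on multilingual sentence embeddings from some languages and tested on a held-out language; the embeddings and labels below are random toy stand-ins, not the paper's actual encoder or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def probe_transfer_accuracy(train_embs, train_labels, test_embs, test_labels):
    """Train a linear probe for a typological property on sentence embeddings
    from one group of languages and evaluate it on a held-out language; high
    transfer accuracy suggests the property is encoded in a shared way."""
    probe = LogisticRegression(max_iter=1000).fit(train_embs, train_labels)
    return probe.score(test_embs, test_labels)

# Toy stand-ins for real multilingual sentence embeddings and per-language labels.
rng = np.random.default_rng(0)
train_embs, test_embs = rng.normal(size=(200, 512)), rng.normal(size=(50, 512))
train_labels, test_labels = rng.integers(0, 2, 200), rng.integers(0, 2, 50)
print(probe_transfer_accuracy(train_embs, train_labels, test_embs, test_labels))
```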
no code implementations • 14 Nov 2023 • Rochelle Choenni, Ekaterina Shutova, Dan Garrette
Recent work has proposed explicitly inducing language-wise modularity in multilingual LMs via sparse fine-tuning (SFT) on per-language subnetworks as a means of better guiding cross-lingual sharing.
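A minimal sketch of the general sparse fine-tuning idea, assuming PyTorch: a per-language binary mask restricts which parameters a language is allowed to update. The mask-selection heuristic here (keep the weights that moved most during a short dense run) is illustrative, not the exact procedure from the paper.

```python
import torch

def build_topk_mask(dense_model, finetuned_model, sparsity=0.95):
    """Keep the (1 - sparsity) fraction of weights that changed most during a
    short dense fine-tuning run on one language (an illustrative heuristic)."""
    masks = {}
    for (name, p0), (_, p1) in zip(dense_model.named_parameters(),
                                   finetuned_model.named_parameters()):
        delta = (p1.detach() - p0.detach()).abs()
        k = max(1, int((1 - sparsity) * delta.numel()))
        thresh = torch.topk(delta.flatten(), k).values.min()
        masks[name] = (delta >= thresh).float()
    return masks

def masked_sft_step(model, loss_fn, batch, masks, optimizer):
    """One sparse fine-tuning step: only parameters inside the language's
    subnetwork (mask == 1) receive gradient updates."""
    optimizer.zero_grad()
    loss = loss_fn(model, batch)
    loss.backward()
    for name, param in model.named_parameters():
        if param.grad is not None and name in masks:
            param.grad.mul_(masks[name])  # zero out gradients outside the subnetwork
    optimizer.step()
    return loss.item()
```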
no code implementations • 31 Oct 2023 • Claire E. Stevenson, Mathilde ter Veen, Rochelle Choenni, Han L. J. van der Maas, Ekaterina Shutova
We conclude that the LLMs we tested do indeed tend to solve verbal analogies (of the form A : B :: C : ?) by association with the C term, much as children do.
1 code implementation • 28 Oct 2023 • Giulio Starace, Konstantinos Papakostas, Rochelle Choenni, Apostolos Panagiotopoulos, Matteo Rosati, Alina Leidinger, Ekaterina Shutova
Large Language Models (LLMs) exhibit impressive performance on a range of NLP tasks, due to the general-purpose linguistic knowledge acquired during pretraining.
no code implementations • 22 May 2023 • Rochelle Choenni, Dan Garrette, Ekaterina Shutova
We further study how different fine-tuning languages influence model performance on a given test language and find that they can both reinforce and complement the knowledge acquired from the test language's own data.
no code implementations • 31 Oct 2022 • Rochelle Choenni, Dan Garrette, Ekaterina Shutova
Large multilingual language models typically share their parameters across all languages, which enables cross-lingual task transfer, but learning can also be hindered when training updates from different languages are in conflict.
no code implementations • EMNLP 2021 • Rochelle Choenni, Ekaterina Shutova, Robert van Rooij
In this paper, we investigate what types of stereotypical information are captured by pretrained language models.
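A cloze-style probe in the spirit of this line of work can be sketched with a masked language model; the templates and model choice below are illustrative assumptions, not the paper's exact setup.

```python
from transformers import pipeline

# Query a masked LM for the attributes it associates with a social group.
fill = pipeline("fill-mask", model="bert-base-uncased")

templates = [
    "Why are women so [MASK].",   # toy example templates
    "Why are men so [MASK].",
]
for template in templates:
    top = fill(template, top_k=5)
    print(template, "->", [(p["token_str"], round(p["score"], 3)) for p in top])
```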
no code implementations • 24 Oct 2020 • Rochelle Choenni, Ekaterina Shutova
Multilingual sentence encoders are widely used to transfer NLP models across languages.
no code implementations • 27 Sep 2020 • Rochelle Choenni, Ekaterina Shutova
Multilingual sentence encoders have seen much success in cross-lingual model transfer for downstream NLP tasks.
1 code implementation • WS 2019 • Samira Abnar, Lisa Beinborn, Rochelle Choenni, Willem Zuidema
In this paper, we define and apply representational stability analysis (ReStA), an intuitive way of analyzing neural language models.
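At its core, ReStA compares a model's representations of the same stimuli under different conditions through a second-order (similarity-of-similarities) correlation; a minimal sketch with NumPy/SciPy, assuming two arrays of sentence representations:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

def rsm(reps):
    """Representational similarity matrix: pairwise cosine similarities."""
    return 1 - squareform(pdist(reps, metric="cosine"))

def resta_score(reps_a, reps_b):
    """Stability of representations of the same stimuli under two conditions
    (e.g., the same sentences with different preceding context): Spearman
    correlation between the upper triangles of the two RSMs."""
    iu = np.triu_indices(len(reps_a), k=1)
    rho, _ = spearmanr(rsm(reps_a)[iu], rsm(reps_b)[iu])
    return rho
```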
1 code implementation • CL (ACL) 2020 • Lisa Beinborn, Rochelle Choenni
We propose an adapted version of representational similarity analysis, applied to a selected set of concepts in computational multilingual representations.
1 code implementation • 4 Apr 2019 • Lisa Beinborn, Samira Abnar, Rochelle Choenni
Language-brain encoding experiments evaluate the ability of language models to predict brain responses elicited by language stimuli.
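The basic encoding-model setup such experiments evaluate can be sketched as cross-validated ridge regression from stimulus representations to voxel responses; the random arrays below merely stand in for real embeddings and fMRI recordings, and the evaluation here is a simplified version of the robust procedures the paper studies.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def encoding_score(stimulus_reps, brain_responses, alpha=1.0, n_folds=5):
    """Fit a ridge regression from language-model representations of the stimuli
    to voxel responses and report the mean per-voxel Pearson correlation on
    held-out folds."""
    scores = []
    for train, test in KFold(n_splits=n_folds).split(stimulus_reps):
        model = Ridge(alpha=alpha).fit(stimulus_reps[train], brain_responses[train])
        pred = model.predict(stimulus_reps[test])
        per_voxel = [np.corrcoef(pred[:, v], brain_responses[test][:, v])[0, 1]
                     for v in range(brain_responses.shape[1])]
        scores.append(np.nanmean(per_voxel))
    return float(np.mean(scores))

# Toy usage: random data standing in for sentence embeddings and 50 voxels.
X = np.random.randn(100, 768)
Y = np.random.randn(100, 50)
print(encoding_score(X, Y))
```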