no code implementations • EACL (LTEDI) 2021 • Christine Basta, Marta R. Costa-Jussa
This study sheds light on the effects of COVID-19 in the particular field of Computational Linguistics and Natural Language Processing within Artificial Intelligence.
no code implementations • ICON 2021 • Carlos Escolano, Graciela Ojeda, Christine Basta, Marta R. Costa-Jussa
Machine Translation is strongly affected by social biases present in its data sets, which it reflects and amplifies as stereotypes.
no code implementations • WMT (EMNLP) 2021 • Carlos Escolano, Ioannis Tsiamas, Christine Basta, Javier Ferrando, Marta R. Costa-Jussa, José A. R. Fonollosa
We fine-tune mBART50 using the filtered data, and additionally, we train a Transformer model on the same data from scratch.
no code implementations • 3 May 2021 • Christine Basta, Marta R. Costa-jussà
Gender, racial, and social biases have recently been identified as clear examples of unfairness in applications of Natural Language Processing.
no code implementations • 24 Dec 2020 • Marta R. Costa-jussà, Carlos Escolano, Christine Basta, Javier Ferrando, Roser Batlle, Ksenia Kharitonova
Multilingual Neural Machine Translation architectures mainly differ in the number of modules and parameters shared among languages.
no code implementations • LREC 2022 • Marta R. Costa-jussà, Christine Basta, Gerard I. Gállego
WinoST is the speech version of WinoMT, an MT challenge set; both follow the same evaluation protocol to measure gender accuracy.
no code implementations • WS 2020 • Christine Basta, Marta R. Costa-jussà, José A. R. Fonollosa
Gender bias negatively impacts many natural language processing applications, including machine translation (MT).
no code implementations • WS 2019 • Noe Casas, José A. R. Fonollosa, Carlos Escolano, Christine Basta, Marta R. Costa-jussà
In this article, we describe the TALP-UPC research group participation in the WMT19 news translation shared task for Kazakh-English.
no code implementations • WS 2019 • Christine Basta, Marta R. Costa-jussà, Noe Casas
Gender bias heavily impacts natural language processing applications.