no code implementations • ACL 2021 • Raúl Vázquez, Hande Celikkanat, Mathias Creutz, Jörg Tiedemann
Various studies show that pretrained language models such as BERT cannot straightforwardly replace encoders in neural machine translation despite their enormous success in other tasks.
no code implementations • WS 2020 • Raúl Vázquez, Mikko Aulamo, Umut Sulubacak, Jörg Tiedemann
This paper describes the University of Helsinki Language Technology group's participation in the IWSLT 2020 offline speech translation task, addressing the translation of English audio into German text.
no code implementations • CL 2020 • Raúl Vázquez, Alessandro Raganato, Mathias Creutz, Jörg Tiedemann
In particular, we show that larger intermediate layers not only improve translation quality, especially for long sentences, but also push the accuracy of trainable classification tasks.
no code implementations • WS 2019 • Alessandro Raganato, Raúl Vázquez, Mathias Creutz, Jörg Tiedemann
In this paper, we explore a multilingual translation model with a cross-lingually shared layer that can be used as a fixed-size sentence representation in different downstream tasks.
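As a rough illustration of the idea of a shared layer that yields a fixed-size sentence representation, the following minimal PyTorch sketch pools variable-length encoder states with a fixed number of attention heads; the class name AttentionBridge, the dimensions, and all hyperparameters are illustrative assumptions, not the authors' released implementation.

# Minimal sketch (assumed setup, not the paper's code): pool encoder states
# into a representation whose size does not depend on sentence length.
import torch
import torch.nn as nn

class AttentionBridge(nn.Module):
    def __init__(self, hidden_dim: int = 512, num_heads: int = 10):
        super().__init__()
        # Each head extracts one row of the fixed-size representation.
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.head_queries = nn.Linear(hidden_dim, num_heads)

    def forward(self, encoder_states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # encoder_states: (batch, seq_len, hidden_dim); mask: (batch, seq_len), 1 = real token
        scores = self.head_queries(torch.tanh(self.proj(encoder_states)))  # (batch, seq_len, num_heads)
        scores = scores.masked_fill(mask.unsqueeze(-1) == 0, float("-inf"))
        weights = torch.softmax(scores, dim=1)  # attention over tokens, per head
        # Weighted sums give a (batch, num_heads, hidden_dim) matrix, usable as a
        # fixed-size sentence representation for downstream classifiers.
        return torch.einsum("bsh,bsd->bhd", weights, encoder_states)

# Usage: two sentences of lengths 7 and 5 map to the same-shaped representation.
states = torch.randn(2, 7, 512)
mask = torch.tensor([[1] * 7, [1] * 5 + [0] * 2])
print(AttentionBridge()(states, mask).shape)  # torch.Size([2, 10, 512])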
no code implementations • WS 2019 • Raúl Vázquez, Umut Sulubacak, Jörg Tiedemann
This paper describes the University of Helsinki Language Technology group's participation in the WMT 2019 parallel corpus filtering task.
no code implementations • WS 2019 • Yves Scherrer, Raúl Vázquez, Sami Virpioja
This paper describes the University of Helsinki Language Technology group's participation in the WMT 2019 similar language translation task.