Search Results for author: Raúl Vázquez

Found 6 papers, 0 papers with code

On the differences between BERT and MT encoder spaces and how to address them in translation tasks

no code implementations · ACL 2021 · Raúl Vázquez, Hande Celikkanat, Mathias Creutz, Jörg Tiedemann

Various studies show that pretrained language models such as BERT cannot straightforwardly replace encoders in neural machine translation despite their enormous success in other tasks.

Machine Translation · NMT · +1

The University of Helsinki Submission to the IWSLT2020 Offline Speech Translation Task

no code implementations · WS 2020 · Raúl Vázquez, Mikko Aulamo, Umut Sulubacak, Jörg Tiedemann

This paper describes the University of Helsinki Language Technology group's participation in the IWSLT 2020 offline speech translation task, addressing the translation of English audio into German text.

Transfer Learning · Translation

A Systematic Study of Inner-Attention-Based Sentence Representations in Multilingual Neural Machine Translation

no code implementations · CL 2020 · Raúl Vázquez, Alessandro Raganato, Mathias Creutz, Jörg Tiedemann

In particular, we show that larger intermediate layers not only improve translation quality, especially for long sentences, but also increase the accuracy of trainable classification tasks.

Machine Translation · Sentence · +2

The University of Helsinki Submission to the WMT19 Parallel Corpus Filtering Task

no code implementations · WS 2019 · Raúl Vázquez, Umut Sulubacak, Jörg Tiedemann

This paper describes the University of Helsinki Language Technology group's participation in the WMT 2019 parallel corpus filtering task.

General Classification · Sentence

The University of Helsinki Submissions to the WMT19 Similar Language Translation Task

no code implementations · WS 2019 · Yves Scherrer, Raúl Vázquez, Sami Virpioja

This paper describes the University of Helsinki Language Technology group's participation in the WMT 2019 similar language translation task.

Machine Translation · Segmentation · +1

An Evaluation of Language-Agnostic Inner-Attention-Based Representations in Machine Translation

no code implementations · WS 2019 · Alessandro Raganato, Raúl Vázquez, Mathias Creutz, Jörg Tiedemann

In this paper, we explore a multilingual translation model with a cross-lingually shared layer that can be used as a fixed-size sentence representation in different downstream tasks.

Machine Translation · Sentence · +1