Search Results for author: Telmo Pires

Found 3 papers, 2 papers with code

Non-Autoregressive Neural Machine Translation: A Call for Clarity

no code implementations • 21 May 2022 • Robin M. Schmidt, Telmo Pires, Stephan Peitz, Jonas Lööf

Non-autoregressive approaches aim to improve the inference speed of translation models by requiring only a single forward pass to generate the output sequence, instead of iteratively producing each predicted token.
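The speed contrast described above can be sketched in plain Python. This is a toy illustration, not the paper's model: `toy_step` and `toy_parallel` are hypothetical stand-ins for a real translation model's forward pass.

```python
# Toy contrast of autoregressive vs. non-autoregressive decoding.
# Assumption: each function call below stands in for one model forward pass.

def toy_step(prefix):
    """One forward pass predicting the next token from the prefix so far."""
    return f"tok{len(prefix)}"

def autoregressive_decode(length):
    """Iterative decoding: one forward pass per output token."""
    output = []
    for _ in range(length):
        output.append(toy_step(output))  # each pass conditions on prior tokens
    return output

def toy_parallel(length):
    """A single forward pass that predicts all positions at once."""
    return [f"tok{i}" for i in range(length)]

def non_autoregressive_decode(length):
    """Non-autoregressive decoding: the whole sequence in one pass."""
    return toy_parallel(length)

print(autoregressive_decode(3))      # 3 "model calls" -> ['tok0', 'tok1', 'tok2']
print(non_autoregressive_decode(3))  # 1 "model call"  -> ['tok0', 'tok1', 'tok2']
```

Both produce the same output here, but the autoregressive loop needs `length` sequential passes while the non-autoregressive path needs one, which is the source of the inference speedup.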

Tasks: Machine Translation, Translation

How multilingual is Multilingual BERT?

3 code implementations • ACL 2019 • Telmo Pires, Eva Schlinger, Dan Garrette

In this paper, we show that Multilingual BERT (M-BERT), released by Devlin et al. (2018) as a single language model pre-trained from monolingual corpora in 104 languages, is surprisingly good at zero-shot cross-lingual model transfer: task-specific annotations in one language are used to fine-tune the model, which is then evaluated in another language.
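The zero-shot transfer recipe in the abstract can be sketched with a toy model. This is illustrative only and does not use M-BERT: it assumes a hypothetical shared multilingual embedding in which translation pairs map to the same vector, so a classifier fit on English annotations transfers directly to Spanish words.

```python
# Toy sketch of zero-shot cross-lingual transfer (hypothetical embedding,
# nearest-centroid classifier; the paper fine-tunes M-BERT instead).

SHARED_EMBEDDING = {
    # English and Spanish translation pairs share one vector.
    "dog": (1.0, 0.0), "perro": (1.0, 0.0),
    "cat": (0.9, 0.1), "gato": (0.9, 0.1),
    "run": (0.0, 1.0), "correr": (0.0, 1.0),
    "eat": (0.1, 0.9), "comer": (0.1, 0.9),
}

def fit_centroids(examples):
    """'Fine-tune': average the embeddings of each label's training words."""
    sums, counts = {}, {}
    for word, label in examples:
        x, y = SHARED_EMBEDDING[word]
        sx, sy = sums.get(label, (0.0, 0.0))
        sums[label] = (sx + x, sy + y)
        counts[label] = counts.get(label, 0) + 1
    return {lab: (sx / counts[lab], sy / counts[lab])
            for lab, (sx, sy) in sums.items()}

def predict(centroids, word):
    """Classify a word by its nearest centroid in the shared space."""
    x, y = SHARED_EMBEDDING[word]
    return min(centroids,
               key=lambda lab: (x - centroids[lab][0]) ** 2
                             + (y - centroids[lab][1]) ** 2)

# Fine-tune with English annotations only...
centroids = fit_centroids([("dog", "NOUN"), ("cat", "NOUN"),
                           ("run", "VERB"), ("eat", "VERB")])
# ...then evaluate zero-shot on Spanish words never seen with labels.
print([predict(centroids, w) for w in ["perro", "correr"]])  # ['NOUN', 'VERB']
```

The transfer works here because the representation space is shared across languages; the paper's finding is that M-BERT's pre-trained space behaves this way to a surprising degree without any parallel supervision.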

Tasks: Language Modelling, Translation
