no code implementations • 21 May 2022 • Robin M. Schmidt, Telmo Pires, Stephan Peitz, Jonas Lööf
Non-autoregressive approaches aim to improve the inference speed of translation models by requiring only a single forward pass to generate the output sequence, instead of producing each predicted token iteratively.
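The speed difference can be sketched with a toy decoder: autoregressive decoding calls the model once per generated token, while non-autoregressive decoding predicts every position from a single pass. The `toy_forward` scoring function below is a hypothetical stand-in for a real model, not anything from the paper; it exists only to count forward passes.

```python
# Illustrative sketch (not the paper's model): autoregressive decoding
# needs one forward pass per output token; non-autoregressive decoding
# emits all tokens from a single pass.

from typing import List, Tuple

VOCAB = ["<eos>", "hallo", "welt"]


def toy_forward(prefix: List[str]) -> List[float]:
    """Hypothetical scoring function standing in for a real decoder pass."""
    # Deterministic toy logits: emit "hallo", then "welt", then stop.
    targets = ["hallo", "welt", "<eos>"]
    next_tok = targets[min(len(prefix), 2)]
    return [1.0 if tok == next_tok else 0.0 for tok in VOCAB]


def autoregressive_decode(max_len: int = 10) -> Tuple[List[str], int]:
    """One forward pass per generated token: O(n) passes."""
    out, passes = [], 0
    while len(out) < max_len:
        logits = toy_forward(out)
        passes += 1
        tok = VOCAB[max(range(len(VOCAB)), key=logits.__getitem__)]
        if tok == "<eos>":
            break
        out.append(tok)
    return out, passes


def non_autoregressive_decode(length: int = 2) -> Tuple[List[str], int]:
    """A single forward pass predicts every position at once: O(1) passes."""
    # A real NAR model yields logits for all positions in one pass;
    # here we fake that by scoring each position independently.
    all_logits = [toy_forward(["hallo", "welt"][:i]) for i in range(length)]
    passes = 1  # the whole sequence comes from one (simulated) pass
    out = [VOCAB[max(range(len(VOCAB)), key=l.__getitem__)] for l in all_logits]
    return out, passes
```

For the same two-token output, the autoregressive loop performs three passes (two tokens plus the end-of-sequence check) while the non-autoregressive path performs one, which is the source of the inference speedup.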
2 code implementations • Findings (ACL) 2022 • Orion Weller, Matthias Sperber, Telmo Pires, Hendra Setiawan, Christian Gollan, Dominic Telaar, Matthias Paulik
Code switching (CS) refers to the phenomenon of interchangeably using words and phrases from different languages.
3 code implementations • ACL 2019 • Telmo Pires, Eva Schlinger, Dan Garrette
In this paper, we show that Multilingual BERT (M-BERT), released by Devlin et al. (2018) as a single language model pre-trained on monolingual corpora in 104 languages, is surprisingly good at zero-shot cross-lingual model transfer: task-specific annotations in one language are used to fine-tune the model, which is then evaluated in another language.
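The transfer setup can be sketched in miniature: fit a classifier on labelled examples in one language, then evaluate it on another language with no labels, relying on a shared multilingual representation space. The hand-made two-dimensional "embeddings" and the nearest-centroid classifier below are hypothetical stand-ins for M-BERT's representations and fine-tuning, chosen only to keep the sketch self-contained.

```python
# Conceptual sketch of zero-shot cross-lingual transfer (not M-BERT itself):
# a classifier is fit on labelled English words, then evaluated on German
# words it never saw labels for, because translation equivalents sit close
# together in a shared embedding space.

from typing import Dict, List, Tuple

# Toy shared embedding space: translation equivalents get nearby vectors.
EMB: Dict[str, Tuple[float, float]] = {
    "good": (1.0, 0.1), "gut": (0.9, 0.2),          # English / German "positive"
    "bad": (-1.0, 0.1), "schlecht": (-0.9, 0.2),    # English / German "negative"
}


def embed(sentence: str) -> Tuple[float, float]:
    """Average word vectors, like a crude sentence encoder."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    n = max(len(vecs), 1)
    return tuple(sum(v[i] for v in vecs) / n for i in range(2))


def fit_centroid_classifier(examples: List[Tuple[str, str]]):
    """'Fine-tune' on labelled data: one centroid per label."""
    sums, counts = {}, {}
    for text, label in examples:
        v = embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}


def predict(centroids, text: str) -> str:
    """Assign the label whose centroid is nearest in the shared space."""
    v = embed(text)
    return min(
        centroids,
        key=lambda lab: (v[0] - centroids[lab][0]) ** 2
        + (v[1] - centroids[lab][1]) ** 2,
    )


# Fit with English labels only, then evaluate zero-shot on German.
clf = fit_centroid_classifier([("good", "pos"), ("bad", "neg")])
```

The German words were never paired with a label during fitting; the classifier generalises to them only because the embedding space aligns the two languages, which is the property the paper probes in M-BERT.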