1 code implementation • WMT (EMNLP) 2020 • M. Amin Farajian, António V. Lopes, André F. T. Martins, Sameen Maruf, Gholamreza Haffari
We report the results of the first edition of the WMT shared task on chat translation.
no code implementations • 4 Sep 2023 • Telmo Pessoa Pires, António V. Lopes, Yannick Assogba, Hendra Setiawan
The Transformer architecture has two main non-embedding components: Attention and the Feed Forward Network (FFN).
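Below is a minimal PyTorch sketch of a single Transformer block showing the two non-embedding components the abstract names; the dimensions (d_model, n_heads, d_ff) are standard illustrative defaults, not values from the paper.

```python
import torch.nn as nn

class TransformerBlock(nn.Module):
    """Sketch of one block: self-attention followed by a feed-forward network (FFN)."""
    def __init__(self, d_model=512, n_heads=8, d_ff=2048):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.ReLU(),
            nn.Linear(d_ff, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # Attention sub-layer with residual connection and layer norm
        attn_out, _ = self.attn(x, x, x)
        x = self.norm1(x + attn_out)
        # Position-wise FFN sub-layer with residual connection and layer norm
        x = self.norm2(x + self.ffn(x))
        return x
```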
no code implementations • WS 2019 • Fabio Kepler, Jonay Trénous, Marcos Treviso, Miguel Vera, António Góis, M. Amin Farajian, António V. Lopes, André F. T. Martins
We present the contribution of the Unbabel team to the WMT 2019 Shared Task on Quality Estimation.
no code implementations • WS 2019 • António V. Lopes, M. Amin Farajian, Gonçalo M. Correia, Jonay Trenous, André F. T. Martins
Analogously to dual-encoder architectures, we develop a BERT-based encoder-decoder (BED) model in which a single pretrained BERT encoder receives both the source (src) and machine translation (tgt) strings.
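A minimal sketch of how such a joint encoding of src and tgt could look with the HuggingFace transformers API; the multilingual checkpoint name and the example sentence pair are assumptions for illustration, not the paper's exact setup.

```python
import torch
from transformers import BertTokenizer, BertModel

# Hypothetical checkpoint; the paper's encoder is a pretrained (multilingual) BERT.
tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = BertModel.from_pretrained("bert-base-multilingual-cased")

src = "The cat sat on the mat."          # source sentence (assumed example)
tgt = "Die Katze sass auf der Matte."    # machine translation (assumed example)

# A single encoder receives both strings as a sentence pair:
# [CLS] src tokens [SEP] tgt tokens [SEP], distinguished by segment ids.
inputs = tokenizer(src, tgt, return_tensors="pt")
with torch.no_grad():
    hidden_states = encoder(**inputs).last_hidden_state  # (1, seq_len, hidden_size)

# A decoder (not shown here) would attend over these joint representations
# to generate the corrected translation.
print(hidden_states.shape)
```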