no code implementations • WS 2019 • António V. Lopes, M. Amin Farajian, Gonçalo M. Correia, Jonay Trenous, André F. T. Martins
Analogously to dual-encoder architectures, we develop a BERT-based encoder-decoder (BED) model in which a single pretrained BERT encoder receives both the source (src) and machine translation (tgt) strings.
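The joint-encoding idea can be sketched as follows: instead of two separate encoders, one BERT encoder sees the source and the MT output as a single segmented sequence. This is a minimal, hypothetical illustration of the input format (the function name and token conventions are assumptions, not the paper's code):

```python
# Hypothetical sketch: pack src and tgt into one BERT-style input
# sequence, using [SEP] to separate them and segment ids to mark
# which side each token belongs to (0 = src, 1 = tgt).

def build_bed_input(src_tokens, tgt_tokens):
    """Return (tokens, segment_ids) for a single-encoder BED input."""
    tokens = ["[CLS]"] + src_tokens + ["[SEP]"] + tgt_tokens + ["[SEP]"]
    segment_ids = [0] * (len(src_tokens) + 2) + [1] * (len(tgt_tokens) + 1)
    return tokens, segment_ids

tokens, segments = build_bed_input(["das", "ist", "gut"], ["this", "is", "good"])
print(tokens)
# ['[CLS]', 'das', 'ist', 'gut', '[SEP]', 'this', 'is', 'good', '[SEP]']
```

In a real system the tokens would come from BERT's WordPiece tokenizer and the sequence would be fed to the pretrained encoder; this sketch only shows how both strings share one input.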
no code implementations • EMNLP 2021 • Eva Hasler, Tobias Domhan, Jonay Trenous, Ke Tran, Bill Byrne, Felix Hieber
Building neural machine translation systems to perform well on a specific target domain is a well-studied problem.