no code implementations • WMT (EMNLP) 2021 • Kenneth Heafield, Qianqian Zhu, Roman Grundkiewicz
The machine translation efficiency task challenges participants to make their systems faster and smaller with minimal impact on translation quality.
no code implementations • EMNLP 2020 • Maximiliana Behnke, Kenneth Heafield
The attention mechanism is the crucial component of the transformer architecture.
no code implementations • COLING 2022 • Elsbeth Turcan, David Wan, Faisal Ladhak, Petra Galuscakova, Sukanta Sen, Svetlana Tchistiakova, Weijia Xu, Marine Carpuat, Kenneth Heafield, Douglas Oard, Kathleen McKeown
Query-focused summaries of foreign-language, retrieved documents can help a user understand whether a document is actually relevant to the query term.
no code implementations • LREC 2022 • Kenneth Heafield, Elaine Farrow, Jelmer Van der Linde, Gema Ramírez-Sánchez, Dion Wiggins
We present the EuroPat corpus of patent-specific parallel data for 6 official European languages paired with English: German, Spanish, French, Croatian, Norwegian, and Polish.
no code implementations • WMT (EMNLP) 2020 • Ulrich Germann, Roman Grundkiewicz, Martin Popel, Radina Dobreva, Nikolay Bogoychev, Kenneth Heafield
We describe the joint submission of the University of Edinburgh and Charles University, Prague, to the Czech/English track in the WMT 2020 Shared Task on News Translation.
no code implementations • WMT (EMNLP) 2021 • Farhad Akhbardeh, Arkady Arkhangorodsky, Magdalena Biesialska, Ondřej Bojar, Rajen Chatterjee, Vishrav Chaudhary, Marta R. Costa-Jussa, Cristina España-Bonet, Angela Fan, Christian Federmann, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Barry Haddow, Leonie Harter, Kenneth Heafield, Christopher Homan, Matthias Huck, Kwabena Amponsah-Kaakyire, Jungo Kasai, Daniel Khashabi, Kevin Knight, Tom Kocmi, Philipp Koehn, Nicholas Lourie, Christof Monz, Makoto Morishita, Masaaki Nagata, Ajay Nagesh, Toshiaki Nakazawa, Matteo Negri, Santanu Pal, Allahsera Auguste Tapo, Marco Turchi, Valentin Vydrin, Marcos Zampieri
This paper presents the results of the news translation task, the multilingual low-resource translation task for Indo-European languages, the triangular translation task, and the automatic post-editing task organised as part of the Conference on Machine Translation (WMT) 2021. In the news task, participants were asked to build machine translation systems for any of 10 language pairs, to be evaluated on test sets consisting mainly of news stories.
no code implementations • NAACL 2022 • Proyag Pal, Kenneth Heafield
This paper describes a method to quantify the amount of information H(t|s) added by the target sentence t that is not present in the source s in a neural machine translation system.
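As a rough illustration (not the paper's implementation), summing a translation model's token-level log-probabilities of t given s yields -log p(t|s), whose corpus average is a cross-entropy estimate of H(t|s). The `model.log_prob` interface below is a hypothetical placeholder.

```python
import math

def target_information_bits(model, source, target):
    """Estimate -log2 p(t|s) by summing token-level log-probabilities
    (in nats) from a translation model and converting to bits.
    `model.log_prob(source, prefix, token)` is a hypothetical API."""
    log_prob_nats = 0.0
    prefix = []
    for token in target:
        log_prob_nats += model.log_prob(source, prefix, token)
        prefix.append(token)
    return -log_prob_nats / math.log(2)
```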
1 code implementation • WMT (EMNLP) 2021 • Pinzhen Chen, Jindřich Helcl, Ulrich Germann, Laurie Burchell, Nikolay Bogoychev, Antonio Valerio Miceli Barone, Jonas Waldendorf, Alexandra Birch, Kenneth Heafield
This paper presents the University of Edinburgh’s constrained submissions of English-German and English-Hausa systems to the WMT 2021 shared task on news translation.
no code implementations • WMT (EMNLP) 2021 • Maximiliana Behnke, Kenneth Heafield
In the WMT 2021 Efficiency Task, our pruned and quantised models are 1.9–2.7x faster at a cost of 0.9–1.7 BLEU compared to the unoptimised baselines.
no code implementations • WMT (EMNLP) 2021 • Maximiliana Behnke, Nikolay Bogoychev, Alham Fikri Aji, Kenneth Heafield, Graeme Nail, Qianqian Zhu, Svetlana Tchistiakova, Jelmer Van der Linde, Pinzhen Chen, Sidharth Kashyap, Roman Grundkiewicz
We participated in all tracks of the WMT 2021 efficient machine translation task: single-core CPU, multi-core CPU, and GPU hardware with throughput and latency conditions.
no code implementations • 6 Jun 2023 • Pinzhen Chen, Zhicheng Guo, Barry Haddow, Kenneth Heafield
In this paper, we propose iterative translation refinement to leverage the power of large language models for more natural translation and post-editing.
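A minimal sketch of an iterative refinement loop in this spirit, with a generic `llm` callable and made-up prompt wording (not the paper's prompts):

```python
def iterative_refine(llm, source, rounds=3):
    """Translate once, then repeatedly ask the model to polish its own
    output. `llm` is any callable mapping a prompt string to a completion
    string (hypothetical interface)."""
    translation = llm(f"Translate into English:\n{source}")
    for _ in range(rounds):
        translation = llm(
            "Improve the fluency of this translation without changing "
            f"its meaning.\nSource: {source}\nTranslation: {translation}"
        )
    return translation
```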
1 code implementation • 23 May 2023 • Laurie Burchell, Alexandra Birch, Nikolay Bogoychev, Kenneth Heafield
We achieve this by training on a curated dataset of monolingual data, the reliability of which we ensure by auditing a sample from each source and each language manually.
no code implementations • 31 Aug 2022 • Marcos Treviso, Ji-Ung Lee, Tianchu Ji, Betty van Aken, Qingqing Cao, Manuel R. Ciosici, Michael Hassid, Kenneth Heafield, Sara Hooker, Colin Raffel, Pedro H. Martins, André F. T. Martins, Jessica Zosa Forde, Peter Milder, Edwin Simpson, Noam Slonim, Jesse Dodge, Emma Strubell, Niranjan Balasubramanian, Leon Derczynski, Iryna Gurevych, Roy Schwartz
Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows.
4 code implementations • Meta AI 2022 • NLLB team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang
Driven by the goal of eradicating language barriers on a global scale, machine translation has solidified itself as a key focus of artificial intelligence research today.
Ranked #1 on Machine Translation on IWSLT2015 English-Vietnamese (SacreBLEU metric)
1 code implementation • DeepLo 2022 • Laurie Burchell, Alexandra Birch, Kenneth Heafield
We also find evidence that lexical diversity is more important than syntactic for back translation performance.
no code implementations • EMNLP (ACL) 2021 • Nikolay Bogoychev, Jelmer Van der Linde, Kenneth Heafield
Every day, millions of people sacrifice their privacy and browsing habits in exchange for online machine translation.
no code implementations • ACL 2021 • Adithya Renduchintala, Denise Diaz, Kenneth Heafield, Xian Li, Mona Diab
Is bias amplified when neural machine translation (NMT) models are optimized for speed and evaluated on generic test sets using BLEU?
no code implementations • 31 Dec 2020 • Alham Fikri Aji, Kenneth Heafield
This paper explores augmenting monolingual data for knowledge distillation in neural machine translation.
1 code implementation • 12 Aug 2020 • Pinzhen Chen, Kenneth Heafield
Chinese word segmentation has entered the deep learning era which greatly reduces the hassle of feature engineering.
Chinese Word Segmentation • Low-Resource Neural Machine Translation • +1
1 code implementation • AMTA 2020 • Tobias Domhan, Michael Denkowski, David Vilar, Xing Niu, Felix Hieber, Kenneth Heafield
We present Sockeye 2, a modernized and streamlined version of the Sockeye neural machine translation (NMT) toolkit.
1 code implementation • ACL 2020 • Pinzhen Chen, Nikolay Bogoychev, Kenneth Heafield, Faheem Kirefu
We present a novel method to extract parallel sentences from two monolingual corpora, using neural machine translation.
no code implementations • WS 2020 • Alham Fikri Aji, Kenneth Heafield
We empirically show that NMT models based on Transformer or RNN architecture can be compressed up to 4-bit precision without any noticeable quality degradation.
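For intuition, a generic uniform 4-bit quantiser for a weight tensor looks roughly like the sketch below; this is illustrative only and not necessarily the quantisation scheme used in the paper.

```python
import numpy as np

def quantize_4bit(weights):
    """Uniformly quantise a tensor to 16 levels (4 bits) with one
    per-tensor scale; a generic scheme for illustration."""
    scale = max(np.abs(weights).max(), 1e-8) / 7.0
    codes = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return codes, scale

def dequantize_4bit(codes, scale):
    return codes.astype(np.float32) * scale
```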
2 code implementations • ACL 2020 • Marta Bañón, Pinzhen Chen, Barry Haddow, Kenneth Heafield, Hieu Hoang, Miquel Esplà-Gomis, Mikel L. Forcada, Amir Kamran, Faheem Kirefu, Philipp Koehn, Sergio Ortiz Rojas, Leopoldo Pla Sempere, Gema Ramírez-Sánchez, Elsa Sarrías, Marek Strelec, Brian Thompson, William Waites, Dion Wiggins, Jaume Zaragoza
We report on methods to create the largest publicly available parallel corpora by crawling the web, using open source software.
no code implementations • WS 2020 • Kenneth Heafield, Hiroaki Hayashi, Yusuke Oda, Ioannis Konstas, Andrew Finch, Graham Neubig, Xian Li, Alexandra Birch
We describe the findings of the Fourth Workshop on Neural Generation and Translation, held in concert with the annual conference of the Association for Computational Linguistics (ACL 2020).
no code implementations • ACL 2020 • Alham Fikri Aji, Nikolay Bogoychev, Kenneth Heafield, Rico Sennrich
Transfer learning improves quality for low-resource machine translation, but it is unclear what exactly it transfers.
no code implementations • WS 2020 • Nikolay Bogoychev, Roman Grundkiewicz, Alham Fikri Aji, Maximiliana Behnke, Kenneth Heafield, Sidharth Kashyap, Emmanouil-Ioannis Farsarakis, Mateusz Chudyk
We participated in all tracks of the Workshop on Neural Generation and Translation 2020 Efficiency Shared Task: single-core CPU, multi-core CPU, and GPU.
no code implementations • IJCNLP 2019 • Alham Fikri Aji, Kenneth Heafield, Nikolay Bogoychev
One way to reduce network traffic in multi-node data-parallel stochastic gradient descent is to only exchange the largest gradients.
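The core idea can be sketched as top-k gradient sparsification with a local residual: only the largest-magnitude entries are exchanged, and the withheld remainder is added back at the next step. This NumPy sketch is a generic version, not the exact algorithm from the paper.

```python
import numpy as np

def sparsify_gradient(grad, residual, keep_ratio=0.01):
    """Return the sparse gradient to exchange and the residual kept locally."""
    grad = grad + residual                       # re-apply previously withheld updates
    k = max(1, int(grad.size * keep_ratio))
    threshold = np.partition(np.abs(grad).ravel(), -k)[-k]
    mask = np.abs(grad) >= threshold
    to_send = np.where(mask, grad, 0.0)          # largest entries, exchanged
    new_residual = np.where(mask, 0.0, grad)     # the rest, accumulated locally
    return to_send, new_residual
```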
no code implementations • WS 2019 • Anna Currey, Kenneth Heafield
An extension to zero-shot NMT is zero-resource NMT, which generates pseudo-parallel corpora using a zero-shot system and further trains the zero-shot system on that data.
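Schematically, the zero-resource setup translates monolingual target text with the zero-shot system to create pseudo-parallel pairs and then fine-tunes the same system on them. The sketch below uses hypothetical `translate` and `fine_tune` hooks purely to show the data flow.

```python
def zero_resource_round(zero_shot_model, monolingual_target, fine_tune):
    """One round of pseudo-parallel generation followed by fine-tuning.
    `zero_shot_model.translate` and `fine_tune` are hypothetical hooks."""
    pseudo_parallel = [
        (zero_shot_model.translate(sentence, direction="tgt-src"), sentence)
        for sentence in monolingual_target
    ]
    return fine_tune(zero_shot_model, pseudo_parallel)
```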
no code implementations • WS 2019 • Young Jin Kim, Marcin Junczys-Dowmunt, Hany Hassan, Alham Fikri Aji, Kenneth Heafield, Roman Grundkiewicz, Nikolay Bogoychev
Taking our dominating submissions to the previous edition of the shared task as a starting point, we develop improved teacher-student training via multi-agent dual-learning and noisy backward-forward translation for Transformer-based student models.
no code implementations • 13 Sep 2019 • Alham Fikri Aji, Kenneth Heafield
We empirically show that NMT models based on Transformer or RNN architecture can be compressed up to 4-bit precision without any noticeable quality degradation.
no code implementations • WS 2019 • Anna Currey, Kenneth Heafield
Transformer-based neural machine translation (NMT) has recently achieved state-of-the-art performance on many machine translation tasks.
1 code implementation • WS 2019 • Roman Grundkiewicz, Marcin Junczys-Dowmunt, Kenneth Heafield
Considerable effort has been made to address the data sparsity problem in neural grammatical error correction.
Ranked #9 on Grammatical Error Correction on BEA-2019 (test)
no code implementations • WS 2019 • Alham Fikri Aji, Kenneth Heafield
Asynchronous stochastic gradient descent (SGD) is attractive from a speed perspective because workers do not wait for synchronization.
no code implementations • WS 2018 • Barry Haddow, Nikolay Bogoychev, Denis Emelin, Ulrich Germann, Roman Grundkiewicz, Kenneth Heafield, Antonio Valerio Miceli Barone, Rico Sennrich
The University of Edinburgh made submissions to all 14 language pairs in the news translation task, with strong performances in most pairs.
no code implementations • WS 2018 • Philipp Koehn, Huda Khayrallah, Kenneth Heafield, Mikel L. Forcada
We posed the shared task of assigning sentence-level quality scores for a very noisy corpus of sentence pairs crawled from the web, with the goal of sub-selecting 1% and 10% of high-quality data to be used to train machine translation systems.
no code implementations • EMNLP 2018 • Anna Currey, Kenneth Heafield
We introduce a novel multi-source technique for incorporating source syntax into neural machine translation using linearized parses.
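As a toy illustration of what a linearized parse looks like as encoder input, a bracketed constituency tree can be flattened into a flat token sequence; the particular linearization below (labelled closing brackets, terminals kept) is one possible choice, not necessarily the paper's preprocessing.

```python
def linearize_parse(tree: str) -> list[str]:
    """'(S (NP I) (VP like (NP tea)))' ->
    ['(S', '(NP', 'I', ')NP', '(VP', 'like', '(NP', 'tea', ')NP', ')VP', ')S']"""
    tokens, stack = [], []
    for tok in tree.replace("(", " ( ").replace(")", " ) ").split():
        if tok == "(":
            stack.append(None)            # label arrives with the next token
        elif tok == ")":
            tokens.append(")" + stack.pop())
        elif stack and stack[-1] is None:
            stack[-1] = tok               # constituent label, e.g. NP
            tokens.append("(" + tok)
        else:
            tokens.append(tok)            # terminal word
    return tokens
```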
no code implementations • EMNLP 2018 • Nikolay Bogoychev, Marcin Junczys-Dowmunt, Kenneth Heafield, Alham Fikri Aji
In order to extract the best possible performance from asynchronous stochastic gradient descent one must increase the mini-batch size and scale the learning rate accordingly.
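The scaling heuristic referred to here is commonly the linear rule: multiply the learning rate by the same factor as the mini-batch size (the numbers below are purely illustrative, not the paper's settings).

```python
def scaled_learning_rate(base_lr: float, batch_size: int, base_batch: int) -> float:
    """Linear scaling rule: learning rate grows in proportion to batch size."""
    return base_lr * batch_size / base_batch

# e.g. a rate tuned for 2,000-token batches, scaled up for 8,000-token batches
print(scaled_learning_rate(0.0003, 8000, 2000))  # 0.0012
```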
1 code implementation • WS 2018 • Roman Grundkiewicz, Kenneth Heafield
Transliterating named entities from one language into another can be approached as a neural machine translation (NMT) problem, for which we use deep attentional RNN encoder-decoder models.
no code implementations • WS 2018 • Anna Currey, Kenneth Heafield
Incorporating source syntactic information into neural machine translation (NMT) has recently proven successful (Eriguchi et al., 2016; Luong et al., 2016).
Low-Resource Neural Machine Translation • Natural Language Inference • +3
no code implementations • WS 2018 • Marcin Junczys-Dowmunt, Kenneth Heafield, Hieu Hoang, Roman Grundkiewicz, Anthony Aue
This paper describes the submissions of the "Marian" team to the WNMT 2018 shared task.
no code implementations • WS 2018 • Hieu Hoang, Tomasz Dwojak, Rihards Krislauks, Daniel Torregrosa, Kenneth Heafield
This paper describes the submissions to the efficiency track for GPUs at the Workshop for Neural Machine Translation and Generation by members of the University of Edinburgh, Adam Mickiewicz University, Tilde and University of Alicante.
no code implementations • 5 May 2018 • Robert Lim, Kenneth Heafield, Hieu Hoang, Mark Briers, Allen Malony
Neural machine translation (NMT) based on deep neural networks has overtaken statistical approaches, driven by the plethora and programmability of commodity heterogeneous computing architectures such as FPGAs and GPUs, and by the massive amount of training corpora generated by news outlets, government agencies and social media.
1 code implementation • NAACL 2018 • Marcin Junczys-Dowmunt, Roman Grundkiewicz, Shubha Guha, Kenneth Heafield
Previously, neural methods in grammatical error correction (GEC) did not reach state-of-the-art results compared to phrase-based statistical machine translation (SMT) baselines.
Ranked #1 on Grammatical Error Correction (Restricted)
2 code implementations • ACL 2018 • Marcin Junczys-Dowmunt, Roman Grundkiewicz, Tomasz Dwojak, Hieu Hoang, Kenneth Heafield, Tom Neckermann, Frank Seide, Ulrich Germann, Alham Fikri Aji, Nikolay Bogoychev, André F. T. Martins, Alexandra Birch
We present Marian, an efficient and self-contained Neural Machine Translation framework with an integrated automatic differentiation engine based on dynamic computation graphs.
no code implementations • WS 2017 • Rico Sennrich, Alexandra Birch, Anna Currey, Ulrich Germann, Barry Haddow, Kenneth Heafield, Antonio Valerio Miceli Barone, Philip Williams
This paper describes the University of Edinburgh's submissions to the WMT17 shared news translation and biomedical translation tasks.
no code implementations • EMNLP 2017 • Alham Fikri Aji, Kenneth Heafield
Most configurations work on MNIST, whereas different configurations reduce convergence rate on the more complex translation task.
no code implementations • LREC 2014 • Christian Buck, Kenneth Heafield, Bas van Ooyen
We contribute 5-gram counts and language models trained on the Common Crawl corpus, a collection of over 9 billion web pages.
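Such n-gram models are typically queried with KenLM; a minimal sketch using the `kenlm` Python bindings, with a made-up filename standing in for one of the released models.

```python
import kenlm  # pip install kenlm

# Illustrative path; substitute a downloaded Common Crawl language model.
model = kenlm.Model("en.commoncrawl.binary")

sentence = "language models trained on web text"
print(model.score(sentence, bos=True, eos=True))  # total log10 probability
print(model.perplexity(sentence))
```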