1 code implementation • EMNLP 2018 • Gongbo Tang, Mathias Müller, Annette Rios, Rico Sennrich
Recently, non-recurrent architectures (convolutional, self-attentional) have outperformed RNNs in neural machine translation.
1 code implementation • WS 2017 • Robert Östling, Yves Scherrer, Jörg Tiedemann, Gongbo Tang, Tommi Nieminen
We also discuss our submissions for English--Latvian, English--Chinese and Chinese--English.
1 code implementation • COLING 2018 • Gongbo Tang, Fabienne Cap, Eva Pettersson, Joakim Nivre
In this paper, we apply different NMT models to the problem of historical spelling normalization for five languages: English, German, Hungarian, Icelandic, and Swedish.
no code implementations • WS 2018 • Gongbo Tang, Rico Sennrich, Joakim Nivre
Recent work has shown that the encoder-decoder attention mechanisms in neural machine translation (NMT) are different from the word alignment in statistical machine translation.
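To make the distinction concrete, here is a toy sketch (not the paper's method; the attention matrix and sentence pair are invented): a soft encoder-decoder attention distribution spreads probability mass over all source words, whereas a statistical word alignment commits to a single source position per target word, which one can mimic by taking the argmax of each attention row.

```python
# Toy sentence pair and attention matrix, made up purely for illustration.
src = ["das", "Haus", "ist", "klein"]
tgt = ["the", "house", "is", "small"]

# attention[t][s]: weight the decoder places on source word s at target step t.
attention = [
    [0.70, 0.10, 0.10, 0.10],
    [0.15, 0.60, 0.15, 0.10],
    [0.10, 0.20, 0.55, 0.15],
    [0.05, 0.15, 0.20, 0.60],
]

# A hard, alignment-style view keeps only the argmax source position per
# target word, discarding the mass the soft attention spreads elsewhere.
alignment = [max(range(len(row)), key=row.__getitem__) for row in attention]

for t, s in enumerate(alignment):
    print(f"{tgt[t]} -> {src[s]}")
```

The information thrown away by the argmax (here, 30-45% of each row's mass) is exactly what makes attention distributions behave differently from classical alignments.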
no code implementations • RANLP 2019 • Gongbo Tang, Rico Sennrich, Joakim Nivre
In this paper, we try to understand neural machine translation (NMT) by simplifying NMT architectures and training encoder-free NMT models.
no code implementations • IJCNLP 2019 • Gongbo Tang, Rico Sennrich, Joakim Nivre
We find that encoder hidden states significantly outperform word embeddings, which indicates that encoders adequately encode the information relevant for disambiguation into their hidden states.
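The intuition can be sketched with invented vectors (this is an illustration, not the paper's experiment): a static word embedding assigns an ambiguous word like "bank" one vector regardless of context, while encoder hidden states are context-dependent and can separate the senses.

```python
import math

def cos(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# A static embedding: one vector for "bank" in every sentence (values invented).
static_bank = [0.5, 0.5]

# Encoder hidden states for the same word in two contexts (values invented).
hidden_bank_river = [0.9, 0.1]   # "the bank of the river"
hidden_bank_money = [0.1, 0.9]   # "deposit money at the bank"

# Idealised sense prototypes for the two readings.
sense_river = [1.0, 0.0]
sense_money = [0.0, 1.0]

# The static vector is equally close to both senses; the context-dependent
# hidden states each lean toward the correct sense.
```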
no code implementations • COLING 2020 • Gongbo Tang, Rico Sennrich, Joakim Nivre
The attention distribution pattern shows that separators attract a lot of attention, so we explore a sparse word-level attention that encourages character hidden states to capture full word-level information.
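A minimal sketch of the idea, with invented attention scores (this is not the paper's model): in a dense character-level attention, the separator can attract the largest weight; a sparse word-level variant masks attention to the characters of one word, so no mass is spent on separators or other words.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

chars = list("a cat")          # character-level input; ' ' is a separator
word_of = [0, None, 1, 1, 1]   # word index per character (None = separator)

# Invented scores where the separator (index 1) scores highest, mirroring
# the observed pattern that separators attract a lot of attention.
scores = [1.0, 3.0, 1.5, 1.2, 1.3]

dense = softmax(scores)        # separator gets the largest dense weight

# Sparse word-level variant: attend only to the characters of word 1 ("cat"),
# masking out the separator and all other words before normalising.
mask = [w == 1 for w in word_of]
sparse = softmax([s for s, keep in zip(scores, mask) if keep])
```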
1 code implementation • 26 Jul 2021 • Gongbo Tang, Philipp Rönchen, Rico Sennrich, Joakim Nivre
In this paper, we evaluate the translation of negation both automatically and manually, in English--German (EN--DE) and English--Chinese (EN--ZH).
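One simple automatic check along these lines (a hypothetical heuristic sketch, not the paper's actual evaluation protocol; the cue list and examples are invented) is to compare the number of negation cues in a hypothesis translation against the reference, flagging translations where negation was dropped.

```python
# Hypothetical heuristic: count German negation cues in reference vs.
# hypothesis; a positive difference suggests a dropped negation.
NEG_CUES_DE = {"nicht", "kein", "keine", "nie", "niemals"}

def count_negation(tokens, cues):
    """Count tokens that are negation cues (case-insensitive)."""
    return sum(1 for t in tokens if t.lower() in cues)

reference = "Das ist nicht wahr".split()
hypothesis = "Das ist wahr".split()   # the MT system dropped the negation

missing = count_negation(reference, NEG_CUES_DE) - count_negation(hypothesis, NEG_CUES_DE)
```

Such cue counting is crude (it misses morphological and lexical negation), which is why manual evaluation is used alongside automatic checks.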
no code implementations • 28 May 2023 • Gongbo Tang, Christian Hardmeier
Coreference resolution is the task of finding expressions that refer to the same entity in a text.
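As a toy illustration of the task (not the paper's approach), an exact string-match baseline clusters mentions that share the same surface form; real systems must also link differing forms like "she" and "Curie" to the same entity, which this baseline cannot do.

```python
# Toy string-match coreference baseline: cluster mentions with identical
# lowercased surface forms. Mention list is invented for illustration.
mentions = ["Marie Curie", "she", "Curie", "the physicist", "Marie Curie"]

clusters = {}
for i, mention in enumerate(mentions):
    # Group mention indices by surface form.
    clusters.setdefault(mention.lower(), []).append(i)
```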