no code implementations • 18 Jan 2019 • Jasmijn Bastings, Wilker Aziz, Ivan Titov, Khalil Sima'an
Recently, it was shown that linguistic structure predicted by a supervised parser can be beneficial for neural machine translation (NMT).
1 code implementation • NAACL 2018 • Miguel Rios, Wilker Aziz, Khalil Sima'an
This work exploits translation data as a source of semantically relevant learning signal for models of word representation.
no code implementations • EMNLP 2017 • Jasmijn Bastings, Ivan Titov, Wilker Aziz, Diego Marcheggiani, Khalil Sima'an
We present a simple and effective approach to incorporating syntactic structure into neural attention-based encoder-decoder models for machine translation.
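The core idea of that paper can be sketched as a graph-convolutional (GCN) layer over a dependency tree: each word's representation is updated by aggregating transformed states of its syntactic neighbours. The sketch below is a deliberately simplified, pure-Python illustration under assumed names (`gcn_layer`, `W_self`, `W_neigh`); the paper's actual model additionally uses edge directions, dependency labels, and scalar gates, and stacks these layers inside an NMT encoder.

```python
# Minimal sketch of a syntactic GCN layer over a dependency tree.
# Illustrative only: the published model adds direction- and
# label-specific weights and edge gates; names here are assumptions.

def matvec(W, x):
    # Multiply a matrix (list of rows) by a vector.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

def relu(v):
    return [max(0.0, x) for x in v]

def gcn_layer(h, edges, W_self, W_neigh):
    """One GCN layer: each word combines its own state (self-loop)
    with messages from its syntactic head and dependents, then
    applies a nonlinearity."""
    out = []
    for i, hi in enumerate(h):
        agg = matvec(W_self, hi)           # self-loop term
        for head, dep in edges:
            if dep == i:                   # message from the word's head
                agg = vadd(agg, matvec(W_neigh, h[head]))
            elif head == i:                # message from a dependent
                agg = vadd(agg, matvec(W_neigh, h[dep]))
        out.append(relu(agg))
    return out

# Toy example: 3 words, word 1 is the head of words 0 and 2;
# identity weights make the aggregation easy to follow.
h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
edges = [(1, 0), (1, 2)]
I = [[1.0, 0.0], [0.0, 1.0]]
print(gcn_layer(h, edges, I, I))
# → [[1.0, 1.0], [2.0, 2.0], [1.0, 2.0]]
```

Stacking k such layers lets information flow between words up to k dependency arcs apart, which is how the encoder injects syntax before attention-based decoding.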
2 code implementations • WS 2016 • Desmond Elliott, Stella Frank, Khalil Sima'an, Lucia Specia
We introduce the Multi30K dataset to stimulate multilingual multimodal research.