no code implementations • AMTA 2016 • Hamidreza Ghader, Christof Monz
Lexicalized and hierarchical reordering models use relative frequencies of fully lexicalized phrase pairs to learn phrase reordering distributions.
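The relative-frequency estimation mentioned above can be sketched as follows. This is a minimal, hypothetical illustration (the phrase pairs and orientation labels are invented for the example, not taken from the paper): each extracted phrase-pair instance is observed with a reordering orientation, and the distribution is the normalized count per phrase pair.

```python
from collections import defaultdict

# Hypothetical extracted phrase-pair instances, each with an observed
# reordering orientation (monotone / swap / discontinuous), as in
# lexicalized reordering models for phrase-based SMT.
observations = [
    (("das Haus", "the house"), "monotone"),
    (("das Haus", "the house"), "monotone"),
    (("das Haus", "the house"), "swap"),
    (("nicht", "not"), "swap"),
]

# Count orientations per fully lexicalized phrase pair.
counts = defaultdict(lambda: defaultdict(int))
for phrase_pair, orientation in observations:
    counts[phrase_pair][orientation] += 1

# Relative frequency: p(orientation | phrase pair) = count / total for that pair.
probs = {
    pair: {o: c / sum(orients.values()) for o, c in orients.items()}
    for pair, orients in counts.items()
}
print(probs[("das Haus", "the house")])
```

Because the estimates condition on fully lexicalized phrase pairs, rare pairs receive unreliable distributions, which is the sparsity issue such models face.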
no code implementations • 4 Mar 2021 • Hamidreza Ghader
We investigate the extent to which syntactic and lexical-semantic information from the source side is captured by hidden state representations of different neural MT architectures.
no code implementations • 21 Jan 2020 • Tiphaine Viard, Thomas McLachlan, Hamidreza Ghader, Satoshi Sekine
Wikipedia is a huge opportunity for machine learning, being the largest semi-structured knowledge base available.

no code implementations • WS 2019 • Hamidreza Ghader, Christof Monz
We compare transformer and recurrent models in a more intrinsic way, in terms of how well they capture lexical semantics and syntactic structure, in contrast to the extrinsic evaluation approaches used in previous work.
no code implementations • IJCNLP 2017 • Hamidreza Ghader, Christof Monz
Thus, the question remains how attention is similar to, or different from, traditional word alignment.
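One simple way to probe this question can be sketched as follows. This is a hypothetical illustration (the attention weights and alignment links are invented for the example): given a soft attention matrix and a hard word alignment, measure how much attention mass each target word places on its aligned source word(s).

```python
# attention[t][s]: attention weight of target word t on source word s
# (rows sum to 1, as produced by a softmax over source positions).
attention = [
    [0.7, 0.2, 0.1],
    [0.1, 0.8, 0.1],
    [0.3, 0.3, 0.4],
]

# Hard word alignment: target index -> set of aligned source indices.
alignment = {0: {0}, 1: {1}, 2: {2}}

# Fraction of attention mass falling on aligned source words, per target word.
mass_on_aligned = [
    sum(row[s] for s in alignment[t]) for t, row in enumerate(attention)
]
print(mass_on_aligned)  # high values mean attention behaves like alignment
```

If this mass is consistently high, attention concentrates on aligned words much like a traditional alignment; low values indicate attention is distributing weight over other, possibly contextually relevant, source words.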