no code implementations • LREC 2022 • Prajit Dhar, Arianna Bisazza, Gertjan van Noord
We conduct our evaluation on four typologically diverse morphologically rich target languages (MRLs), and find that PT-Inflect surpasses NMT systems trained only on parallel data.
no code implementations • WMT (EMNLP) 2020 • Prajit Dhar, Arianna Bisazza, Gertjan van Noord
This paper describes our submission for the English-Tamil news translation task of WMT-2020.
no code implementations • ACL (WAT) 2021 • Prajit Dhar, Arianna Bisazza, Gertjan van Noord
Dravidian languages, such as Kannada and Tamil, are notoriously difficult for state-of-the-art neural models to translate.
no code implementations • CoNLL (EMNLP) 2021 • Ekta Sood, Fabian Kögel, Florian Strohm, Prajit Dhar, Andreas Bulling
We present VQA-MHUG, a novel 49-participant dataset of multimodal human gaze on both images and questions during visual question answering (VQA), collected using a high-speed eye tracker.
no code implementations • NoDaLiDa 2021 • Prajit Dhar, Arianna Bisazza
It is now established that modern neural language models can be successfully trained on multiple languages simultaneously without changes to the underlying architecture.
1 code implementation • WS 2019 • Prajit Dhar, Lonneke van der Plas
We introduce temporally and contextually aware models for the novel task of predicting unseen but plausible concepts, as conveyed by noun-noun compounds in a time-stamped corpus.
1 code implementation • WS 2019 • Prajit Dhar, Janis Pagel, Lonneke van der Plas
We present work in progress on the temporal progression of compositionality in noun-noun compounds.
no code implementations • WS 2018 • Prajit Dhar, Arianna Bisazza
Recent work has shown that neural models can be successfully trained on multiple languages simultaneously.