no code implementations • EAMT 2022 • Nishant Kambhatla, Logan Born, Anoop Sarkar
We propose a novel technique that combines alternative subword tokenizations of a single source-target language pair, allowing us to leverage multilingual neural translation training methods.
1 code implementation • ACL 2022 • Nishant Kambhatla, Logan Born, Anoop Sarkar
We propose a novel data-augmentation technique for neural machine translation based on ROT-$k$ ciphertexts.
Ranked #9 on Machine Translation on IWSLT2014 German-English
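The ROT-$k$ cipher used for this augmentation is the standard generalization of the Caesar cipher: each letter is rotated $k$ positions through the alphabet, leaving non-alphabetic characters untouched. A minimal sketch (the function name `rot_k` is illustrative and not taken from the paper's code; the paper's full augmentation pipeline is not shown here):

```python
import string

def rot_k(text: str, k: int) -> str:
    """Rotate each alphabetic character k positions through the alphabet,
    preserving case and leaving other characters (spaces, digits) as-is."""
    shift = k % 26
    lower = string.ascii_lowercase
    upper = string.ascii_uppercase
    table = str.maketrans(
        lower + upper,
        lower[shift:] + lower[:shift] + upper[shift:] + upper[:shift],
    )
    return text.translate(table)

print(rot_k("Guten Morgen", 1))  # prints "Hvufo Npshfo"
```

Enciphering the source side with several values of $k$ yields multiple distinct "views" of the same parallel sentence pair; applying `rot_k` twice with $k$ and $26-k$ recovers the original text.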
no code implementations • EACL 2021 • Pooya Moradi, Nishant Kambhatla, Anoop Sarkar
While the attention heatmaps produced by neural machine translation (NMT) models seem insightful, there is little evidence that they reflect a model's true internal reasoning.
no code implementations • AACL 2020 • Pooya Moradi, Nishant Kambhatla, Anoop Sarkar
Can we trust that the attention heatmaps produced by a neural machine translation (NMT) model reflect its true internal reasoning?
1 code implementation • WS 2019 • Pooya Moradi, Nishant Kambhatla, Anoop Sarkar
Attention models have become a crucial component in neural machine translation (NMT).
1 code implementation • WS 2019 • Logan Born, Kate Kelley, Nishant Kambhatla, Carolyn Chen, Anoop Sarkar
We describe a first attempt at using techniques from computational linguistics to analyze the undeciphered proto-Elamite script.
no code implementations • WS 2018 • Zhelun Wu, Nishant Kambhatla, Anoop Sarkar
Automated filters are commonly used by online services to stop users from sending age-inappropriate or bullying messages, or from asking others to expose personal information.