Search Results for author: Adithya Renduchintala

Found 19 papers, 2 papers with code

Investigating Failures of Automatic Translation in the Case of Unambiguous Gender

no code implementations 16 Apr 2021 Adithya Renduchintala, Adina Williams

Transformer-based models are the modern workhorses of neural machine translation (NMT), reaching state-of-the-art performance across several benchmarks.

Machine Translation · Translation

Multilingual Neural Machine Translation with Deep Encoder and Multiple Shallow Decoders

no code implementations EACL 2021 Xiang Kong, Adithya Renduchintala, James Cross, Yuqing Tang, Jiatao Gu, Xian Li

Recent work in multilingual translation has improved translation quality beyond bilingual baselines by using deep transformer models with increased capacity.

Machine Translation · Translation

Towards Understanding the Behaviors of Optimal Deep Active Learning Algorithms

1 code implementation 29 Dec 2020 Yilun Zhou, Adithya Renduchintala, Xian Li, Sida Wang, Yashar Mehdad, Asish Ghoshal

Active learning (AL) algorithms may achieve better performance with less labeled data because the model guides the data selection process.

Active Learning
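The idea of letting the model guide data selection can be illustrated with a minimal uncertainty-sampling sketch. This is not code from the paper; the pool probabilities and function name below are invented for illustration:

```python
import numpy as np

def uncertainty_sample(probs, k):
    """Pick the k pool examples whose predicted class distribution has the
    highest entropy, i.e. where the model is least certain."""
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
    return np.argsort(-entropy)[:k]

# Toy pool of 5 unlabeled examples with the model's class probabilities.
pool_probs = np.array([
    [0.98, 0.02],  # very confident
    [0.55, 0.45],  # uncertain
    [0.90, 0.10],
    [0.50, 0.50],  # most uncertain
    [0.80, 0.20],
])
picked = uncertainty_sample(pool_probs, k=2)
print(picked)  # → [3 1], the two most uncertain examples
```

Uncertainty sampling is only one possible acquisition function; the paper studies the behavior of optimal deep AL algorithms more broadly.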

An Exploratory Study on Multilingual Quality Estimation

no code implementations Asian Chapter of the Association for Computational Linguistics 2020 Shuo Sun, Marina Fomicheva, Frédéric Blain, Vishrav Chaudhary, Ahmed El-Kishky, Adithya Renduchintala, Francisco Guzmán, Lucia Specia

Predicting the quality of machine translation has traditionally been addressed with language-specific models, under the assumption that the quality label distribution or linguistic features exhibit traits that are not shared across languages.

Machine Translation · Translation

Spelling-Aware Construction of Macaronic Texts for Teaching Foreign-Language Vocabulary

no code implementations IJCNLP 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a machine foreign-language teacher that modifies text in a student's native language (L1) by replacing some word tokens with glosses in a foreign language (L2), in such a way that the student can acquire L2 vocabulary simply by reading the resulting macaronic text.

Language Modelling

Simple Construction of Mixed-Language Texts for Vocabulary Learning

no code implementations WS 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We accomplish this by modifying a cloze language model to incrementally learn new vocabulary items, and use this language model as a proxy for the word guessing and learning ability of real students.

Language Modelling
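A cloze model predicts a blanked-out word from its surrounding context. A minimal count-based sketch (the trigram counting here is an invented stand-in for the paper's actual model) shows how such a model could serve as a proxy for a student's word guessing:

```python
from collections import Counter

def train_cloze(sentences):
    """Toy cloze model: count (left, word, right) trigrams so a candidate
    word can be scored by how often it fills the same context."""
    counts = Counter()
    for sent in sentences:
        toks = ["<s>"] + sent.split() + ["</s>"]
        for left, word, right in zip(toks, toks[1:], toks[2:]):
            counts[(left, word, right)] += 1
    return counts

def guess(counts, left, right, candidates):
    """Return the candidate the model would 'guess' for the blank."""
    return max(candidates, key=lambda w: counts[(left, w, right)])

model = train_cloze(["the cat sat", "the dog sat", "the cat ran"])
print(guess(model, "the", "sat", ["cat", "dog", "ran"]))  # → cat
```

The paper's contribution is to make such a model incrementally learn new vocabulary items, so it can stand in for real students' learning ability.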

A Call for Prudent Choice of Subword Merge Operations in Neural Machine Translation

no code implementations WS 2019 Shuoyang Ding, Adithya Renduchintala, Kevin Duh

Most neural machine translation systems are built upon subword units extracted by methods such as Byte-Pair Encoding (BPE) or wordpiece.

Machine Translation Translation
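The BPE extraction the abstract refers to can be sketched in a few lines: repeatedly count adjacent symbol pairs in a toy corpus and merge the most frequent one. This is a simplified illustration, not the tooling used in the paper:

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merge operations from a toy word list by repeatedly
    merging the most frequent adjacent symbol pair."""
    vocab = Counter(tuple(w) for w in words)  # words as symbol tuples
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge to every word in the vocabulary.
        new_vocab = Counter()
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

print(bpe_merges(["lower", "lowest", "low", "low"], 2))
# → [('l', 'o'), ('lo', 'w')]
```

The number of merge operations fixes the subword vocabulary size, which is exactly the hyperparameter whose choice the paper urges prudence about.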

Character-Aware Decoder for Translation into Morphologically Rich Languages

no code implementations WS 2019 Adithya Renduchintala, Pamela Shapiro, Kevin Duh, Philipp Koehn

Neural machine translation (NMT) systems operate primarily on words (or sub-words), ignoring lower-level patterns of morphology.

Machine Translation · Translation

Multi-Modal Data Augmentation for End-to-End ASR

no code implementations 27 Mar 2018 Adithya Renduchintala, Shuoyang Ding, Matthew Wiesner, Shinji Watanabe

We present a new end-to-end architecture for automatic speech recognition (ASR) that can be trained using symbolic input in addition to the traditional acoustic input.

Automatic Speech Recognition · Data Augmentation · +3

Knowledge Tracing in Sequential Learning of Inflected Vocabulary

no code implementations CONLL 2017 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a feature-rich knowledge tracing method that captures a student's acquisition and retention of knowledge during a foreign language phrase learning task.

Knowledge Tracing · Structured Prediction
