Search Results for author: Adithya Renduchintala

Found 22 papers, 2 papers with code

The JHU/KyotoU Speech Translation System for IWSLT 2018

no code implementations IWSLT (EMNLP) 2018 Hirofumi Inaguma, Xuan Zhang, Zhiqi Wang, Adithya Renduchintala, Shinji Watanabe, Kevin Duh

This paper describes the Johns Hopkins University (JHU) and Kyoto University submissions to the Speech Translation evaluation campaign at IWSLT2018.

Transfer Learning, Translation

Empowering Federated Learning for Massive Models with NVIDIA FLARE

no code implementations 12 Feb 2024 Holger R. Roth, Ziyue Xu, Yuan-Ting Hsieh, Adithya Renduchintala, Isaac Yang, Zhihong Zhang, Yuhong Wen, Sean Yang, Kevin Lu, Kristopher Kersten, Camir Ricketts, Daguang Xu, Chester Chen, Yan Cheng, Andrew Feng

In the ever-evolving landscape of artificial intelligence (AI) and large language models (LLMs), handling and leveraging data effectively has become a critical challenge.

Federated Learning

Tied-LoRA: Enhancing parameter efficiency of LoRA with weight tying

no code implementations16 Nov 2023 Adithya Renduchintala, Tugrul Konuk, Oleksii Kuchaiev

We propose Tied-LoRA, a simple paradigm that utilizes weight tying and selective training to further increase the parameter efficiency of the Low-rank adaptation (LoRA) method.
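To illustrate the weight-tying idea described above, here is a minimal sketch (not the authors' implementation; the class name TiedLoRALinear, the rank, and all hyperparameters are assumptions) in which one shared pair of low-rank factors is reused by every adapted layer instead of giving each layer its own LoRA matrices:

```python
# Sketch of weight-tied LoRA (illustrative only, not the paper's code).
# Assumption: one shared pair of low-rank factors (A, B) is reused by every
# adapted linear layer, instead of a per-layer A/B as in vanilla LoRA.
import torch
import torch.nn as nn

class TiedLoRALinear(nn.Module):  # hypothetical class name
    def __init__(self, base: nn.Linear, shared_A: nn.Parameter,
                 shared_B: nn.Parameter, alpha: float = 16.0):
        super().__init__()
        self.base = base                        # frozen pretrained projection
        for p in self.base.parameters():
            p.requires_grad_(False)
        self.A, self.B = shared_A, shared_B     # tied across all layers
        self.scale = alpha / shared_A.shape[0]

    def forward(self, x):
        # y = W x + scale * B A x ; only the shared A and B receive gradients
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

r, d = 8, 1024                                   # assumed rank and hidden size
shared_A = nn.Parameter(torch.randn(r, d) * 0.01)
shared_B = nn.Parameter(torch.zeros(d, r))       # zero-init: training starts at W
layer1 = TiedLoRALinear(nn.Linear(d, d), shared_A, shared_B)
layer2 = TiedLoRALinear(nn.Linear(d, d), shared_A, shared_B)  # same A, B reused
```

Because A and B are shared, the number of trainable parameters no longer grows with network depth, which is where the additional parameter efficiency comes from in this sketch.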

Multilingual Neural Machine Translation with Deep Encoder and Multiple Shallow Decoders

no code implementations EACL 2021 Xiang Kong, Adithya Renduchintala, James Cross, Yuqing Tang, Jiatao Gu, Xian Li

Recent work in multilingual translation advances translation quality, surpassing bilingual baselines by using deep Transformer models with increased capacity.

Machine Translation, Translation

Investigating Failures of Automatic Translation in the Case of Unambiguous Gender

no code implementations ACL 2022 Adithya Renduchintala, Adina Williams

Transformer-based models are the modern workhorses of neural machine translation (NMT), reaching state-of-the-art performance across several benchmarks.

Machine Translation, NMT +1

Towards Understanding the Behaviors of Optimal Deep Active Learning Algorithms

1 code implementation 29 Dec 2020 Yilun Zhou, Adithya Renduchintala, Xian Li, Sida Wang, Yashar Mehdad, Asish Ghoshal

Active learning (AL) algorithms may achieve better performance with less data because the model guides the data selection process.

Active Learning
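To make "the model guides the data selection process" concrete, below is a generic pool-based active-learning loop with uncertainty sampling; this is a sketch under common AL conventions, not the specific algorithms analyzed in the paper, and the function name and scikit-learn-style model interface are assumptions.

```python
# Generic pool-based active learning with uncertainty sampling (sketch only).
import numpy as np

def active_learning_loop(model, X_pool, y_pool, X_init, y_init,
                         rounds=10, batch=32):
    X_pool = np.asarray(X_pool)
    X_train, y_train = list(X_init), list(y_init)
    pool_idx = list(range(len(X_pool)))
    for _ in range(rounds):
        model.fit(np.array(X_train), np.array(y_train))
        # The current model guides selection: query the least-confident points.
        probs = model.predict_proba(X_pool[pool_idx])
        uncertainty = 1.0 - probs.max(axis=1)
        picked = np.argsort(-uncertainty)[:batch]
        for i in sorted(picked, reverse=True):
            j = pool_idx.pop(int(i))
            X_train.append(X_pool[j])
            y_train.append(y_pool[j])      # labels come from the oracle
        if not pool_idx:
            break
    return model
```

Each round retrains on the labeled set and spends the labeling budget only on examples the model is least certain about, which is why AL can match the performance of random sampling with fewer labeled examples.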

An Exploratory Study on Multilingual Quality Estimation

no code implementations AACL 2020 Shuo Sun, Marina Fomicheva, Frédéric Blain, Vishrav Chaudhary, Ahmed El-Kishky, Adithya Renduchintala, Francisco Guzmán, Lucia Specia

Predicting the quality of machine translation has traditionally been addressed with language-specific models, under the assumption that the quality label distribution or linguistic features exhibit traits that are not shared across languages.

Machine Translation, Translation

Spelling-Aware Construction of Macaronic Texts for Teaching Foreign-Language Vocabulary

no code implementations IJCNLP 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a machine foreign-language teacher that modifies text in a student's native language (L1) by replacing some word tokens with glosses in a foreign language (L2), in such a way that the student can acquire L2 vocabulary simply by reading the resulting macaronic text.

Language Modelling
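A toy illustration of the basic replacement step described above (not the paper's method, which uses a spelling-aware model of the student to decide which tokens to replace); the gloss table, replacement rate, and function name are made up:

```python
# Toy macaronic text construction: replace some L1 tokens with L2 glosses.
# Illustrative only; the paper selects replacements with a student model.
import random

GLOSS = {'dog': 'Hund', 'house': 'Haus', 'reads': 'liest'}  # hypothetical L1->L2 table

def macaronic(sentence, replace_rate=0.3):
    out = []
    for tok in sentence.split():
        if tok.lower() in GLOSS and random.random() < replace_rate:
            out.append(GLOSS[tok.lower()])
        else:
            out.append(tok)
    return ' '.join(out)

print(macaronic('the dog reads in the house'))  # e.g. "the Hund reads in the Haus"
```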

Simple Construction of Mixed-Language Texts for Vocabulary Learning

no code implementations WS 2019 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We accomplish this by modifying a cloze language model to incrementally learn new vocabulary items, and use this language model as a proxy for the word guessing and learning ability of real students.

Language Modelling

A Call for Prudent Choice of Subword Merge Operations in Neural Machine Translation

no code implementations WS 2019 Shuoyang Ding, Adithya Renduchintala, Kevin Duh

Most neural machine translation systems are built upon subword units extracted by methods such as Byte-Pair Encoding (BPE) or wordpiece.

Machine Translation, Translation
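For reference, a minimal sketch of how BPE merge operations build subword units; the number of merges is the hyperparameter the paper's title calls for choosing prudently. The toy corpus, function name, and the naive string-replace merge step are illustrative simplifications.

```python
# Minimal BPE sketch: each merge fuses the most frequent adjacent symbol pair,
# so the number of merge operations controls how coarse the subwords become.
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    vocab = dict(word_freqs)          # keys are space-separated symbol strings
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            symbols = word.split()
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Naive replace is fine for this toy corpus; real implementations
        # merge symbol lists to avoid accidental substring matches.
        vocab = {w.replace(' '.join(best), ''.join(best)): f
                 for w, f in vocab.items()}
    return merges, vocab

corpus = {'l o w </w>': 5, 'l o w e r </w>': 2, 'l o w e s t </w>': 2}
print(learn_bpe(corpus, 3))  # after 3 merges, units like "low" emerge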

Character-Aware Decoder for Translation into Morphologically Rich Languages

no code implementations WS 2019 Adithya Renduchintala, Pamela Shapiro, Kevin Duh, Philipp Koehn

Neural machine translation (NMT) systems operate primarily on words (or sub-words), ignoring lower-level patterns of morphology.

Machine Translation, NMT +1

Multi-Modal Data Augmentation for End-to-End ASR

no code implementations27 Mar 2018 Adithya Renduchintala, Shuoyang Ding, Matthew Wiesner, Shinji Watanabe

We present a new end-to-end architecture for automatic speech recognition (ASR) that can be trained using symbolic input in addition to the traditional acoustic input.

Automatic Speech Recognition (ASR) +3

Knowledge Tracing in Sequential Learning of Inflected Vocabulary

no code implementations CONLL 2017 Adithya Renduchintala, Philipp Koehn, Jason Eisner

We present a feature-rich knowledge tracing method that captures a student's acquisition and retention of knowledge during a foreign language phrase learning task.

Knowledge Tracing, Structured Prediction
