no code implementations • NAACL 2022 • Sathish Reddy Indurthi, Mohd Abbas Zaidi, Beomseok Lee, Nikhil Kumar Lakumarapu, Sangha Kim
The state-of-the-art adaptive policies for Simultaneous Neural Machine Translation (SNMT) use monotonic attention to perform read/write decisions based on the partial source and target sequences.
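As a rough illustration only (not the paper's exact policy), an adaptive read/write decision can be thought of as thresholding a per-step "write" probability derived from monotonic attention energies: below the threshold the model READs another source token, otherwise it WRITEs a target token. The function name and threshold value here are hypothetical.

```python
def simulate_read_write(p_write, threshold=0.5):
    """Toy simultaneous-translation policy: for each decoding step,
    decide READ (consume a source token) or WRITE (emit a target token)
    by thresholding a write probability, e.g. from monotonic attention.
    """
    actions = []
    for p in p_write:
        actions.append('WRITE' if p >= threshold else 'READ')
    return actions

# Low write probability -> keep reading source; high -> emit output.
print(simulate_read_write([0.2, 0.7, 0.4, 0.9]))
# prints ['READ', 'WRITE', 'READ', 'WRITE']
```

In practice the write probability is conditioned on the partial source and target sequences, so the policy adapts its latency to the input rather than following a fixed schedule.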
no code implementations • 1 Jan 2021 • Sathish Reddy Indurthi, Mohd Abbas Zaidi, Nikhil Kumar Lakumarapu, Beomseok Lee, Hyojung Han, Sangha Kim, Inchul Hwang
Inspired by such learning patterns in humans, we suggest a simple yet generic task-aware framework that can be incorporated into existing joint learning strategies.
Automatic Speech Recognition (ASR) +4
no code implementations • WS 2020 • Hou Jeung Han, Mohd Abbas Zaidi, Sathish Reddy Indurthi, Nikhil Kumar Lakumarapu, Beomseok Lee, Sangha Kim
In this paper, we describe end-to-end simultaneous speech-to-text and text-to-text translation systems submitted to IWSLT2020 online translation challenge.
no code implementations • WS 2020 • Nikhil Kumar Lakumarapu, Beomseok Lee, Sathish Reddy Indurthi, Hou Jeung Han, Mohd Abbas Zaidi, Sangha Kim
In this paper, we describe the system submitted to the IWSLT 2020 Offline Speech Translation Task.
Ranked #3 on Speech-to-Text Translation on MuST-C EN->DE (using extra training data)
Automatic Speech Recognition (ASR) +6
no code implementations • 15 Feb 2020 • Chanwoo Kim, Kwangyoun Kim, Sathish Reddy Indurthi
More specifically, a time-frequency bin is masked if the filterbank energy in this bin is less than a certain energy threshold.
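A minimal sketch of the masking rule described above, assuming a simple hard zero-mask over a (time, frequency) array of filterbank energies; the paper's actual scheme may treat masked bins differently, and the function name and threshold are illustrative assumptions.

```python
import numpy as np

def energy_threshold_mask(filterbank_energies, threshold):
    """Zero out time-frequency bins whose filterbank energy falls
    below the given threshold.

    filterbank_energies: 2-D array of shape (time_frames, mel_bins)
    """
    keep = filterbank_energies >= threshold  # boolean mask per bin
    return filterbank_energies * keep

feats = np.array([[0.1, 2.0],
                  [3.0, 0.05]])
print(energy_threshold_mask(feats, threshold=0.5))
# prints [[0. 2.]
#         [3. 0.]]
```

The comparison broadcasts elementwise, so each bin is kept or dropped independently of its neighbors.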
no code implementations • ACL 2019 • Sathish Reddy Indurthi, Insoo Chung, Sangha Kim
Soft-attention based Neural Machine Translation (NMT) models have achieved promising results on several translation tasks.
no code implementations • EMNLP 2018 • Seohyun Back, Seunghak Yu, Sathish Reddy Indurthi, Jihie Kim, Jaegul Choo
Machine reading comprehension enables machines to draw on the vast body of human knowledge recorded as text.
Ranked #27 on Question Answering on TriviaQA (using extra training data)
no code implementations • EMNLP 2018 • Sathish Reddy Indurthi, Seunghak Yu, Seohyun Back, Heriberto Cuayáhuitl
In recent years, many deep neural networks have been proposed to solve Reading Comprehension (RC) tasks.
Ranked #4 on Question Answering on NarrativeQA
no code implementations • WS 2018 • Seunghak Yu, Sathish Reddy Indurthi, Seohyun Back, Haejun Lee
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing.
Ranked #69 on Question Answering on SQuAD1.1