Search Results for author: Insoo Chung

Found 4 papers, 0 papers with code

Look Harder: A Neural Machine Translation Model with Hard Attention

no code implementations ACL 2019 Sathish Reddy Indurthi, Insoo Chung, Sangha Kim

Soft-attention based Neural Machine Translation (NMT) models have achieved promising results on several translation tasks.

Tasks: Hard Attention, Machine Translation, +3 more

Data Efficient Direct Speech-to-Text Translation with Modality Agnostic Meta-Learning

no code implementations 11 Nov 2019 Sathish Indurthi, Houjeung Han, Nikhil Kumar Lakumarapu, Beomseok Lee, Insoo Chung, Sangha Kim, Chanwoo Kim

In the meta-learning phase, the parameters of the model are exposed to vast amounts of speech transcripts (e.g., English ASR) and text translations (e.g., English-German MT).

Tasks: Automatic Speech Recognition (ASR), +6 more

Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement

no code implementations WMT (EMNLP) 2021 Hyojung Han, Seokchan Ahn, Yoonjung Choi, Insoo Chung, Sangha Kim, Kyunghyun Cho

Recent work in simultaneous machine translation is often trained on conventional full-sentence translation corpora, leading to either excessive latency or the need to anticipate as-yet-unarrived words when dealing with a language pair whose word orders differ significantly.

Tasks: Machine Translation, Sentence, +2 more
