Search Results for author: Shuoyang Ding

Found 16 papers, 7 papers with code

Fine-Tuned Machine Translation Metrics Struggle in Unseen Domains

1 code implementation · 28 Feb 2024 · Vilém Zouhar, Shuoyang Ding, Anna Currey, Tatyana Badeka, Jenyuan Wang, Brian Thompson

We introduce a new, extensive multidimensional quality metrics (MQM) annotated dataset covering 11 language pairs in the biomedical domain.

Machine Translation · Translation

Levenshtein Training for Word-level Quality Estimation

1 code implementation · EMNLP 2021 · Shuoyang Ding, Marcin Junczys-Dowmunt, Matt Post, Philipp Koehn

We propose a novel scheme to use the Levenshtein Transformer to perform the task of word-level quality estimation.

Transfer Learning · Translation
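The Levenshtein Transformer itself is a full sequence model, but the word-level QE labels it targets can be illustrated with a plain edit-distance alignment: tag each MT word OK if it survives into the reference or post-edit, BAD otherwise. A minimal sketch, where the `word_qe_tags` helper and the difflib-based alignment are illustrative assumptions, not the paper's method:

```python
import difflib

def word_qe_tags(mt, ref):
    """Tag each MT word OK/BAD based on an edit alignment with a reference.
    Words inside a matching block survive the edit and are tagged OK."""
    mt_toks, ref_toks = mt.split(), ref.split()
    tags = ["BAD"] * len(mt_toks)
    sm = difflib.SequenceMatcher(a=mt_toks, b=ref_toks)
    for block in sm.get_matching_blocks():
        for i in range(block.a, block.a + block.size):
            tags[i] = "OK"
    return list(zip(mt_toks, tags))
```

A real QE system predicts these tags without seeing the reference; the sketch only shows how the gold labels are derived from edit operations.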

Evaluating Saliency Methods for Neural Language Models

1 code implementation · NAACL 2021 · Shuoyang Ding, Philipp Koehn

Saliency methods are widely used to interpret neural network predictions, but different variants of saliency methods often disagree even on the interpretations of the same prediction made by the same model.

Sentence
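The disagreement between saliency variants is easy to reproduce even on a toy function: plain gradient magnitude and gradient-times-input can rank the same features in opposite orders. A minimal numpy sketch; the finite-difference gradient and the toy function are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def grad(f, x, eps=1e-5):
    """Central-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        g[i] = (f(xp) - f(xm)) / (2 * eps)
    return g

def saliency_gradient(f, x):
    """Variant 1: |df/dx_i|, sensitivity of the output to each input."""
    return np.abs(grad(f, x))

def saliency_grad_x_input(f, x):
    """Variant 2: |x_i * df/dx_i|, gradient scaled by the input value."""
    return np.abs(x * grad(f, x))

# Same model, same input: the two variants pick different "most salient" features.
f = lambda x: 3.0 * x[0] + 0.5 * x[1]
x = np.array([0.1, 4.0])
```

Here variant 1 ranks feature 0 highest (gradient 3.0 vs 0.5), while variant 2 ranks feature 1 highest (0.3 vs 2.0), the kind of disagreement the paper evaluates systematically.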

Espresso: A Fast End-to-end Neural Speech Recognition Toolkit

1 code implementation · 18 Sep 2019 · Yiming Wang, Tongfei Chen, Hainan Xu, Shuoyang Ding, Hang Lv, Yiwen Shao, Nanyun Peng, Lei Xie, Shinji Watanabe, Sanjeev Khudanpur

We present Espresso, an open-source, modular, extensible end-to-end neural automatic speech recognition (ASR) toolkit based on the deep learning library PyTorch and the popular neural machine translation toolkit fairseq.

Automatic Speech Recognition · Automatic Speech Recognition (ASR) · +5

Saliency-driven Word Alignment Interpretation for Neural Machine Translation

1 code implementation · WS 2019 · Shuoyang Ding, Hainan Xu, Philipp Koehn

Despite their original goal to jointly learn to align and translate, Neural Machine Translation (NMT) models, especially Transformer, are often perceived as not learning interpretable word alignments.

Machine Translation · NMT · +2

A Call for Prudent Choice of Subword Merge Operations in Neural Machine Translation

no code implementations · WS 2019 · Shuoyang Ding, Adithya Renduchintala, Kevin Duh

Most neural machine translation systems are built upon subword units extracted by methods such as Byte-Pair Encoding (BPE) or wordpiece.

Machine Translation · Translation
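BPE's merge-learning loop is compact enough to sketch: count adjacent symbol pairs over the word vocabulary, merge the most frequent pair, and repeat for a chosen number of merge operations. A minimal sketch following the usual `</w>` end-of-word convention; the function name and corpus are illustrative, not the systems the paper studies:

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn Byte-Pair Encoding merges: repeatedly merge the most
    frequent adjacent symbol pair across the word vocabulary."""
    vocab = Counter(tuple(w) + ("</w>",) for w in words)
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in vocab.items():
            for a, b in zip(word, word[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        new_vocab = Counter()
        for word, freq in vocab.items():
            out, i = [], 0
            while i < len(word):
                if i < len(word) - 1 and (word[i], word[i + 1]) == best:
                    out.append(word[i] + word[i + 1])
                    i += 2
                else:
                    out.append(word[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges
```

The paper's point is that `num_merges` is a consequential hyperparameter, not a default to copy blindly.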

Parallelizable Stack Long Short-Term Memory

1 code implementation · WS 2019 · Shuoyang Ding, Philipp Koehn

Stack Long Short-Term Memory (StackLSTM) is useful for applications such as parsing and string-to-tree neural machine translation, but it is notoriously difficult to parallelize for GPU training because its computations depend on discrete stack operations.

Machine Translation · Translation
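The parallelization obstacle is that push and pop are discrete, data-dependent operations. One way around it, roughly in the spirit of a pointer-based stack, is to keep a fixed value buffer per batch element and move a top-of-stack pointer with plain arithmetic, so a whole batch advances in one vectorized step. A hedged numpy sketch of the idea, not the paper's actual implementation:

```python
import numpy as np

PUSH, POP, HOLD = 1, -1, 0

def batched_stack_step(buf, top, values, ops):
    """Advance a batch of stacks by one step. buf: (batch, depth) value
    buffer; top: (batch,) top-of-stack indices (-1 = empty); ops selects
    PUSH/POP/HOLD per element. Only index arithmetic and masked writes
    are needed, so the step runs in parallel across the batch."""
    batch = np.arange(buf.shape[0])
    pushing = ops == PUSH
    buf[batch[pushing], top[pushing] + 1] = values[pushing]  # write pushed values
    return buf, top + ops  # pointer moves +1 (push), -1 (pop), 0 (hold)
```

Reading the current stack top is then just `buf[batch, top]` for elements with `top >= 0`; no per-element Python-side branching is required.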

How Do Source-side Monolingual Word Embeddings Impact Neural Machine Translation?

no code implementations · 5 Jun 2018 · Shuoyang Ding, Kevin Duh

Using pre-trained word embeddings as the input layer is common practice in many natural language processing (NLP) tasks, but it is largely neglected in neural machine translation (NMT).

Machine Translation · NMT · +2
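The practice the paper examines, plugging pretrained monolingual vectors into the source-side input layer, amounts to initializing the embedding matrix from a word-to-vector table. A minimal numpy sketch; the helper name and the random-init fallback for out-of-table words are illustrative assumptions:

```python
import numpy as np

def build_embedding_matrix(vocab, pretrained, dim, seed=0):
    """Build an input embedding matrix for an NMT encoder: rows for words
    found in the pretrained table are copied in; the rest are randomly
    initialized (and typically fine-tuned during training)."""
    rng = np.random.default_rng(seed)
    emb = rng.normal(scale=0.1, size=(len(vocab), dim))
    for i, w in enumerate(vocab):
        if w in pretrained:
            emb[i] = pretrained[w]
    return emb
```

Whether such rows are then frozen or fine-tuned is exactly the kind of design choice whose impact the paper measures.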

Multi-Modal Data Augmentation for End-to-End ASR

no code implementations · 27 Mar 2018 · Adithya Renduchintala, Shuoyang Ding, Matthew Wiesner, Shinji Watanabe

We present a new end-to-end architecture for automatic speech recognition (ASR) that can be trained using symbolic input in addition to the traditional acoustic input.

Automatic Speech Recognition · Automatic Speech Recognition (ASR) · +3
