Search Results for author: Shuoyang Ding

Found 15 papers, 5 papers with code

Levenshtein Training for Word-level Quality Estimation

1 code implementation · EMNLP 2021 · Shuoyang Ding, Marcin Junczys-Dowmunt, Matt Post, Philipp Koehn

We propose a novel scheme to use the Levenshtein Transformer to perform the task of word-level quality estimation.

Transfer Learning · Translation
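The core idea behind word-level quality estimation as an edit-prediction task is to tag each machine-translated word OK or BAD according to the edit operations that turn the output into a correct translation. Below is a minimal sketch of that tagging scheme using Python's `difflib` on a toy sentence pair; it illustrates the labels only, not the paper's Levenshtein Transformer model.

```python
import difflib

def word_qe_tags(mt_words, ref_words):
    """Tag each MT word OK/BAD from word-level edit operations against a
    reference. A sketch of the labeling scheme, not the paper's model."""
    tags = ["BAD"] * len(mt_words)
    matcher = difflib.SequenceMatcher(a=mt_words, b=ref_words)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op == "equal":           # words kept by the edit sequence are OK
            for i in range(i1, i2):
                tags[i] = "OK"
    return tags

mt = "the cat sat on on mat".split()
ref = "the cat sat on the mat".split()
print(list(zip(mt, word_qe_tags(mt, ref))))
# the duplicated "on" is tagged BAD, all other words OK
```

A quality-estimation model then predicts these tags from the source and MT output alone, without seeing the reference.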

Evaluating Saliency Methods for Neural Language Models

1 code implementation · NAACL 2021 · Shuoyang Ding, Philipp Koehn

Saliency methods are widely used to interpret neural network predictions, but different variants of saliency methods often disagree even on the interpretations of the same prediction made by the same model.
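The disagreement between saliency variants is easy to reproduce even on a linear model, where plain gradient saliency and gradient-times-input saliency can rank the same features differently. A minimal illustrative sketch (toy weights and inputs, not the paper's setup):

```python
import numpy as np

# Toy linear "model": prediction = w . x
w = np.array([0.5, 2.0, -1.0])
x = np.array([4.0, 0.1, 1.0])

grad = w                           # d(prediction)/dx for a linear model
saliency_grad = np.abs(grad)       # plain gradient saliency
saliency_gxi = np.abs(grad * x)    # gradient * input saliency

print(np.argmax(saliency_grad))    # feature 1 looks most important
print(np.argmax(saliency_gxi))     # feature 0 looks most important
```

Two widely used saliency variants, same model, same prediction, different "most important" feature; evaluating which interpretation to trust is the question the paper studies.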

Espresso: A Fast End-to-end Neural Speech Recognition Toolkit

1 code implementation · 18 Sep 2019 · Yiming Wang, Tongfei Chen, Hainan Xu, Shuoyang Ding, Hang Lv, Yiwen Shao, Nanyun Peng, Lei Xie, Shinji Watanabe, Sanjeev Khudanpur

We present Espresso, an open-source, modular, extensible end-to-end neural automatic speech recognition (ASR) toolkit based on the deep learning library PyTorch and the popular neural machine translation toolkit fairseq.

Ranked #1 on Speech Recognition on Hub5'00 SwitchBoard (Eval2000 metric)

Automatic Speech Recognition · Data Augmentation · +2

Saliency-driven Word Alignment Interpretation for Neural Machine Translation

1 code implementation · WS 2019 · Shuoyang Ding, Hainan Xu, Philipp Koehn

Despite their original goal to jointly learn to align and translate, Neural Machine Translation (NMT) models, especially Transformer, are often perceived as not learning interpretable word alignments.

Machine Translation · Translation · +1

A Call for Prudent Choice of Subword Merge Operations in Neural Machine Translation

no code implementations · WS 2019 · Shuoyang Ding, Adithya Renduchintala, Kevin Duh

Most neural machine translation systems are built upon subword units extracted by methods such as Byte-Pair Encoding (BPE) or wordpiece.

Machine Translation · Translation
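The BPE subword extraction the paper studies is a simple greedy procedure: repeatedly count adjacent symbol pairs in a word-frequency table and merge the most frequent pair. A minimal sketch of the algorithm (toy word frequencies, not a production tokenizer):

```python
from collections import Counter

def bpe_merges(word_freqs, num_merges):
    """Learn BPE merge operations from a word-frequency dict.
    Minimal sketch of the algorithm, not a production tokenizer."""
    vocab = {tuple(w): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for sym, freq in vocab.items():
            for a, b in zip(sym, sym[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)   # most frequent adjacent pair
        merges.append(best)
        new_vocab = {}
        for sym, freq in vocab.items():    # apply the merge everywhere
            out, i = [], 0
            while i < len(sym):
                if i < len(sym) - 1 and (sym[i], sym[i + 1]) == best:
                    out.append(sym[i] + sym[i + 1])
                    i += 2
                else:
                    out.append(sym[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

print(bpe_merges({"lower": 5, "low": 3, "newest": 2}, 3))
# → [('l', 'o'), ('lo', 'w'), ('low', 'e')]
```

The number of merge operations (`num_merges`) is the hyperparameter whose choice the paper argues deserves prudence: it directly controls how coarse the resulting subword vocabulary is.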

Parallelizable Stack Long Short-Term Memory

1 code implementation · WS 2019 · Shuoyang Ding, Philipp Koehn

Stack Long Short-Term Memory (StackLSTM) is useful for applications such as parsing and string-to-tree neural machine translation, but it is notoriously difficult to parallelize for GPU training because its computations depend on discrete stack operations.

Machine Translation · Translation
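One way to see why discrete stack operations can still be batched is to represent push/pop as stack-pointer arithmetic on a preallocated buffer, so every sequence in a batch advances in lockstep with no per-example branching. A hypothetical NumPy sketch of that idea (not the paper's implementation):

```python
import numpy as np

# Sketch: encode stack ops as integers (push=+1, pop=-1, hold=0) and
# run a whole batch in lockstep via pointer arithmetic on one buffer.
batch, depth, dim = 2, 8, 3
buf = np.zeros((batch, depth, dim))
ptr = np.zeros(batch, dtype=int)            # stack-top index per example

def step(ops, values):
    """ops: (batch,) in {+1, -1, 0}; values: (batch, dim) to push."""
    global ptr
    push = ops == 1
    buf[push, ptr[push]] = values[push]     # write pushed values at the top
    ptr = np.clip(ptr + ops, 0, depth - 1)  # move every pointer at once

step(np.array([1, 1]), np.ones((2, 3)))
step(np.array([1, -1]), 2 * np.ones((2, 3)))
print(ptr)  # example 0 pushed twice, example 1 pushed then popped
```

Because each step is a single vectorized write plus a pointer update, examples with different push/pop patterns no longer force sequential, per-example control flow.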

Improving End-to-end Speech Recognition with Pronunciation-assisted Sub-word Modeling

no code implementations · 10 Nov 2018 · Hainan Xu, Shuoyang Ding, Shinji Watanabe

Most end-to-end speech recognition systems model text directly as a sequence of characters or sub-words.

Automatic Speech Recognition

How Do Source-side Monolingual Word Embeddings Impact Neural Machine Translation?

no code implementations · 5 Jun 2018 · Shuoyang Ding, Kevin Duh

Using pre-trained word embeddings as the input layer is common practice in many natural language processing (NLP) tasks, but it has been largely neglected for neural machine translation (NMT).

Machine Translation · Translation · +1
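The practice in question is simply to initialize the encoder's embedding matrix from monolingual pre-trained vectors before NMT training, leaving randomly initialized rows for words the pre-trained vocabulary misses. A minimal sketch with hypothetical toy vectors:

```python
import numpy as np

# Sketch: seed an NMT input embedding matrix from pre-trained
# monolingual vectors (toy 2-d vectors here, purely illustrative).
pretrained = {"the": np.array([0.1, 0.2]), "cat": np.array([0.3, 0.4])}
vocab = ["<pad>", "<unk>", "the", "cat", "sat"]
dim = 2

rng = np.random.default_rng(0)
emb = rng.normal(scale=0.01, size=(len(vocab), dim))  # random init
for i, word in enumerate(vocab):
    if word in pretrained:
        emb[i] = pretrained[word]   # overwrite with the pre-trained vector

print(emb[2])  # row for "the" now equals its pre-trained vector
```

Whether the seeded rows are then frozen or fine-tuned during NMT training is one of the design choices such a study has to weigh.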

Multi-Modal Data Augmentation for End-to-End ASR

no code implementations · 27 Mar 2018 · Adithya Renduchintala, Shuoyang Ding, Matthew Wiesner, Shinji Watanabe

We present a new end-to-end architecture for automatic speech recognition (ASR) that can be trained using symbolic input in addition to the traditional acoustic input.

Automatic Speech Recognition · Data Augmentation
