Search Results for author: David Wan

Found 5 papers, 2 papers with code

FactPEGASUS: Factuality-Aware Pre-training and Fine-tuning for Abstractive Summarization

1 code implementation • 16 May 2022 • David Wan, Mohit Bansal

We present FactPEGASUS, an abstractive summarization model that addresses the problem of factuality during pre-training and fine-tuning: (1) We augment the sentence selection strategy of PEGASUS's (Zhang et al., 2020) pre-training objective to create pseudo-summaries that are both important and factual; (2) We introduce three complementary components for fine-tuning.

Abstractive Text Summarization • Contrastive Learning
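The abstract above describes augmenting PEGASUS's sentence-selection objective so pseudo-summary sentences are both important and factual. A minimal sketch of that idea, assuming a unigram-overlap proxy for ROUGE-1 importance and a hypothetical `factuality_fn` stand-in for the trained factuality metric the paper uses:

```python
def rouge1_f1(candidate, reference):
    """Unigram-overlap F1, a lightweight proxy for ROUGE-1."""
    c, r = candidate.lower().split(), reference.lower().split()
    overlap = len(set(c) & set(r))
    if overlap == 0:
        return 0.0
    p, rec = overlap / len(c), overlap / len(r)
    return 2 * p * rec / (p + rec)

def select_pseudo_summary(sentences, factuality_fn, k=1):
    """Pick the top-k sentences by combined importance x factuality.

    `factuality_fn(sentence, context)` is a hypothetical scorer; the
    paper uses a learned factuality metric, not this placeholder.
    """
    scored = []
    for i, sent in enumerate(sentences):
        rest = " ".join(s for j, s in enumerate(sentences) if j != i)
        importance = rouge1_f1(sent, rest)  # PEGASUS-style importance
        scored.append((importance * factuality_fn(sent, rest), sent))
    scored.sort(reverse=True)
    return [sent for _, sent in scored[:k]]
```

With a factuality scorer that zeroes out a non-factual sentence, selection falls back to the most important remaining sentence, which is the behavior the combined objective is after.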

Segmenting Subtitles for Correcting ASR Segmentation Errors

no code implementations • EACL 2021 • David Wan, Chris Kedzie, Faisal Ladhak, Elsbeth Turcan, Petra Galuščáková, Elena Zotkina, Zhengping Jiang, Peter Bell, Kathleen McKeown

Typical ASR systems segment the input audio into utterances using purely acoustic information, which may not resemble the sentence-like units that are expected by conventional machine translation (MT) systems for Spoken Language Translation.

Information Retrieval • Machine Translation +1
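The mismatch described above can be illustrated with a simple re-segmentation loop: concatenate the acoustically segmented ASR utterances, then split wherever a boundary model predicts a sentence end. `boundary_score` here is a hypothetical classifier stand-in, not the paper's trained model:

```python
def resegment(asr_utterances, boundary_score, threshold=0.5):
    """Re-split acoustically segmented ASR output into sentence-like
    units for MT. `boundary_score(tokens, i)` is an assumed interface
    returning the probability that a sentence ends after token i."""
    tokens = [t for utt in asr_utterances for t in utt.split()]
    segments, current = [], []
    for i, tok in enumerate(tokens):
        current.append(tok)
        if boundary_score(tokens, i) >= threshold:
            segments.append(" ".join(current))
            current = []
    if current:  # flush any trailing tokens with no predicted boundary
        segments.append(" ".join(current))
    return segments
```

A toy boundary scorer that fires on sentence-final punctuation already shows the effect: pause-based utterance breaks disappear and sentence-like units come out instead.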

Incorporating Terminology Constraints in Automatic Post-Editing

1 code implementation • WMT (EMNLP) 2020 • David Wan, Chris Kedzie, Faisal Ladhak, Marine Carpuat, Kathleen McKeown

In this paper, we present both autoregressive and non-autoregressive models for lexically constrained APE, demonstrating that our approach enables preservation of 95% of the terminologies and also improves translation quality on English-German benchmarks.

Automatic Post-Editing • Data Augmentation +1
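The 95% figure in the abstract implies a terminology-preservation metric: the fraction of required target terms that actually appear in the post-edited output. A minimal sketch of such a metric, with substring matching as an assumed simplification of whatever tokenization the paper's evaluation uses:

```python
def term_preservation_rate(outputs, constraints):
    """Fraction of required terminology constraints preserved.

    outputs: list of post-edited sentences.
    constraints: per-sentence lists of target-language terms that the
    lexically constrained APE model is required to produce.
    """
    required = kept = 0
    for out, terms in zip(outputs, constraints):
        lowered = out.lower()
        for term in terms:
            required += 1
            if term.lower() in lowered:  # case-insensitive match
                kept += 1
    return kept / required if required else 1.0
```

Running it over a corpus gives the single preservation percentage that the abstract reports.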

Subtitles to Segmentation: Improving Low-Resource Speech-to-Text Translation Pipelines

no code implementations • 19 Oct 2020 • David Wan, Zhengping Jiang, Chris Kedzie, Elsbeth Turcan, Peter Bell, Kathleen McKeown

In this work, we focus on improving ASR output segmentation in the context of low-resource language speech-to-text translation.

Information Retrieval • POS +3
