Search Results for author: Prajit Ramachandran

Found 8 papers, 0 papers with code

Unsupervised Pretraining for Sequence to Sequence Learning

no code implementations EMNLP 2017 Prajit Ramachandran, Peter Liu, Quoc Le

We apply this method to challenging benchmarks in machine translation and abstractive summarization and find that it significantly improves the subsequent supervised models.

Abstractive Text Summarization, Language Modelling +2
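The core idea of the paper above is to initialize the encoder and decoder of a seq2seq model with the weights of language models pretrained on unlabeled source-side and target-side text, then fine-tune on the supervised task. A minimal sketch of that initialization step, using toy NumPy weight dictionaries (all names and shapes here are illustrative, not the paper's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_lm(vocab, hidden):
    # Stand-in for a language model "pretrained" on unlabeled text:
    # an embedding table plus a recurrent weight matrix.
    return {
        "embed": rng.normal(size=(vocab, hidden)),
        "recur": rng.normal(size=(hidden, hidden)),
    }

def build_seq2seq(src_lm, tgt_lm, hidden):
    # The pretraining trick: the encoder starts from a source-side LM
    # and the decoder from a target-side LM, instead of random weights.
    return {
        "encoder": {k: v.copy() for k, v in src_lm.items()},
        "decoder": {k: v.copy() for k, v in tgt_lm.items()},
        # Layers with no pretrained counterpart (e.g. attention) start random.
        "attention": rng.normal(size=(hidden, hidden)),
    }

src_lm = init_lm(vocab=1000, hidden=8)
tgt_lm = init_lm(vocab=1000, hidden=8)
model = build_seq2seq(src_lm, tgt_lm, hidden=8)

# Encoder and decoder weights match their pretrained LMs at initialization.
print(np.allclose(model["encoder"]["recur"], src_lm["recur"]))  # True
print(np.allclose(model["decoder"]["embed"], tgt_lm["embed"]))  # True
```

After this initialization, the full model would be fine-tuned end-to-end on the labeled parallel data; the paper reports that this pretraining significantly improves the downstream supervised models.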
