Unsupervised Sentence Compression

2 papers with code • 0 benchmarks • 0 datasets

Producing a shorter sentence by removing redundant information, preserving the grammaticality and the important content of the original sentence, without supervision. (Source: nlpprogress.com)

Greatest papers with code

SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression

cbaziotis/seq3 7 Apr 2019

The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.

Language Modelling Unsupervised Sentence Compression

Unsupervised Sentence Compression using Denoising Auto-Encoders

zphang/usc_dae CoNLL 2018

In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.

Denoising Unsupervised Sentence Compression