Unsupervised Sentence Compression

3 papers with code • 0 benchmarks • 0 datasets

Producing a shorter sentence by removing redundant information while preserving the grammaticality and the important content of the original sentence, without supervision. (Source: nlpprogress.com)
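
As a rough illustration of how extractive variants of this task are commonly framed (not taken from any specific paper below), compression can be cast as a per-token keep/delete decision. The sketch below uses a hand-written mask purely for illustration; the papers listed here learn such decisions, or generate abstractive compressions, without supervised sentence/compression pairs.

```python
# Minimal sketch of extractive sentence compression as token deletion.
# The keep-mask here is hand-written; unsupervised methods learn it.

def compress(tokens: list[str], keep: list[bool]) -> str:
    """Apply a binary keep/delete mask to a tokenized sentence."""
    assert len(tokens) == len(keep)
    return " ".join(t for t, k in zip(tokens, keep) if k)

sentence = "The quick brown fox quickly jumped over the extremely lazy dog".split()
mask = [True, False, False, True, False, True, True, True, False, False, True]
print(compress(sentence, mask))  # -> "The fox jumped over the dog"
```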

Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning

complementizer/rl-sentence-compression ACL 2022

Sentence compression reduces the length of text by removing non-essential content while preserving important facts and grammaticality.

29 stars • 17 May 2022
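
The linked repository contains the paper's full training setup; what follows is only a minimal, self-contained sketch of the general idea of fine-tuning a transformer encoder with REINFORCE to emit per-token keep/delete decisions. The reward function here (a crude target-length penalty) is a hypothetical stand-in, not the paper's reward, which combines fluency and faithfulness signals.

```python
import torch
import torch.nn as nn

# Hypothetical toy setup: a small transformer encoder scores each token
# with a keep probability; REINFORCE pushes those probabilities toward
# masks that score well under an unsupervised reward.
VOCAB, DIM, MAXLEN = 1000, 64, 32

class DeleteScorer(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(DIM, 1)

    def forward(self, ids):                      # ids: (batch, seq)
        h = self.encoder(self.embed(ids))
        return torch.sigmoid(self.head(h)).squeeze(-1)  # keep-probs

def reward(mask, target_ratio=0.5):
    # Placeholder reward: prefer compressions near a target length.
    ratio = mask.float().mean(dim=1)
    return -(ratio - target_ratio).abs()

model = DeleteScorer()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
ids = torch.randint(0, VOCAB, (8, MAXLEN))       # stand-in batch

probs = model(ids)
dist = torch.distributions.Bernoulli(probs)
mask = dist.sample()                             # sampled keep/delete mask
loss = -(reward(mask).detach().unsqueeze(1) * dist.log_prob(mask)).mean()
opt.zero_grad(); loss.backward(); opt.step()
```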

SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression

cbaziotis/seq3 NAACL 2019

The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.

124 stars • 01 Jun 2019
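
SEQ^3 chains a compressor and a reconstructor and keeps the discrete summary differentiable. The sketch below is a heavily simplified, hypothetical rendering of that compress-then-reconstruct loop using the straight-through Gumbel-softmax relaxation; the actual model adds language-model priors, topic losses, and length control on top of this skeleton.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, SRC_LEN, SUM_LEN = 1000, 64, 20, 8

class Seq3Toy(nn.Module):
    """Compressor -> discrete summary -> reconstructor, trained end to end."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.compress = nn.GRU(DIM, DIM, batch_first=True)
        self.to_vocab = nn.Linear(DIM, VOCAB)
        self.reconstruct = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, ids):
        h, _ = self.compress(self.embed(ids))
        # Treat the first SUM_LEN states as summary positions and sample
        # words with straight-through Gumbel-softmax, so gradients flow
        # through the discrete bottleneck.
        logits = self.to_vocab(h[:, :SUM_LEN])
        summary = F.gumbel_softmax(logits, tau=1.0, hard=True)  # one-hot
        sum_emb = summary @ self.embed.weight                   # soft lookup
        r, _ = self.reconstruct(sum_emb)
        # Broadcast summary states back to source length (toy decoder).
        r = r.mean(dim=1, keepdim=True).expand(-1, ids.size(1), -1)
        return self.out(r)

model = Seq3Toy()
ids = torch.randint(0, VOCAB, (4, SRC_LEN))
logits = model(ids)
loss = F.cross_entropy(logits.reshape(-1, VOCAB), ids.reshape(-1))
loss.backward()  # trains both halves from reconstruction alone
```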

Unsupervised Sentence Compression using Denoising Auto-Encoders

zphang/usc_dae CoNLL 2018

In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.

47 stars • 07 Sep 2018
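
The denoising auto-encoder approach sidesteps the need for such parallel corpora by manufacturing its own training pairs: each original sentence becomes the target, and a corrupted, lengthened version becomes the input. Below is a hypothetical sketch of such a noising function (word insertion plus local shuffling, roughly in the spirit of the paper); the exact corruption scheme and its parameters are illustrative assumptions.

```python
import random

def add_noise(tokens, filler_vocab, insert_ratio=0.4, shuffle_window=3):
    """Corrupt a sentence into a longer, partly shuffled version.

    The (noisy, original) pair then trains a seq2seq model to 'denoise',
    i.e. to compress, without any human-written compressions.
    """
    noisy = list(tokens)
    # Additive noise: insert words sampled from other sentences' vocabulary.
    for _ in range(int(len(tokens) * insert_ratio)):
        pos = random.randrange(len(noisy) + 1)
        noisy.insert(pos, random.choice(filler_vocab))
    # Order noise: shuffle within small local windows.
    for i in range(0, len(noisy), shuffle_window):
        window = noisy[i:i + shuffle_window]
        random.shuffle(window)
        noisy[i:i + shuffle_window] = window
    return noisy

original = "the cat sat on the mat".split()
fillers = ["green", "quickly", "bank", "seven"]
print(add_noise(original, fillers), "->", original)
```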