Unsupervised Sentence Compression
4 papers with code • 0 benchmarks • 0 datasets
Producing a shorter sentence by removing redundant information, preserving the grammaticality and the important content of the original sentence, without supervision. (Source: nlpprogress.com)
Most implemented papers
Unsupervised Sentence Compression using Denoising Auto-Encoders
In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.
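This paper instead trains a denoising autoencoder: a sentence is corrupted with additive noise (extra words inserted, word order lightly perturbed) and the model learns to recover the original, so deleting superfluous words becomes the learned behavior. Below is a minimal sketch of such a noising function; the parameter names (`extra_vocab`, `p_insert`, `shuffle_window`) are illustrative choices for this sketch, not taken from the paper, and the paper's exact noising procedure differs in detail.

```python
import random

def add_noise(tokens, extra_vocab, p_insert=0.3, shuffle_window=3):
    """Corrupt a sentence with additive noise so a denoising autoencoder
    must learn to drop extraneous words and restore word order.
    Illustrative sketch only, not the paper's exact procedure."""
    noisy = list(tokens)
    # Insert filler words sampled (without replacement) from extra_vocab.
    n_insert = max(1, int(len(tokens) * p_insert))
    for word in random.sample(extra_vocab, k=min(n_insert, len(extra_vocab))):
        noisy.insert(random.randrange(len(noisy) + 1), word)
    # Lightly shuffle: each token moves at most ~shuffle_window positions.
    noisy = [t for _, t in sorted(
        enumerate(noisy),
        key=lambda x: x[0] + random.uniform(0, shuffle_window))]
    return noisy

sentence = "the cat sat on the mat".split()
print(add_noise(sentence, extra_vocab=["very", "indeed", "quite", "also"]))
```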
SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression
The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.
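SEQ^3 chains a compressor and a reconstructor, and keeps the discrete compressed sentence trainable end to end via straight-through Gumbel-softmax sampling. The sketch below shows that sampling trick in isolation with toy shapes; the full model additionally uses recurrent encoder-decoders, a language-model prior, and length penalties.

```python
import torch
import torch.nn.functional as F

vocab_size, emb_dim = 10, 4
logits = torch.randn(1, vocab_size, requires_grad=True)  # one decoding step

# Forward pass: a hard one-hot "word"; backward pass: soft and differentiable.
one_hot = F.gumbel_softmax(logits, tau=0.5, hard=True)

# The reconstructor consumes the sampled word as an embedding mix, so a
# reconstruction loss can send gradients back into the compressor's logits.
embedding = torch.nn.Embedding(vocab_size, emb_dim)
token_vec = one_hot @ embedding.weight   # (1, emb_dim)
loss = token_vec.pow(2).sum()            # stand-in for a reconstruction loss
loss.backward()
print(logits.grad is not None)           # True: end-to-end differentiable
```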
Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning
Sentence compression reduces the length of text by removing non-essential content while preserving important facts and grammaticality.
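This approach trains a transformer to label each token keep/delete and fine-tunes it with policy-gradient rewards covering fluency, faithfulness, and compression ratio. The toy REINFORCE loop below shows the shape of such an update under simplifying assumptions: a single ratio-matching reward stands in for the paper's richer reward mix, and all names and hyperparameters are illustrative.

```python
import torch

torch.manual_seed(0)
scores = torch.nn.Parameter(torch.zeros(8))   # one keep-logit per token
optimizer = torch.optim.Adam([scores], lr=0.1)
target_ratio = 0.5                            # assumed target compression ratio

for _ in range(200):
    # Clamp to keep log() numerically safe in this toy setup.
    probs = torch.sigmoid(scores).clamp(1e-6, 1 - 1e-6)
    mask = torch.bernoulli(probs)             # sample keep(1)/delete(0) per token
    log_prob = (mask * probs.log()
                + (1 - mask) * (1 - probs).log()).sum()
    reward = -(mask.mean() - target_ratio).abs()  # toy reward: hit the ratio
    loss = -reward * log_prob                 # REINFORCE: raise p of good masks
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(torch.sigmoid(scores))                  # learned keep probabilities
```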