Sentence Compression
22 papers with code • 1 benchmark • 2 datasets
Sentence Compression is the task of reducing the length of text by removing non-essential content while preserving important facts and grammaticality.
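Most of the papers below treat compression extractively: a model predicts a binary keep/delete label for each token, and the compression is the kept tokens in their original order. A minimal sketch of that setup, with a hand-written mask standing in for a trained model's predictions:

```python
# Deletion-based (extractive) sentence compression, illustrated.
# In practice a trained model predicts the keep/delete mask; here the
# mask is hand-written purely for illustration.

def compress(tokens, keep_mask):
    """Return the tokens whose mask entry is truthy, preserving order."""
    return [tok for tok, keep in zip(tokens, keep_mask) if keep]

sentence = "The company, which was founded in 1998, reported record profits".split()
# Hypothetical model output: keep the main clause, drop the relative clause.
mask = [1, 1, 0, 0, 0, 0, 0, 1, 1, 1]
print(" ".join(compress(sentence, mask)))
# → "The company, reported record profits"
```

Abstractive approaches (e.g. SEQ^3 below) instead generate a new, shorter sentence rather than selecting a subsequence.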
Most implemented papers
SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression
The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.
Explicit Sentence Compression for Neural Machine Translation
In this paper, we propose an explicit sentence compression method to enhance the source sentence representation for NMT.
Syntactically Look-Ahead Attention Network for Sentence Compression
Sentence compression is the task of compressing a long sentence into a short one by deleting redundant words.
SCAR: Sentence Compression using Autoencoders for Reconstruction
The compressor masks the input, and the reconstructor tries to regenerate it.
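The compressor/reconstructor loop can be sketched with toy stand-ins (both functions below are hypothetical placeholders, not SCAR's actual models): the compressor emits a keep/drop mask, the reconstructor sees only the kept tokens, and the reconstruction score is the training signal that pushes the compressor to keep the informative tokens.

```python
# Toy sketch of a mask-then-reconstruct objective (hypothetical components).

def compressor(tokens):
    # Stand-in policy: keep every other token; a real model learns the mask.
    return [i % 2 == 0 for i in range(len(tokens))]

def reconstructor(kept):
    # Stand-in: echoes the kept tokens; a real model generates the missing ones.
    return kept

def reconstruction_score(original, rebuilt):
    # Fraction of original tokens recovered — the reward for both components.
    return sum(tok in rebuilt for tok in original) / len(original)

tokens = "the cat sat on the mat".split()
kept = [t for t, k in zip(tokens, compressor(tokens)) if k]
print(kept, reconstruction_score(tokens, reconstructor(kept)))
# → ['the', 'sat', 'the'] 0.5
```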
Evaluation Discrepancy Discovery: A Sentence Compression Case-study
Reliable evaluation protocols are of utmost importance for reproducible NLP research.
Non-Autoregressive Text Generation with Pre-trained Language Models
In this work, we show that BERT can be employed as the backbone of a NAG model to greatly improve performance.
Contextualized Semantic Distance between Highly Overlapped Texts
Overlapping frequently occurs in paired texts in natural language processing tasks like text editing and semantic similarity evaluation.
A Novel Metric for Evaluating Semantics Preservation
By exploiting the properties of NDD, we implement an unsupervised and even training-free algorithm for extractive sentence compression.
Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning
Sentence compression reduces the length of text by removing non-essential content while preserving important facts and grammaticality.