Sentence Compression

22 papers with code • 1 benchmark • 2 datasets

Sentence Compression is the task of reducing the length of text by removing non-essential content while preserving important facts and grammaticality.
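
As a concrete illustration of the deletion-based (extractive) variant, the toy sketch below shows how per-token keep/drop decisions yield a compression. The sentence and labels are hand-written for illustration, not the output of any model.

```python
# Toy deletion-based sentence compression: a binary keep/drop label per
# token produces the shorter sentence. Labels are hand-written examples.
tokens = ("The court ruled on Tuesday that "
          "the controversial law is unconstitutional").split()
keep = [1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1]  # 1 = keep, 0 = drop

compressed = " ".join(t for t, k in zip(tokens, keep) if k)
print(compressed)  # The court ruled that the law is unconstitutional
```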

Most implemented papers

SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression

cbaziotis/seq3 NAACL 2019

The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.

Explicit Sentence Compression for Neural Machine Translation

bcmi220/esc4nmt 27 Dec 2019

In this paper, we propose an explicit sentence compression method to enhance the source sentence representation for NMT.

Syntactically Look-Ahead Attention Network for Sentence Compression

kamigaito/SLAHAN 4 Feb 2020

Sentence compression is the task of compressing a long sentence into a short one by deleting redundant words.

SCAR: Sentence Compression using Autoencoders for Reconstruction

m-chanakya/scar ACL 2020

The compressor masks the input, and the reconstructor tries to regenerate it.
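
As a hedged sketch of that loop, the snippet below drops some tokens, replaces them with [MASK], and lets an off-the-shelf masked LM play the reconstructor. The keep decisions are hypothetical, and pretrained BERT is only a stand-in: the paper trains its compressor and reconstructor jointly.

```python
# Compress-then-reconstruct sketch: mask the dropped tokens and check how
# well a pretrained masked LM (a stand-in reconstructor) recovers them.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tok = BertTokenizer.from_pretrained("bert-base-uncased")
mlm = BertForMaskedLM.from_pretrained("bert-base-uncased").eval()

words = "the storm forced officials to close the damaged bridge".split()
keep = [1, 1, 0, 1, 1, 1, 1, 0, 1]  # hypothetical compressor decisions

masked = " ".join(w if k else tok.mask_token for w, k in zip(words, keep))
enc = tok(masked, return_tensors="pt")
with torch.no_grad():
    logits = mlm(**enc).logits[0]

# Inspect the reconstructor's guesses for the dropped words.
mask_positions = (enc["input_ids"][0] == tok.mask_token_id).nonzero(as_tuple=True)[0]
for pos in mask_positions:
    print(tok.decode(logits[pos].argmax().item()))
```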

Evaluation Discrepancy Discovery: A Sentence Compression Case-study

UKPLab/arxiv2021-evaluation-discrepancy-nsc 22 Jan 2021

Reliable evaluation protocols are of utmost importance for reproducible NLP research.

Non-Autoregressive Text Generation with Pre-trained Language Models

yxuansu/NAG-BERT EACL 2021

In this work, we show that BERT can be employed as the backbone of a non-autoregressive generation (NAG) model to greatly improve performance.

Contextualized Semantic Distance between Highly Overlapped Texts

Stareru/NeighboringDistributionDivergence 4 Oct 2021

Overlap frequently occurs in paired texts in natural language processing tasks such as text editing and semantic similarity evaluation.

A Novel Metric for Evaluating Semantics Preservation

Stareru/NeighboringDistributionDivergence ACL ARR October 2021

By exploiting the properties of the neighboring distribution divergence (NDD) metric, we implement an unsupervised and even training-free algorithm for extractive sentence compression.
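
Purely as a hedged illustration of the training-free greedy-deletion idea, the sketch below scores candidate deletions with a simple masked-LM pseudo-log-likelihood rather than the NDD metric the paper defines; see the repository for the actual algorithm.

```python
# Training-free greedy compression sketch: at each step, try deleting each
# word and keep the deletion that least hurts a masked-LM fluency score.
# The score is a pseudo-log-likelihood stand-in, NOT the paper's NDD.
import torch
import torch.nn.functional as F
from transformers import BertForMaskedLM, BertTokenizer

tok = BertTokenizer.from_pretrained("bert-base-uncased")
mlm = BertForMaskedLM.from_pretrained("bert-base-uncased").eval()

def pll(words):
    """Average pseudo-log-likelihood: mask each token in turn and score it.
    Slow (one forward pass per token), but fine for a small demo."""
    ids = tok(" ".join(words), return_tensors="pt")["input_ids"]
    total, n = 0.0, 0
    for i in range(1, ids.size(1) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[0, i] = tok.mask_token_id
        with torch.no_grad():
            logits = mlm(input_ids=masked).logits
        total += F.log_softmax(logits[0, i], dim=-1)[ids[0, i]].item()
        n += 1
    return total / n

def compress(words, steps=3):
    for _ in range(steps):
        candidates = [words[:i] + words[i + 1:] for i in range(len(words))]
        words = max(candidates, key=pll)  # greedy: least-damaging deletion
        print(" ".join(words))
    return words

compress("the court ruled on tuesday that the law is unconstitutional".split())
```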

Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning

complementizer/rl-sentence-compression ACL 2022

Sentence compression reduces the length of text by removing non-essential content while preserving important facts and grammaticality.