Sentence Compression

20 papers with code • 1 benchmark • 2 datasets

Sentence compression produces a shorter sentence by removing redundant information while preserving the grammaticality and the important content of the original sentence.
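To make the task concrete, here is a minimal toy sketch of deletion-based compression using two hand-written rules (dropping parentheticals and one comma-delimited aside). This is an illustrative heuristic only, not the method of any paper listed below:

```python
import re

def compress(sentence: str) -> str:
    """Toy deletion-based compressor: drop parentheticals and one
    comma-delimited appositive-like aside (a crude heuristic)."""
    # Remove content in parentheses, e.g. "(on Tuesday)".
    sentence = re.sub(r"\s*\([^)]*\)", "", sentence)
    # Remove one comma-delimited aside, e.g. ", a search company,".
    sentence = re.sub(r",[^,]*,", "", sentence, count=1)
    # Normalize whitespace left behind by the deletions.
    return re.sub(r"\s+", " ", sentence).strip()

original = "Google, a search company, announced (on Tuesday) a new model."
print(compress(original))  # Google announced a new model.
```

Real systems replace these hand-written rules with learned models (syntactic, neural, or graph-based), as in the papers below.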

Most implemented papers

Unsupervised Abstractive Meeting Summarization with Multi-Sentence Compression and Budgeted Submodular Maximization

dascim/acl2018_abssumm ACL 2018

We introduce a novel graph-based framework for abstractive meeting speech summarization that is fully unsupervised and does not rely on any annotations.

Globally Normalized Transition-Based Neural Networks

tensorflow/models ACL 2016

Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models.

Sentence Simplification with Deep Reinforcement Learning

XingxingZhang/dress EMNLP 2017

Sentence simplification aims to make sentences easier to read and understand.

Combining Graph Degeneracy and Submodularity for Unsupervised Extractive Summarization

Tixierae/EMNLP2017_NewSum WS 2017

We present a fully unsupervised, extractive text summarization system that leverages a submodularity framework introduced by past research.

Learning How to Simplify From Explicit Labeling of Complex-Simplified Text Pairs

ghpaetzold/massalign IJCNLP 2017

Current research in text simplification has been hampered by two central problems: (i) the small amount of high-quality parallel simplification data available, and (ii) the lack of explicit annotations of simplification operations, such as deletions or substitutions, on existing data.

Sequence-to-sequence Models for Cache Transition Systems

xiaochang13/CacheTransition-Seq2seq ACL 2018

In this paper, we present a sequence-to-sequence based approach for mapping natural language sentences to AMR semantic graphs.

Unsupervised Semantic Abstractive Summarization

shibhansh/Unsupervised-SAS ACL 2018

Automatic abstractive summary generation remains a significant open problem for natural language processing.

Unsupervised Sentence Compression using Denoising Auto-Encoders

zphang/usc_dae CoNLL 2018

In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.

Sentence Compression for Arbitrary Languages via Multilingual Pivoting

Jmallins/MOSS EMNLP 2018

In this paper we advocate the use of bilingual corpora which are abundantly available for training sentence compression models.