
Text Summarization using textRank

53 papers with code · Natural Language Processing

Text summarization is the task of distilling the noteworthy information in a document to produce an abridged version of it.
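Since the task name on this page refers to TextRank, the following is a minimal sketch of extractive TextRank summarization: build a sentence-similarity graph and score sentences with PageRank. It assumes networkx is available, uses deliberately naive sentence splitting and word-overlap similarity, and is an illustration only, not code from any repository listed below.

import math
import networkx as nx

def textrank_summarize(text, num_sentences=2):
    # Naive sentence segmentation on periods (a real system would use a proper tokenizer).
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    tokens = [set(s.lower().split()) for s in sentences]

    # Sentence-similarity graph: edge weight is word overlap normalized by sentence length,
    # in the spirit of the original TextRank similarity measure.
    graph = nx.Graph()
    graph.add_nodes_from(range(len(sentences)))
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            overlap = len(tokens[i] & tokens[j])
            if overlap:
                norm = math.log(len(tokens[i]) + 1) + math.log(len(tokens[j]) + 1)
                graph.add_edge(i, j, weight=overlap / norm)

    # Score sentences with PageRank and keep the top ones in document order.
    scores = nx.pagerank(graph, weight="weight")
    top = sorted(sorted(scores, key=scores.get, reverse=True)[:num_sentences])
    return ". ".join(sentences[i] for i in top) + "."

print(textrank_summarize(
    "TextRank builds a graph over sentences. Edges encode lexical overlap. "
    "PageRank scores each sentence. The highest-scoring sentences form the summary."
))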

State-of-the-art leaderboards

Greatest papers with code

Levenshtein Transformer

27 May 2019 · pytorch/fairseq

We further confirm the flexibility of our model by showing a Levenshtein Transformer trained by machine translation can straightforwardly be used for automatic post-editing.

AUTOMATIC POST-EDITING · MACHINE TRANSLATION · TEXT SUMMARIZATION USING TEXTRANK

Get To The Point: Summarization with Pointer-Generator Networks

ACL 2017 · abisee/pointer-generator

Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text).

ABSTRACTIVE TEXT SUMMARIZATION
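The pointer-generator idea from the paper above can be summarized in one step: the final word distribution is a mixture of generating from the vocabulary and copying source tokens through the attention weights. The sketch below illustrates that mixture with illustrative tensor names and toy shapes; it is an assumption-laden toy, not the abisee/pointer-generator implementation.

import torch
import torch.nn.functional as F

def pointer_generator_step(vocab_logits, attn_weights, src_token_ids, p_gen):
    # vocab_logits:  (batch, vocab_size) decoder scores over the fixed vocabulary
    # attn_weights:  (batch, src_len)    attention over source positions (rows sum to 1)
    # src_token_ids: (batch, src_len)    vocabulary ids of the source tokens
    # p_gen:         (batch, 1)          probability of generating rather than copying
    p_vocab = F.softmax(vocab_logits, dim=-1)
    # Scatter-add the copy probabilities onto the vocabulary positions of the source tokens.
    copy_dist = torch.zeros_like(p_vocab).scatter_add_(1, src_token_ids, attn_weights)
    return p_gen * p_vocab + (1.0 - p_gen) * copy_dist

batch, vocab_size, src_len = 2, 10, 4
dist = pointer_generator_step(
    torch.randn(batch, vocab_size),
    F.softmax(torch.randn(batch, src_len), dim=-1),
    torch.randint(0, vocab_size, (batch, src_len)),
    torch.sigmoid(torch.randn(batch, 1)),
)
print(dist.sum(dim=-1))  # each row sums to roughly 1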

Neural Abstractive Text Summarization with Sequence-to-Sequence Models

5 Dec 2018 · shibing624/pycorrector

As part of this survey, we also develop an open source library, namely the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization.

ABSTRACTIVE TEXT SUMMARIZATION · LANGUAGE MODELLING · MACHINE TRANSLATION

MASS: Masked Sequence to Sequence Pre-training for Language Generation

7 May 2019 · microsoft/MASS

Pre-training and fine-tuning, e.g., BERT, have achieved great success in language understanding by transferring knowledge from a rich-resource pre-training task to low/zero-resource downstream tasks.

TEXT GENERATION · TEXT SUMMARIZATION USING TEXTRANK · UNSUPERVISED MACHINE TRANSLATION

Deep Reinforcement Learning For Sequence to Sequence Models

24 May 2018 · yaserkl/RLSeq2Seq

In this survey, we consider seq2seq problems from the RL point of view and provide a formulation combining the power of RL methods in decision-making with the ability of sequence-to-sequence models to retain long-term memory.

ABSTRACTIVE TEXT SUMMARIZATION · DECISION MAKING · MACHINE TRANSLATION

Text Summarization with Pretrained Encoders

22 Aug 2019 · nlpyang/PreSumm

For abstractive summarization, we propose a new fine-tuning schedule which adopts different optimizers for the encoder and the decoder as a means of alleviating the mismatch between the two (the former is pretrained while the latter is not).

SOTA for Extractive Document Summarization on CNN / Daily Mail (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION · DOCUMENT SUMMARIZATION · EXTRACTIVE DOCUMENT SUMMARIZATION
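The fine-tuning schedule described in the paper above boils down to giving the pretrained encoder and the untrained decoder separate optimizers. A hedged sketch, with stand-in modules and made-up learning rates rather than the exact nlpyang/PreSumm settings:

import torch

encoder = torch.nn.Linear(768, 768)  # stand-in for a pretrained BERT encoder
decoder = torch.nn.Linear(768, 768)  # stand-in for a randomly initialized decoder

# Smaller learning rate for the pretrained encoder, larger one for the new decoder.
enc_opt = torch.optim.Adam(encoder.parameters(), lr=2e-5)
dec_opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)

def training_step(loss):
    enc_opt.zero_grad()
    dec_opt.zero_grad()
    loss.backward()
    enc_opt.step()  # each parameter group is updated by its own optimizer
    dec_opt.step()

x = torch.randn(4, 768)
training_step(decoder(encoder(x)).pow(2).mean())  # toy loss just to exercise the step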

Unified Language Model Pre-training for Natural Language Understanding and Generation

8 May 2019 · microsoft/unilm

This paper presents a new Unified pre-trained Language Model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks.

SOTA for Text Summarization using textRank on GigaWord (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION · DOCUMENT SUMMARIZATION · LANGUAGE MODELLING · QUESTION ANSWERING · QUESTION GENERATION · TEXT GENERATION

A Regularized Framework for Sparse and Structured Neural Attention

NeurIPS 2017 · vene/sparse-structured-attention

Modern neural networks are often augmented with an attention mechanism, which tells the network where to focus within the input.

MACHINE TRANSLATION · NATURAL LANGUAGE INFERENCE · TEXT SUMMARIZATION USING TEXTRANK
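For contrast with the sparse and structured alternatives the paper above proposes (e.g. sparsemax), here is plain dense softmax attention, where every input position receives some probability mass. This is a generic sketch, not the vene/sparse-structured-attention API.

import torch
import torch.nn.functional as F

def attend(query, keys, values):
    # query: (d,), keys and values: (n, d) -> attention-weighted average of values
    scores = keys @ query / keys.shape[-1] ** 0.5
    weights = F.softmax(scores, dim=-1)  # dense: no weight is exactly zero
    return weights @ values, weights

context, weights = attend(torch.randn(8), torch.randn(5, 8), torch.randn(5, 8))
print(weights)  # swapping softmax for sparsemax would drive some of these to exactly 0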

Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

CoNLL 2016 · theamrzaki/text_summurization_abstractive_methods

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora.

ABSTRACTIVE TEXT SUMMARIZATION

WikiHow: A Large Scale Text Summarization Dataset

18 Oct 2018 · mahnazkoupaee/WikiHow-Dataset

Sequence-to-sequence models have recently achieved state-of-the-art performance in summarization.

TEXT SUMMARIZATION USING TEXTRANK