Extractive Summarization
85 papers with code • 0 benchmarks • 1 dataset
Benchmarks
These leaderboards are used to track progress in Extractive Summarization
Libraries
Use these libraries to find Extractive Summarization models and implementations

Most implemented papers
Fine-tune BERT for Extractive Summarization
BERT, a pre-trained Transformer model, has achieved ground-breaking performance on multiple NLP tasks.
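As a rough illustration of the fine-tuning idea (not the authors' BertSum code, which encodes the whole document with inserted [CLS] tokens), the sketch below scores each sentence independently with a linear layer on top of its BERT [CLS] embedding and keeps the top-scoring sentences. It assumes the torch and transformers packages are installed.

```python
# Minimal sketch: extractive summarization as per-sentence scoring with BERT.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class SentenceScorer(nn.Module):
    def __init__(self, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        # Use each sentence's [CLS] representation as its embedding.
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0, :]           # (num_sentences, hidden)
        return self.classifier(cls).squeeze(-1)        # one salience score per sentence

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
sentences = ["BERT is a pre-trained Transformer.", "It performs well on many NLP tasks."]
enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

model = SentenceScorer()
scores = model(enc["input_ids"], enc["attention_mask"])
# Select the top-k highest-scoring sentences as the extractive summary.
top_k = scores.topk(k=1).indices.tolist()
print([sentences[i] for i in top_k])
```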
Leveraging BERT for Extractive Text Summarization on Lectures
This paper reports on the project called Lecture Summarization Service, a Python-based RESTful service that utilizes the BERT model for text embeddings and K-Means clustering to identify the sentences closest to the centroids for summary selection.
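A minimal sketch of that centroid-based selection step, assuming the sentence-transformers and scikit-learn packages (the paper's service works with raw BERT embeddings; the model name below is just an illustrative stand-in):

```python
# Embed sentences, cluster them, and keep the sentence closest to each centroid.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

def centroid_summary(sentences, num_sentences=3):
    model = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = model.encode(sentences)                            # (n, dim)
    kmeans = KMeans(n_clusters=num_sentences, n_init=10).fit(embeddings)
    summary_ids = []
    for centroid in kmeans.cluster_centers_:
        # Pick the sentence whose embedding lies closest to this cluster centroid.
        distances = np.linalg.norm(embeddings - centroid, axis=1)
        summary_ids.append(int(distances.argmin()))
    # Preserve the original sentence order in the summary.
    return [sentences[i] for i in sorted(set(summary_ids))]
```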
SummaRuNNer: A Recurrent Neural Network based Sequence Model for Extractive Summarization of Documents
We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence model for extractive summarization of documents and show that it achieves performance better than or comparable to state-of-the-art.
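The sequence-labelling view can be sketched as an RNN over sentence embeddings that emits one keep/drop probability per sentence. This is a toy approximation assuming PyTorch, not the full SummaRuNNer model, which adds explicit content, salience, novelty and position terms.

```python
# Toy sketch: bidirectional GRU over sentence embeddings, one probability per sentence.
import torch
import torch.nn as nn

class ExtractiveRNN(nn.Module):
    def __init__(self, sent_dim=768, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(sent_dim, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, 1)

    def forward(self, sentence_embeddings):
        # sentence_embeddings: (batch, num_sentences, sent_dim)
        states, _ = self.rnn(sentence_embeddings)
        logits = self.classifier(states).squeeze(-1)     # (batch, num_sentences)
        return torch.sigmoid(logits)                     # P(sentence is in summary)

doc = torch.randn(1, 12, 768)        # 12 sentences with toy 768-d embeddings
probs = ExtractiveRNN()(doc)
summary_ids = [i for i, p in enumerate(probs[0].tolist()) if p > 0.5]
```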
Generating Wikipedia by Summarizing Long Sequences
We show that generating English Wikipedia articles can be approached as a multi-document summarization of source documents.
AREDSUM: Adaptive Redundancy-Aware Iterative Sentence Ranking for Extractive Document Summarization
Redundancy-aware extractive summarization systems score the redundancy of the sentences to be included in a summary either jointly with their salience information or separately as an additional sentence scoring step.
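The separate redundancy-scoring step can be illustrated with a greedy, MMR-style selection loop; AREDSUM learns these scores, so the hand-provided `salience`, `similarity` and `lam` inputs below are purely illustrative.

```python
# Greedy redundancy-aware selection: trade off salience against similarity
# to sentences already picked for the summary.
def iterative_select(salience, similarity, k=3, lam=0.7):
    """salience: length-n scores; similarity: n x n pairwise sentence similarity."""
    selected, candidates = [], list(range(len(salience)))
    while candidates and len(selected) < k:
        def score(i):
            # Penalise candidates that are too similar to already chosen sentences.
            redundancy = max((similarity[i][j] for j in selected), default=0.0)
            return lam * salience[i] - (1 - lam) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```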
Neural Summarization by Extracting Sentences and Words
Traditional approaches to extractive summarization rely heavily on human-engineered features.
Diversity driven Attention Model for Query-based Abstractive Summarization
Abstractive summarization aims to generate a shorter version of the document covering all the salient points in a compact and coherent fashion.
Extractive Summarization using Deep Learning
We explore various features to improve the set of sentences selected for the summary, and use a Restricted Boltzmann Machine to enhance and abstract those features, improving the resulting accuracy without losing important information.
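A loose sketch of that feature-enhancement idea, assuming scikit-learn's BernoulliRBM and toy placeholder features rather than the paper's exact feature set:

```python
# Pass hand-crafted sentence features through an RBM and rank sentences
# by the enhanced (hidden-unit) features.
import numpy as np
from sklearn.neural_network import BernoulliRBM

features = np.random.rand(20, 8)              # 20 sentences x 8 surface features in [0, 1]
rbm = BernoulliRBM(n_components=4, learning_rate=0.05, n_iter=50, random_state=0)
enhanced = rbm.fit_transform(features)        # hidden-unit activations as enhanced features
scores = enhanced.sum(axis=1)                 # simple salience score per sentence
summary_ids = np.argsort(scores)[-3:][::-1]   # pick the top-3 sentences
```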
Multi-News: a Large-Scale Multi-Document Summarization Dataset and Abstractive Hierarchical Model
Automatic generation of summaries from multiple news articles is a valuable tool as the number of online publications grows rapidly.
Self-Supervised Learning for Contextualized Extractive Summarization
Existing models for extractive summarization are usually trained from scratch with a cross-entropy loss, which does not explicitly capture the global context at the document level.
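For reference, the standard per-sentence cross-entropy objective the paper contrasts with looks roughly like the toy PyTorch sketch below, with oracle extraction labels and no document-level signal modelled.

```python
# Standard training signal: binary cross-entropy between sentence scores
# and oracle keep/drop labels.
import torch
import torch.nn as nn

logits = torch.randn(1, 12, requires_grad=True)  # scores for 12 sentences
labels = torch.zeros(1, 12)
labels[0, [2, 5]] = 1.0                          # oracle: sentences 2 and 5 are in the summary
loss = nn.BCEWithLogitsLoss()(logits, labels)    # no explicit global/document-level context
loss.backward()                                  # gradients would update the sentence scorer
```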