Headline Generation
23 papers with code • 1 benchmark • 1 dataset
Benchmarks
These leaderboards are used to track progress in Headline Generation
Most implemented papers
Deep Reinforcement Learning For Sequence to Sequence Models
In this survey, we consider seq2seq problems from the RL point of view and provide a formulation that combines the decision-making power of RL methods with sequence-to-sequence models' ability to retain long-range context.
Don't Just Scratch the Surface: Enhancing Word Representations for Korean with Hanja
We propose a simple yet effective approach for improving Korean word representations using additional linguistic annotation (i.e., Hanja).
IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation
The T5 model and its unified text-to-text paradigm contributed to advancing the state of the art for many natural language processing tasks.
Question Answering as an Automatic Evaluation Metric for News Article Summarization
Recent work in the field of automatic summarization and headline generation focuses on maximizing ROUGE scores for various news datasets.
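ROUGE measures n-gram overlap between a generated headline and a reference. As a hedged illustration of what these papers optimize, here is a minimal ROUGE-1 F1 computation; real evaluations use dedicated packages with stemming and multi-reference handling, so treat this as a toy sketch.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between a generated headline and a reference.
    Simplified sketch: whitespace tokenization, no stemming."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("stocks rally on strong earnings",
                "stocks rally after strong earnings report"))
```

The example headlines are invented; four of five candidate unigrams match the six-token reference, giving F1 = 8/11 ≈ 0.727.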
Advances of Transformer-Based Models for News Headline Generation
Pretrained language models based on the Transformer architecture are the reason for recent breakthroughs in many areas of NLP, including sentiment analysis, question answering, and named entity recognition.
Few-Shot Text Generation with Pattern-Exploiting Training
Providing pretrained language models with simple task descriptions in natural language enables them to solve some tasks in a fully unsupervised fashion.
Direct Output Connection for a High-Rank Language Model
This paper proposes a state-of-the-art recurrent neural network (RNN) language model that combines probability distributions computed not only from a final RNN layer but also from middle layers.
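The core idea — mixing next-word distributions computed from several layers rather than only the top one — can be sketched as a convex combination of per-layer softmaxes. The shapes and mixing weights below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def mixture_over_layers(layer_logits, weights):
    """Mix next-word distributions from several layers' vocabulary logits.
    Sketch only: each layer's hidden state is assumed already projected to
    vocabulary logits; weights are normalized to form a convex combination."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    dists = np.stack([softmax(l) for l in layer_logits])
    return weights @ dists  # still a valid probability distribution

rng = np.random.default_rng(0)
logits = [rng.normal(size=6) for _ in range(3)]  # final + two middle layers
p = mixture_over_layers(logits, [0.6, 0.25, 0.15])
print(p.sum())  # ≈ 1.0
```

Because each component is a probability distribution and the weights sum to one, the mixture sums to one by construction.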
Importance of Copying Mechanism for News Headline Generation
News headline generation is an essential text summarization problem because it is constrained and well-defined, yet still hard to solve.
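A copying mechanism lets the decoder reproduce rare source words (names, numbers) directly. As a generic pointer-generator style sketch — not this paper's exact formulation — the final distribution blends the vocabulary softmax with attention mass on source positions:

```python
import numpy as np

def copy_blend(p_vocab, attention, src_ids, p_gen):
    """P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on the
    source positions holding w. Toy sketch; vocabulary ids and sizes are
    illustrative, and out-of-vocabulary handling is omitted."""
    final = p_gen * np.asarray(p_vocab, dtype=float)
    for pos, wid in enumerate(src_ids):
        final[wid] += (1.0 - p_gen) * attention[pos]
    return final

p_vocab = np.full(5, 0.2)              # uniform toy vocabulary distribution
attention = np.array([0.5, 0.3, 0.2])  # attention over 3 source tokens
src_ids = [4, 1, 4]                    # source token ids (id 4 appears twice)
out = copy_blend(p_vocab, attention, src_ids, p_gen=0.7)
print(out.sum())  # blend of two distributions, still sums to 1
```

Token 4 receives both its generation probability and the attention mass from its two source occurrences, so copying boosts exactly the words present in the article.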
Data-efficient Neural Text Compression with Interactive Learning
Neural sequence-to-sequence models have been successfully applied to text compression.
This Email Could Save Your Life: Introducing the Task of Email Subject Line Generation
In this paper, we propose and study the task of email subject line generation: automatically generating an email subject line from the email body.