Informativeness
244 papers with code • 1 benchmark • 1 dataset
Most implemented papers
Learning to Navigate for Fine-grained Classification
Considering the intrinsic consistency between the informativeness of regions and their probability of being the ground-truth class, we design a novel training paradigm that enables the Navigator to detect the most informative regions under guidance from the Teacher.
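A minimal sketch of the ranking-consistency idea, not the authors' code: region informativeness scores predicted by a Navigator should be ordered the same way as a Teacher's confidence in the ground-truth class for the corresponding crops, which a pairwise hinge loss can encourage. Function name and margin are illustrative assumptions.

```python
import torch

def ranking_consistency_loss(informativeness: torch.Tensor,
                             teacher_confidence: torch.Tensor,
                             margin: float = 1.0) -> torch.Tensor:
    """Both inputs have shape (num_regions,); returns a scalar loss."""
    # pairwise differences: diff[i, j] = score_i - score_j
    info_diff = informativeness.unsqueeze(1) - informativeness.unsqueeze(0)
    conf_diff = teacher_confidence.unsqueeze(1) - teacher_confidence.unsqueeze(0)
    # for pairs where the Teacher is more confident about region i than region j,
    # penalise the Navigator unless it also ranks i above j by the margin
    mask = (conf_diff > 0).float()
    return (mask * torch.clamp(margin - info_diff, min=0)).sum() / mask.sum().clamp(min=1)

# toy usage with three candidate regions
scores = torch.tensor([0.2, 0.9, 0.4], requires_grad=True)
confs = torch.tensor([0.1, 0.8, 0.6])
ranking_consistency_loss(scores, confs).backward()
```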
NRPA: Neural Recommendation with Personalized Attention
In this paper we propose a neural recommendation approach with personalized attention to learn personalized representations of users and items from reviews.
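A minimal sketch of what "personalized attention" can look like, under the assumption that the attention query is derived from a user-ID embedding so different users weight the same review words differently; dimensions and layer names are illustrative, not the paper's released architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PersonalizedAttention(nn.Module):
    def __init__(self, num_users: int, hidden_dim: int, query_dim: int = 64):
        super().__init__()
        self.user_embed = nn.Embedding(num_users, query_dim)  # per-user query seed
        self.project = nn.Linear(hidden_dim, query_dim)        # word feature -> query space

    def forward(self, word_feats: torch.Tensor, user_ids: torch.Tensor) -> torch.Tensor:
        # word_feats: (batch, seq_len, hidden_dim); user_ids: (batch,)
        query = self.user_embed(user_ids)                      # (batch, query_dim)
        keys = torch.tanh(self.project(word_feats))            # (batch, seq_len, query_dim)
        scores = torch.einsum("bq,blq->bl", query, keys)       # user-specific word scores
        weights = F.softmax(scores, dim=-1)
        return torch.einsum("bl,bld->bd", weights, word_feats) # pooled review representation

# toy usage: 2 reviews of 5 words each, 32-dim word features
attn = PersonalizedAttention(num_users=100, hidden_dim=32)
pooled = attn(torch.randn(2, 5, 32), torch.tensor([3, 7]))
```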
Unsupervised Abstractive Meeting Summarization with Multi-Sentence Compression and Budgeted Submodular Maximization
We introduce a novel graph-based framework for abstractive meeting speech summarization that is fully unsupervised and does not rely on any annotations.
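A minimal sketch of budgeted submodular selection in this spirit: greedily pick sentences that add the most uncovered words per unit of cost until a word budget is exhausted. The coverage objective and the cost-scaled greedy rule are simplified stand-ins for the paper's formulation, not its actual objective.

```python
def budgeted_greedy(sentences, budget_words, r=1.0):
    """Greedy budgeted maximization of a simple word-coverage objective."""
    covered, summary, used = set(), [], 0
    remaining = list(range(len(sentences)))
    while remaining:
        def gain(i):
            words = set(sentences[i].lower().split())
            cost = max(len(words), 1)
            return len(words - covered) / (cost ** r)  # cost-scaled marginal gain
        best = max(remaining, key=gain)
        cost = len(set(sentences[best].lower().split()))
        if used + cost <= budget_words and gain(best) > 0:
            summary.append(sentences[best])
            covered |= set(sentences[best].lower().split())
            used += cost
        remaining.remove(best)
    return summary

print(budgeted_greedy(
    ["the team agreed on the budget",
     "the budget was discussed at length",
     "lunch options were debated"],
    budget_words=10))
```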
Generating Informative and Diverse Conversational Responses via Adversarial Information Maximization
Responses generated by neural conversational models tend to lack informativeness and diversity.
Large Language Models Are Human-Level Prompt Engineers
By conditioning on natural language instructions, large language models (LLMs) have displayed impressive capabilities as general-purpose computers.
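A minimal sketch of the search loop behind automatic instruction selection: propose candidate instructions, score each by how well a model handles held-out demonstrations under it, and keep the best one. `score_with_llm` is a hypothetical placeholder, not a real API; in practice it would query a language model and return, for example, the log-likelihood of the expected output.

```python
import random

def score_with_llm(instruction: str, demo_input: str, demo_output: str) -> float:
    # placeholder scorer: a real implementation would call an LLM here
    return random.random()

def select_best_instruction(candidates, demos):
    """Return the candidate instruction with the highest average demo score."""
    def avg_score(instr):
        return sum(score_with_llm(instr, x, y) for x, y in demos) / len(demos)
    return max(candidates, key=avg_score)

demos = [("hot", "cold"), ("tall", "short")]
candidates = ["Write the antonym of the word.", "Repeat the word."]
print(select_best_instruction(candidates, demos))
```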
Response Generation by Context-aware Prototype Editing
Open domain response generation has achieved remarkable progress in recent years, but sometimes yields short and uninformative responses.
BARTScore: Evaluating Generated Text as Text Generation
In this work, we conceptualize the evaluation of generated text as a text generation problem, modeled using pre-trained sequence-to-sequence models.
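A minimal sketch of evaluating generated text as conditional text generation in this spirit: score a hypothesis by its average token log-likelihood given the source under a pre-trained sequence-to-sequence model. The checkpoint name and normalization are illustrative assumptions, not the paper's exact configuration.

```python
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

model_name = "facebook/bart-large-cnn"  # assumption: any seq2seq checkpoint could be used
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name).eval()

def seq2seq_score(source: str, hypothesis: str) -> float:
    """Average log p(hypothesis tokens | source) under the model (higher is better)."""
    src = tokenizer(source, return_tensors="pt", truncation=True)
    tgt = tokenizer(hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(input_ids=src.input_ids,
                    attention_mask=src.attention_mask,
                    labels=tgt.input_ids)
    # out.loss is the mean token-level cross-entropy; negate to get log-likelihood
    return -out.loss.item()

print(seq2seq_score("The cat sat on the mat.", "A cat is sitting on a mat."))
```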
Dataset Distillation via Factorization
In this paper, we study dataset distillation (DD) from a novel perspective and introduce a dataset factorization approach, termed HaBa, which is a plug-and-play strategy portable to any existing DD baseline.
Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems
Natural language generation (NLG) is a critical component of spoken dialogue systems, with a significant impact on both usability and perceived quality.
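A minimal sketch of the idea behind the semantically conditioned LSTM: a dialogue-act (DA) vector is carried alongside the LSTM state and gradually consumed by a reading gate, steering generation toward the required slots. The layer sizes and exact gate wiring below are a simplified reconstruction under stated assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SCLSTMCell(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int, da_dim: int):
        super().__init__()
        self.gates = nn.Linear(input_dim + hidden_dim, 4 * hidden_dim)  # i, f, o, c~
        self.read_gate = nn.Linear(input_dim + hidden_dim, da_dim)      # reading gate r_t
        self.da_to_cell = nn.Linear(da_dim, hidden_dim, bias=False)     # injects DA info into cell

    def forward(self, x, h, c, d):
        z = torch.cat([x, h], dim=-1)
        i, f, o, c_hat = self.gates(z).chunk(4, dim=-1)
        i, f, o, c_hat = i.sigmoid(), f.sigmoid(), o.sigmoid(), c_hat.tanh()
        r = self.read_gate(z).sigmoid()   # how much of the DA vector to retain
        d = r * d                         # DA vector shrinks as slots are expressed
        c = f * c + i * c_hat + torch.tanh(self.da_to_cell(d))
        h = o * torch.tanh(c)
        return h, c, d

# toy step: batch of 2, 16-dim inputs, 32-dim hidden state, 8-dim dialogue-act vector
cell = SCLSTMCell(16, 32, 8)
h, c, d = torch.zeros(2, 32), torch.zeros(2, 32), torch.ones(2, 8)
h, c, d = cell(torch.randn(2, 16), h, c, d)
```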
Prosody Modifications for Question-Answering in Voice-Only Settings
Many popular digital assistant form factors, such as Amazon Echo, Apple HomePod, or Google Home, enable the user to hold a conversation with these systems based only on the speech modality.