Slot Filling
139 papers with code • 14 benchmarks • 26 datasets
The goal of Slot Filling is to identify, from a running dialog, the different slots that correspond to parameters of the user's query. For instance, when a user queries for nearby restaurants, key slots such as location and preferred food type are required for a dialog system to retrieve the appropriate information. The main challenge in the slot-filling task is therefore to extract these target entities.
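Slot filling is commonly cast as sequence labeling: each token in the utterance receives a BIO slot label. The sketch below is a minimal, illustrative gazetteer-based tagger — the slot names (`location`, `cuisine`) and term lists are hypothetical assumptions, not from any particular dataset; real systems learn these labels with neural models such as those listed on this page.

```python
# Minimal sketch of slot filling as BIO sequence tagging.
# Gazetteers and slot labels here are illustrative assumptions only.

LOCATION_TERMS = {"downtown", "midtown", "nearby"}
CUISINE_TERMS = {"italian", "thai", "sushi"}

def tag_slots(tokens):
    """Assign a BIO slot label to each token via simple gazetteer lookup."""
    labels = []
    for tok in tokens:
        low = tok.lower()
        if low in LOCATION_TERMS:
            labels.append("B-location")
        elif low in CUISINE_TERMS:
            labels.append("B-cuisine")
        else:
            labels.append("O")  # token fills no slot
    return labels

query = "find italian restaurants near downtown".split()
print(list(zip(query, tag_slots(query))))
```

A trained model replaces the lookup with per-token label predictions, but the input/output contract — tokens in, one slot label per token out — is the same.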
Source: Real-time On-Demand Crowd-powered Entity Extraction
Image credit: Robust Retrieval Augmented Generation for Zero-shot Slot Filling
Most implemented papers
BERT for Joint Intent Classification and Slot Filling
Intent classification and slot filling are two essential tasks for natural language understanding.
Learning End-to-End Goal-Oriented Dialog
We show similar result patterns on data extracted from an online concierge service.
Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling
Attention-based encoder-decoder neural network models have recently shown promising results in machine translation and speech recognition.
MASSIVE: A 1M-Example Multilingual Natural Language Understanding Dataset with 51 Typologically-Diverse Languages
We present the MASSIVE dataset--Multilingual Amazon Slu resource package (SLURP) for Slot-filling, Intent classification, and Virtual assistant Evaluation.
Data Programming: Creating Large Training Sets, Quickly
Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable.
Towards Scalable Multi-domain Conversational Agents: The Schema-Guided Dialogue Dataset
In this work, we introduce the Schema-Guided Dialogue (SGD) dataset, containing over 16k multi-domain conversations spanning 16 domains.
Learning Dense Representations of Phrases at Scale
Open-domain question answering can be reformulated as a phrase retrieval problem, without the need for processing documents on-demand during inference (Seo et al., 2019).
Efficient Sequence Transduction by Jointly Predicting Tokens and Durations
TDT models for Speech Recognition achieve better accuracy and up to 2.82x faster inference than conventional Transducers.
Joint Slot Filling and Intent Detection via Capsule Neural Networks
Being able to recognize words as slots and detect the intent of an utterance has been a keen issue in natural language understanding.
KILT: a Benchmark for Knowledge Intensive Language Tasks
We test both task-specific and general baselines, evaluating downstream performance in addition to the ability of the models to provide provenance.