Procedural Text Understanding
6 papers with code • 1 benchmark • 1 dataset
Most implemented papers
Tracking the World State with Recurrent Entity Networks
The EntNet sets a new state of the art on the bAbI tasks and is the first method to solve all the tasks in the 10k-training-example setting.
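At its core, EntNet maintains a fixed bank of memory blocks, each tied to a learned key, and gates each block's update on how strongly the current sentence relates to it. A minimal PyTorch sketch of that update, with illustrative sizes and parameter names of our choosing (not the authors' code):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntNetCell(nn.Module):
    """One EntNet-style memory update: block j with key w_j is gated on
    how much the current sentence encoding s talks about it."""
    def __init__(self, num_blocks: int, dim: int):
        super().__init__()
        self.keys = nn.Parameter(torch.randn(num_blocks, dim))  # w_j
        self.U = nn.Linear(dim, dim, bias=False)
        self.V = nn.Linear(dim, dim, bias=False)
        self.W = nn.Linear(dim, dim, bias=False)

    def forward(self, s: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # s: (batch, dim) sentence encoding; h: (batch, blocks, dim) memories
        s_ = s.unsqueeze(1)                                      # (batch, 1, dim)
        gate = torch.sigmoid((h * s_).sum(-1) + (self.keys * s_).sum(-1))
        cand = torch.relu(self.U(h) + self.V(self.keys) + self.W(s_))
        h = h + gate.unsqueeze(-1) * cand
        return F.normalize(h, dim=-1)  # renormalising doubles as forgetting

# usage: memories start as unit vectors and are updated sentence by sentence
cell = EntNetCell(num_blocks=20, dim=100)
h = F.normalize(torch.randn(2, 20, 100), dim=-1)
h = cell(torch.randn(2, 100), h)
```

The final normalisation is what lets old content decay: adding to a unit-norm vector and renormalising gradually washes out stale information.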
Query-Reduction Networks for Question Answering
In this paper, we study the problem of question answering when reasoning over multiple facts is required.
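The proposed model, the Query-Reduction Network (QRN), is a simplified recurrent unit that "reduces" the query into a progressively easier-to-answer form as each supporting sentence is read. A hedged sketch of a single layer, with the gate and reduction functions simplified to plain linear maps (the paper's exact parameterisation may differ):

```python
import torch
import torch.nn as nn

class QRNLayer(nn.Module):
    """A single QRN-style layer: read sentences one by one and keep a
    gated, gradually reduced form of the query."""
    def __init__(self, dim: int):
        super().__init__()
        self.alpha = nn.Linear(2 * dim, 1)    # update gate z_t
        self.rho = nn.Linear(2 * dim, dim)    # reduced-query candidate

    def forward(self, sentences: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # sentences: (batch, time, dim); query: (batch, dim)
        h = torch.zeros_like(query)
        for t in range(sentences.size(1)):
            xq = torch.cat([sentences[:, t], query], dim=-1)
            z = torch.sigmoid(self.alpha(xq))    # how relevant is sentence t?
            h_tilde = torch.tanh(self.rho(xq))   # candidate reduced query
            h = z * h_tilde + (1 - z) * h
        return h  # final reduced query, fed to an answer classifier
```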
Time-Stamped Language Model: Teaching Language Models to Understand the Flow of Events
This enables us to reuse pre-trained transformer-based language models from other QA benchmarks by adapting them to procedural text understanding.
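The time-stamping idea is to tag every token of the procedure as past, current, or future relative to the queried step, and to add that tag to the token embeddings so a standard QA model can condition on the step. A minimal sketch under those assumptions; the names and helper below are illustrative, not the authors' API:

```python
import torch
import torch.nn as nn

PAST, CURRENT, FUTURE = 0, 1, 2

def timestamp_ids(step_spans, query_step):
    """Tag every token as past/current/future w.r.t. the queried step.
    step_spans: list of (start, end) token offsets, one span per step."""
    ids = []
    for step, (start, end) in enumerate(step_spans):
        tag = PAST if step < query_step else CURRENT if step == query_step else FUTURE
        ids += [tag] * (end - start)
    return torch.tensor(ids)

class TimeStampedEmbedding(nn.Module):
    """Token embeddings plus a 3-way timestamp embedding."""
    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, dim)
        self.time = nn.Embedding(3, dim)  # past / current / future

    def forward(self, input_ids, time_ids):
        return self.tok(input_ids) + self.time(time_ids)
```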
Coalescing Global and Local Information for Procedural Text Understanding
In this paper, we propose Coalescing Global and Local Information (CGLI), a new model that builds entity- and timestep-aware input representations (local input) while considering the whole context (global input), and that jointly models entity states with a structured prediction objective (global output).
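The "global output" side scores an entity's states at every timestep and then decodes the whole sequence jointly, CRF-style, so that per-step predictions stay temporally consistent. A hedged sketch of that decoding step, as a generic Viterbi pass rather than the authors' implementation:

```python
import torch

def viterbi_decode(emissions: torch.Tensor, transitions: torch.Tensor) -> list:
    """Jointly decode one entity's state sequence across timesteps.
    emissions: (T, S) per-timestep state scores from the local encoder;
    transitions: (S, S) learned state-to-state compatibility scores."""
    T, S = emissions.shape
    score = emissions[0].clone()
    backpointers = []
    for t in range(1, T):
        # total[i, j] = best path ending in state i at t-1, then state j at t
        total = score.unsqueeze(1) + transitions + emissions[t].unsqueeze(0)
        score, idx = total.max(dim=0)
        backpointers.append(idx)
    best = [int(score.argmax())]
    for idx in reversed(backpointers):
        best.append(int(idx[best[-1]]))
    return best[::-1]  # e.g. indices into {not-exist, exist, moved, destroyed}
```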
CLMSM: A Multi-Task Learning Framework for Pre-training on Procedural Text
In this paper, we propose CLMSM, a domain-specific continual pre-training framework that learns from a large set of procedural recipes.
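CLMSM combines two objectives during continual pre-training: a contrastive term over procedure encodings (trained with hard triplets) and a mask-step modeling term that reconstructs masked step content. A sketch of such a joint loss, assuming triplet-style supervision and a hypothetical weighting `lam` (illustrative, not the paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def clmsm_loss(anchor, positive, negative, msm_logits, msm_labels,
               margin=1.0, lam=0.5):
    """Multi-task objective: triplet contrastive loss over procedure
    encodings plus a masked-step cross-entropy term.
    anchor/positive/negative: (batch, dim) pooled procedure encodings;
    msm_logits: (batch, seq, vocab); msm_labels: (batch, seq), -100 = unmasked."""
    contrastive = F.triplet_margin_loss(anchor, positive, negative, margin=margin)
    masked_step = F.cross_entropy(msm_logits.view(-1, msm_logits.size(-1)),
                                  msm_labels.view(-1), ignore_index=-100)
    return lam * contrastive + (1 - lam) * masked_step
```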
Order-Based Pre-training Strategies for Procedural Text Understanding
In this paper, we propose sequence-based pre-training methods to enhance procedural understanding in natural language processing.
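A typical instance of order-based pre-training is step-order classification: permute a procedure's steps with some probability and train the model to detect whether the order was disturbed. A minimal, hypothetical data-construction sketch (the paper's exact objectives may differ):

```python
import random

def make_order_example(steps, p_shuffle=0.5, seed=None):
    """Build one pre-training instance: (text, label) where label=1 means
    the steps were shuffled out of their original order."""
    rng = random.Random(seed)
    shuffled = len(steps) > 1 and rng.random() < p_shuffle
    seq = list(steps)
    if shuffled:
        while seq == list(steps):   # guarantee a genuinely different order
            rng.shuffle(seq)
    return " [SEP] ".join(seq), int(shuffled)

# usage: feed the joined text to any encoder with a binary classification head
text, label = make_order_example(["Boil water.", "Add pasta.", "Drain."], seed=0)
```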