Multi-hop Question Answering
56 papers with code • 2 benchmarks • 4 datasets
Libraries
Use these libraries to find Multi-hop Question Answering models and implementations.

Most implemented papers
Can Language Models Solve Graph Problems in Natural Language?
We then propose Build-a-Graph Prompting and Algorithmic Prompting, two instruction-based approaches to enhance LLMs in solving natural language graph problems.
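A minimal sketch of the prompting idea: before asking the model to reason, the prompt instructs it to reconstruct the graph explicitly. The instruction wording here is illustrative, not the paper's exact prompt template.

```python
def build_a_graph_prompt(problem: str) -> str:
    # Illustrative Build-a-Graph-style instruction (hypothetical wording):
    # ask the model to lay out nodes and edges before reasoning.
    instruction = "Let's construct a graph with the nodes and edges first.\n"
    return instruction + problem

problem = ("In an undirected graph, (i,j) means node i and node j are "
           "connected. The graph has edges: (0,1), (1,2). "
           "Is there a path between node 0 and node 2?")
print(build_a_graph_prompt(problem))
```

The resulting string would then be sent to an LLM as-is; the point is only that the graph-construction step is made explicit in the prompt.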
MQuAKE: Assessing Knowledge Editing in Language Models via Multi-Hop Questions
The information stored in large language models (LLMs) falls out of date quickly, and retraining from scratch is often not an option.
End-to-End Beam Retrieval for Multi-Hop Question Answering
This approach models the multi-hop retrieval process in an end-to-end manner by jointly optimizing an encoder and two classification heads across all hops.
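The beam-search part of the retrieval loop can be sketched in plain Python. The scoring function here is a toy stand-in for the paper's jointly trained encoder and classification heads; names and scores are illustrative.

```python
def beam_retrieve(question, passages, score_fn, num_hops=2, beam_size=2):
    """Keep the top-`beam_size` partial passage chains at every hop.
    `score_fn` stands in for a trained relevance model (hypothetical)."""
    beams = [((), 0.0)]  # (chain of passage ids, accumulated score)
    for hop in range(num_hops):
        candidates = []
        for chain, score in beams:
            for pid in passages:
                if pid in chain:
                    continue  # use each passage at most once per chain
                candidates.append(
                    (chain + (pid,), score + score_fn(question, chain, pid, hop))
                )
        # prune to the best `beam_size` chains before the next hop
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
    return beams

# Toy per-hop relevance scores standing in for a cross-encoder.
toy_scores = {("p1", 0): 0.9, ("p2", 0): 0.3, ("p3", 0): 0.1,
              ("p1", 1): 0.2, ("p2", 1): 0.8, ("p3", 1): 0.4}

def score_fn(question, chain, pid, hop):
    return toy_scores[(pid, hop)]

best = beam_retrieve("q", ["p1", "p2", "p3"], score_fn)
print(best[0][0])  # highest-scoring passage chain: ('p1', 'p2')
```

In the actual system the chain score would come from encoding the question together with the passages retrieved so far, so earlier hops condition the scoring of later ones; this sketch only shows the beam bookkeeping.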
HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering
Existing question answering (QA) datasets fail to train QA systems to perform complex reasoning and provide explanations for answers.
Avoiding Reasoning Shortcuts: Adversarial Evaluation, Training, and Model Development for Multi-Hop QA
After adversarial training, the baseline's performance improves but is still limited on the adversarial evaluation.
What's Missing: A Knowledge Gap Guided Approach for Multi-hop Question Answering
We propose jointly training a model to simultaneously fill this knowledge gap and compose it with the provided partial knowledge.
QASC: A Dataset for Question Answering via Sentence Composition
Guided by these annotations, we present a two-step approach to mitigate the retrieval challenges.
Learning from Explanations with Neural Execution Tree
While deep neural networks have achieved impressive performance on a range of NLP tasks, these data-hungry models heavily rely on labeled data, which restricts their applications in scenarios where data annotation is expensive.
Hierarchical Graph Network for Multi-hop Question Answering
In this paper, we present Hierarchical Graph Network (HGN) for multi-hop question answering.
Transformer-XH: Multi-Evidence Reasoning with eXtra Hop Attention
Transformers have achieved new heights modeling natural language as a sequence of text tokens.