Embodied Question Answering

11 papers with code • 0 benchmarks • 2 datasets

Embodied Question Answering (EQA) is a task in which an agent is spawned in a 3D environment and asked a natural-language question (e.g. "What color is the car?"). The agent must navigate the environment from egocentric observations, gather the relevant visual information, and then answer the question.

Most implemented papers

Embodied Question Answering

facebookresearch/House3D CVPR 2018

We present a new AI task -- Embodied Question Answering (EmbodiedQA) -- where an agent is spawned at a random location in a 3D environment and asked a question ("What color is the car?").
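
As a sketch of the episode structure this task implies, assuming a toy, gym-style environment interface (ToyEnv, RandomAgent, and run_episode below are hypothetical stand-ins, not the House3D/EmbodiedQA API):

    import random

    class ToyEnv:
        """Toy environment: a fixed question, a ground-truth answer, dummy observations."""
        actions = ["forward", "turn-left", "turn-right", "stop"]
        answers = ["red", "blue", "green"]

        def reset(self):
            self.question = "What color is the car?"
            self.ground_truth_answer = "red"
            return {"rgb": None}, self.question    # first-person observation + question

        def step(self, action):
            return {"rgb": None}                   # next first-person observation

    class RandomAgent:
        """Placeholder policy: random navigation, random answer."""
        def act(self, observation, question):
            return random.choice(ToyEnv.actions)

        def answer(self, question):
            return random.choice(ToyEnv.answers)

    def run_episode(env, agent, max_steps=50):
        observation, question = env.reset()        # agent spawns at a random location
        for _ in range(max_steps):
            action = agent.act(observation, question)
            if action == "stop":                   # agent decides it has seen enough
                break
            observation = env.step(action)
        return agent.answer(question) == env.ground_truth_answer

    if __name__ == "__main__":
        print(run_episode(ToyEnv(), RandomAgent()))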

Neural Modular Control for Embodied Question Answering

facebookresearch/House3D 26 Oct 2018

We use imitation learning to warm-start policies at each level of the hierarchy, which dramatically increases sample efficiency, and then fine-tune them with reinforcement learning.
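
A minimal sketch of this two-stage recipe: behavioural cloning on expert state/action pairs to warm-start a toy policy, followed by a plain REINFORCE update for the reinforcement-learning stage (the policy, data, and reward below are placeholders, not the Neural Modular Control code):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy policy over a 4-action space; the real sub-policies operate on
    # egocentric observations, this one on a 16-dim placeholder state.
    policy = nn.Sequential(nn.Linear(16, 64), nn.Tanh(), nn.Linear(64, 4))
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

    # Stage 1: imitation learning (behavioural cloning on expert state/action pairs).
    expert_states = torch.randn(256, 16)
    expert_actions = torch.randint(0, 4, (256,))
    for _ in range(100):
        loss = F.cross_entropy(policy(expert_states), expert_actions)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # Stage 2: reinforcement-learning fine-tuning (plain REINFORCE on toy rollouts).
    for _ in range(100):
        states = torch.randn(32, 16)
        dist = torch.distributions.Categorical(logits=policy(states))
        actions = dist.sample()
        rewards = torch.randn(32)                  # placeholder environment reward
        loss = -(dist.log_prob(actions) * rewards).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()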

Towards Learning a Generalist Model for Embodied Navigation

zd11024/NaviLLM CVPR 2024

We conduct extensive experiments to evaluate the performance and generalizability of our model.

Blindfold Baselines for Embodied QA

ankeshanand/blindfold-baselines-eqa 12 Nov 2018

We explore blindfold (question-only) baselines for Embodied Question Answering.
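
A minimal sketch of what such a question-only baseline can look like: an answer classifier over the question tokens that never receives any visual input (the vocabulary and answer-set sizes below are illustrative assumptions, not the paper's exact baseline):

    import torch
    import torch.nn as nn

    class QuestionOnlyBaseline(nn.Module):
        """Answer classifier conditioned on the question alone; no visual input."""
        def __init__(self, vocab_size=1000, num_answers=50, embed_dim=64, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.LSTM(embed_dim, hidden, batch_first=True)
            self.classifier = nn.Linear(hidden, num_answers)

        def forward(self, question_tokens):          # (batch, seq_len) token ids
            _, (last_hidden, _) = self.encoder(self.embed(question_tokens))
            return self.classifier(last_hidden[-1])  # answer logits, no vision used

    model = QuestionOnlyBaseline()
    logits = model(torch.randint(0, 1000, (8, 12)))  # 8 questions, 12 tokens each
    print(logits.shape)                              # torch.Size([8, 50])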

Multi-Target Embodied Question Answering

facebookresearch/EmbodiedQA CVPR 2019

To handle questions that reference multiple targets, we propose a modular architecture composed of a program generator, a controller, a navigator, and a VQA module.
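
A toy sketch of how such a modular pipeline composes: a program generator produces sub-goals, which a controller dispatches to a navigator and a VQA module before combining the sub-results into a final answer (all module internals here are placeholders, not the paper's implementation):

    def program_generator(question):
        # e.g. "Is the dresser in the bedroom bigger than the oven in the kitchen?"
        return [("nav", "bedroom dresser"), ("query", "size"),
                ("nav", "kitchen oven"), ("query", "size"), ("compare", "bigger")]

    def navigator(target):
        print(f"navigating to: {target}")          # would drive low-level actions

    def vqa_module(attribute):
        return {"size": 1.0}[attribute]            # would answer from the current view

    def controller(question):
        results = []
        for op, arg in program_generator(question):
            if op == "nav":
                navigator(arg)
            elif op == "query":
                results.append(vqa_module(arg))
            elif op == "compare":
                return results[0] > results[1]     # final answer to the comparison

    print(controller("Is the dresser in the bedroom bigger than the oven in the kitchen?"))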

VideoNavQA: Bridging the Gap between Visual and Embodied Question Answering

catalina17/VideoNavQA 14 Aug 2019

The goal of this dataset is to assess question-answering performance from nearly-ideal navigation paths, while covering a much wider variety of questions than current instantiations of the EQA task.
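
A minimal sketch of the resulting evaluation setting: predict an answer from a video of a (near-ideal) navigation path plus the question. The fusion model below is a generic illustrative baseline (mean-pooled per-frame CNN features concatenated with an LSTM question encoding), not one of the paper's architectures:

    import torch
    import torch.nn as nn

    class VideoQAModel(nn.Module):
        def __init__(self, vocab_size=1000, num_answers=70, hidden=128):
            super().__init__()
            self.frame_encoder = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())        # (B*T, 16)
            self.q_embed = nn.Embedding(vocab_size, 64)
            self.q_encoder = nn.LSTM(64, hidden, batch_first=True)
            self.classifier = nn.Linear(hidden + 16, num_answers)

        def forward(self, frames, question):                  # frames: (B, T, 3, H, W)
            b, t = frames.shape[:2]
            frame_feats = self.frame_encoder(frames.flatten(0, 1)).view(b, t, -1).mean(1)
            _, (q_last, _) = self.q_encoder(self.q_embed(question))
            return self.classifier(torch.cat([frame_feats, q_last[-1]], dim=-1))

    model = VideoQAModel()
    logits = model(torch.randn(2, 8, 3, 64, 64), torch.randint(0, 1000, (2, 10)))
    print(logits.shape)                                       # torch.Size([2, 70])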

AllenAct: A Framework for Embodied AI Research

allenai/allenact 28 Aug 2020

The domain of Embodied AI, in which agents learn to complete tasks through interaction with their environment from egocentric observations, has experienced substantial growth with the advent of deep reinforcement learning and increased interest from the computer vision, NLP, and robotics communities.

Synthesizing Event-centric Knowledge Graphs of Daily Activities Using Virtual Space

aistairc/virtualhome2kg 30 Jul 2023

Artificial intelligence (AI) is expected to be embodied in software agents, robots, and cyber-physical systems that can understand the varied contextual information of daily life in the home environment, in order to support human behavior and decision making in various situations.

MEIA: Multimodal Embodied Perception and Interaction in Unknown Environments

hcplab-sysu/causalvlr 1 Feb 2024

We introduce the Multimodal Embodied Interactive Agent (MEIA), which translates high-level tasks expressed in natural language into a sequence of executable actions.
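
A minimal sketch of this language-to-action translation, assuming a planner prompt over a fixed action vocabulary; the llm() stub, action set, and prompt format are illustrative assumptions rather than MEIA's actual interface:

    # Translate a high-level natural-language task into executable actions.
    ACTIONS = ["goto(<object>)", "pick(<object>)", "place(<object>, <location>)", "open(<object>)"]

    def llm(prompt):
        # Placeholder for a multimodal LLM call; returns a canned plan for the demo.
        return "goto(fridge)\nopen(fridge)\npick(milk)\nplace(milk, table)"

    def plan(task, scene_description):
        prompt = (f"Scene: {scene_description}\n"
                  f"Allowed actions: {', '.join(ACTIONS)}\n"
                  f"Task: {task}\nPlan:")
        return [line.strip() for line in llm(prompt).splitlines() if line.strip()]

    print(plan("Serve a glass of milk.", "kitchen with a fridge and a table"))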

Map-based Modular Approach for Zero-shot Embodied Question Answering

ATR-DBI/Map-EQA 26 May 2024

Embodied Question Answering (EQA) serves as a benchmark task to evaluate the capability of robots to navigate within novel environments and identify objects in response to human queries.