Gaze Prediction
13 papers with code • 0 benchmarks • 5 datasets
Latest papers
SCOUT+: Towards Practical Task-Driven Drivers' Gaze Prediction
In this paper, we address the challenge of effectively modeling task and context using commonly available data sources, for use in practical systems.
Data Limitations for Modeling Top-Down Effects on Drivers' Attention
The crux of the problem is the lack of public data with annotations that could be used to train top-down models and to evaluate how well models of any kind capture the effects of task on attention.
Understanding and Modeling the Effects of Task and Context on Drivers' Gaze Allocation
Therefore, to enable analysis and modeling of these factors for drivers' gaze prediction, we propose the following: 1) we correct the data processing pipeline used in DR(eye)VE to reduce noise in the recorded gaze data; 2) we then add per-frame labels for driving task and context; 3) we benchmark a number of baseline and SOTA models for saliency and driver gaze prediction and use the new annotations to analyze how their performance changes in scenarios involving different tasks; and, lastly, 4) we develop a novel model that modulates drivers' gaze prediction with explicit action and context information.
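Benchmarks like the one described above typically score a predicted gaze map against a ground-truth fixation density map with standard saliency metrics. As a minimal sketch (the function name, array shapes, and epsilon are illustrative assumptions, not taken from any of the listed papers), one such metric, KL divergence, can be computed like this:

```python
import numpy as np

def kld(pred, gt, eps=1e-8):
    """KL divergence between predicted and ground-truth gaze maps.

    Both maps are normalized into probability distributions first;
    lower values indicate a closer match (0 = identical maps).
    Assumed shapes: both inputs are 2D arrays of the same size.
    """
    p = pred / (pred.sum() + eps)   # predicted distribution
    g = gt / (gt.sum() + eps)       # ground-truth distribution
    return float(np.sum(g * np.log(g / (p + eps) + eps)))

# Toy usage: identical maps give near-zero divergence,
# mismatched maps give a positive score.
m = np.random.rand(36, 64)
print(kld(m, m))
```

Other metrics commonly reported alongside KLD in gaze-prediction benchmarks (NSS, AUC, correlation coefficient) follow the same pattern of comparing a predicted map to recorded fixations.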
Gazeformer: Scalable, Effective and Fast Prediction of Goal-Directed Human Attention
In response, we pose a new task called ZeroGaze, a variant of zero-shot learning in which gaze is predicted for never-before-searched objects, and we develop a novel model, Gazeformer, to solve the ZeroGaze problem.
Gazing at Social Interactions Between Foraging and Decision Theory
Finding the underlying principles of social attention in humans appears essential for designing interactions between natural and artificial agents.
L2CS-Net: Fine-Grained Gaze Estimation in Unconstrained Environments
In this paper, we propose a robust CNN-based model for predicting gaze in unconstrained settings.
GaTector: A Unified Framework for Gaze Object Prediction
In this paper, we build a novel framework named GaTector to tackle the gaze object prediction problem in a unified way.
EEGEyeNet: a Simultaneous Electroencephalography and Eye-tracking Dataset and Benchmark for Eye Movement Prediction
We present a new dataset and benchmark with the goal of advancing research at the intersection of brain activity and eye movements.
The Story in Your Eyes: An Individual-difference-aware Model for Cross-person Gaze Estimation
We propose a novel method for refining cross-person gaze prediction using only eye/face images, by explicitly modeling person-specific differences.
Predicting Gaze in Egocentric Video by Learning Task-dependent Attention Transition
We present a new computational model for gaze prediction in egocentric videos that explores patterns in the temporal shift of gaze fixations (attention transition) which depend on the egocentric manipulation task.