Event-based vision

53 papers with code • 1 benchmark • 11 datasets

An event camera, also known as a neuromorphic camera, silicon retina or dynamic vision sensor, is an imaging sensor that responds to local changes in brightness. Event cameras do not capture images using a shutter as conventional cameras do. Instead, each pixel inside an event camera operates independently and asynchronously, reporting changes in brightness as they occur and staying silent otherwise. Modern event cameras offer microsecond temporal resolution and a dynamic range of around 120 dB, and suffer far less from under/overexposure and motion blur than frame cameras.
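The per-pixel behavior described above can be illustrated with a toy event-generation model. This is a minimal sketch, not any particular sensor's model: it approximates the asynchronous per-pixel process with a frame pair, firing an event wherever log-brightness changes by more than a contrast threshold. The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def frames_to_events(prev, curr, t, threshold=0.2, eps=1e-6):
    """Toy event-generation model: a pixel fires an event when its
    log-brightness changes by more than `threshold` since the last
    observation. Real sensors do this asynchronously per pixel; this
    frame-pair approximation is for illustration only."""
    delta = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(delta) >= threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    # An "event" is the tuple (x, y, timestamp, polarity).
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 1.0   # this pixel got brighter
curr[3, 0] = 0.1   # this pixel got darker
events = frames_to_events(prev, curr, t=1.0)
# only the two changed pixels produce events; all others stay silent
```

The output is a sparse list of events rather than a dense image, which is the key difference from frame-based pipelines.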


Most implemented papers

Event-based Camera Pose Tracking using a Generative Event Model

uzh-rpg/rpg_image_reconstruction_from_events 7 Oct 2015

Event-based vision sensors mimic the operation of the biological retina and represent a major paradigm shift from traditional cameras.

Person Re-Identification without Identification via Event Anonymization

IIT-PAVIS/ReId_without_Id ICCV 2023

In this work, we also bring to the community the first ever event-based person ReId dataset gathered to evaluate the performance of our approach.

GET: Group Event Transformer for Event-Based Vision

peterande/get-group-event-transformer ICCV 2023

Event cameras are a type of novel neuromorphic sensor that has been gaining increasing attention.

Event-based Background-Oriented Schlieren

uzh-rpg/event-based_vision_resources 1 Nov 2023

Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.

State Space Models for Event Cameras

uzh-rpg/ssms_event_cameras CVPR 2024

We address this challenge by introducing state-space models (SSMs) with learnable timescale parameters to event-based vision.
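Why SSMs suit event streams: a continuous-time state-space model can be discretized with the actual elapsed time between events, so irregular timestamps enter the model directly. The sketch below is an assumption-laden simplification (a forward-Euler step rather than the exact discretization a real SSM layer would use, and without the learnable timescale parameters the paper introduces); function and variable names are hypothetical.

```python
import numpy as np

def ssm_scan(timestamps, inputs, A, B):
    """Run a continuous-time linear SSM x'(t) = A x(t) + B u(t) over an
    irregularly sampled event stream. Each update is discretized with the
    actual inter-event interval dt (forward Euler here for brevity; SSM
    layers typically use an exact or zero-order-hold discretization and
    learn a per-channel timescale that rescales dt)."""
    x = np.zeros(A.shape[0])
    prev_t = timestamps[0]
    for t, u in zip(timestamps, inputs):
        dt = t - prev_t          # irregular spacing handled natively
        x = x + dt * (A @ x + B @ u)
        prev_t = t
    return x

# Two events one second apart driving a 1-D leaky state.
state = ssm_scan([0.0, 1.0],
                 [np.array([0.0]), np.array([1.0])],
                 A=np.array([[-1.0]]), B=np.array([[1.0]]))
```

No resampling to a fixed frame rate is needed, which is the property that makes this family of models attractive for asynchronous event data.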

eCARLA-scenes: A synthetically generated dataset for event-based optical flow prediction

CIRS-Girona/ecarla-scenes 12 Dec 2024

We further present a synthetic event-based dataset and a data generation pipeline for optical flow prediction tasks.

Unsupervised Learning of a Hierarchical Spiking Neural Network for Optical Flow Estimation: From Events to Global Motion Perception

tudelft/cuSNN 28 Jul 2018

Convolutional layers with input synapses characterized by single and multiple transmission delays are employed for feature and local motion perception, respectively; while global motion selectivity emerges in a final fully-connected layer.

Focus Is All You Need: Loss Functions For Event-based Vision

tub-rip/dvs_global_flow_skeleton CVPR 2019

The proposed loss functions allow bringing mature computer vision tools to the realm of event cameras.

Event-based Vision: A Survey

uzh-rpg/event-based_vision_resources 17 Apr 2019

Event cameras offer attractive properties compared to traditional cameras: high temporal resolution (in the order of microseconds), very high dynamic range (140 dB vs. 60 dB), low power consumption, and high pixel bandwidth (on the order of kHz) resulting in reduced motion blur.

Event Cameras, Contrast Maximization and Reward Functions: An Analysis

TimoStoff/events_contrast_maximization CVPR 2019

The versatility of this approach has led to a flurry of research in recent years, but no in-depth study of the reward chosen during optimization has yet been made.
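The contrast-maximization idea analyzed above can be sketched in a few lines: warp each event back to a reference time along a candidate motion, accumulate the warped events into an image, and score the candidate by the image's variance (one of the reward functions this line of work studies). This is a minimal illustrative sketch with a hand-built event stream, not the papers' implementations.

```python
import numpy as np

def iwe_variance(events, flow, shape=(32, 32)):
    """Contrast-maximization reward: warp each event (x, y, t, p) back to
    t=0 along a candidate constant flow, accumulate the warped events into
    an image, and return that image's variance. The correct flow aligns
    events along the moving edge, sharpening the image and maximizing
    the variance."""
    img = np.zeros(shape)
    for x, y, t, p in events:
        wx = int(round(x - flow[0] * t))
        wy = int(round(y - flow[1] * t))
        if 0 <= wx < shape[1] and 0 <= wy < shape[0]:
            img[wy, wx] += 1
    return img.var()

# Synthetic events: a vertical edge moving right at 5 px/s.
events = [(10 + 5.0 * t, y, t, 1)
          for t in np.linspace(0.0, 1.0, 50)
          for y in range(32)]

candidates = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
best = max(candidates, key=lambda f: iwe_variance(events, f))
# the true flow (5.0, 0.0) yields the sharpest image of warped events
```

Choosing variance as the reward is exactly the kind of design decision the paper above analyzes; other rewards (e.g. based on image gradients) fit the same loop.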