Entropy Minimisation Framework for Event-based Vision Model Estimation

ECCV 2020 · Urbano Miguel Nunes, Yiannis Demiris

We propose a novel Entropy Minimisation (EMin) framework for event-based vision model estimation. The framework extends previous event-based motion compensation algorithms to handle models whose outputs have arbitrary dimensions. The main motivation comes from estimating motion from events directly in 3D space (e.g. events augmented with depth), without projecting them onto an image plane. This is achieved by modelling the event alignment according to candidate parameters and minimising the resultant dispersion. We provide a family of suitable entropy loss functions and an efficient approximation whose complexity is linear in the number of events (e.g. the complexity does not depend on the number of image pixels). The framework is evaluated on several motion estimation problems, including optical flow and rotational motion. As proof of concept, we also test our framework on 6-DOF estimation by performing the optimisation directly in 3D space.
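
The dispersion-minimisation idea described above can be illustrated with a minimal sketch, under several assumptions that are not from the paper: events carry only (x, y, t), the candidate model is a single global 2D optical-flow vector, and the dispersion measure is a plain pairwise-Gaussian potential standing in for the paper's family of entropy losses. It also evaluates all event pairs (quadratic in the number of events), whereas the paper proposes a linear-time approximation.

```python
import numpy as np
from scipy.optimize import minimize


def warp_events(events, flow):
    """Warp events (N x 3 array of x, y, t) to a common reference time
    by removing the motion predicted by the candidate flow."""
    xy, t = events[:, :2], events[:, 2:3]
    return xy - t * flow


def dispersion_loss(flow, events, sigma=1.0):
    """Pairwise-Gaussian dispersion of the warped events.
    Lower values mean the warped events are more concentrated (better
    aligned). Note: O(N^2), unlike the paper's linear-time approximation."""
    warped = warp_events(events, np.asarray(flow))
    diff = warped[:, None, :] - warped[None, :, :]      # N x N x 2 differences
    sq_dist = np.sum(diff ** 2, axis=-1)                # squared distances
    return -np.mean(np.exp(-sq_dist / (2.0 * sigma ** 2)))


def estimate_flow(events, flow_init=(0.0, 0.0)):
    """Estimate the candidate flow by minimising the dispersion loss."""
    result = minimize(dispersion_loss, x0=np.asarray(flow_init, dtype=float),
                      args=(events,), method="Nelder-Mead")
    return result.x


if __name__ == "__main__":
    # Synthetic example: 20 scene points, each triggering 10 events while
    # translating with a constant (hypothetical) ground-truth flow.
    rng = np.random.default_rng(0)
    true_flow = np.array([2.0, -1.0])
    points = rng.uniform(0.0, 10.0, size=(20, 2))
    t = rng.uniform(0.0, 1.0, size=(20, 10, 1))
    events = np.concatenate([points[:, None, :] + t * true_flow, t],
                            axis=-1).reshape(-1, 3)
    print(estimate_flow(events))   # should be close to true_flow
```

Replacing the exhaustive pairwise sum with the paper's approximate entropy is what reduces the cost to linear in the number of events; the overall structure (warp the events with candidate parameters, score their dispersion, update the parameters) stays the same.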
