Search Results for author: Lucas Brynte

Found 5 papers, 0 papers with code

Learning Structure-from-Motion with Graph Attention Networks

no code implementations • 30 Aug 2023 • Lucas Brynte, José Pedro Iglesias, Carl Olsson, Fredrik Kahl

In this paper we tackle the problem of learning Structure-from-Motion (SfM) using graph attention networks.

Graph Attention • Pose Estimation
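
The paper lists no code, so as a rough illustration of the building block named in the abstract, here is a minimal single-head graph attention layer in PyTorch, in the style of the original GAT. The class name and dimensions are hypothetical; this is a generic sketch, not the authors' SfM architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Minimal single-head graph attention layer (GAT-style).

    Illustrative sketch only: the paper's SfM network is not public,
    and its exact layer design may differ.
    """

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared node projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scorer

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) binary adjacency mask.
        h = self.W(x)                                    # (N, out_dim)
        N = h.size(0)
        hi = h.unsqueeze(1).expand(N, N, -1)             # h_i broadcast over j
        hj = h.unsqueeze(0).expand(N, N, -1)             # h_j broadcast over i
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))       # attend to neighbors only
        alpha = torch.softmax(e, dim=-1)                 # (N, N) attention weights
        return alpha @ h                                 # aggregated node features

# Toy usage: 6 nodes, random graph with self-loops so the softmax is well-defined.
x = torch.randn(6, 16)
adj = (torch.rand(6, 6) < 0.5).float()
adj.fill_diagonal_(1)
out = GraphAttentionLayer(16, 32)(x, adj)  # (6, 32)
```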

Rigidity Preserving Image Transformations and Equivariance in Perspective

no code implementations • 31 Jan 2022 • Lucas Brynte, Georg Bökman, Axel Flinth, Fredrik Kahl

We characterize the class of image plane transformations which realize rigid camera motions and call these transformations "rigidity preserving".

6D Pose Estimation using RGB • Inductive Bias • +1
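
The textbook example of such a transformation is the homography induced by a pure camera rotation: a pinhole camera with intrinsics K rotating by R maps image points as x' ~ K R K^{-1} x. Below is a small NumPy sketch of this standard fact; the function names and intrinsics are made up for illustration and are not code from the paper.

```python
import numpy as np

def rotation_homography(K, R):
    """Homography induced on the image plane by a pure camera rotation R.

    Standard pinhole-camera fact (x' ~ K R K^{-1} x), shown only to
    illustrate one familiar family of rigidity preserving transformations.
    """
    return K @ R @ np.linalg.inv(K)

def warp_point(H, uv):
    """Apply a homography to a pixel (u, v) and dehomogenize."""
    p = H @ np.array([uv[0], uv[1], 1.0])
    return p[:2] / p[2]

# Example: 10-degree rotation about the camera's y-axis.
f, cx, cy = 800.0, 320.0, 240.0           # hypothetical intrinsics
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])
t = np.deg2rad(10.0)
R = np.array([[ np.cos(t), 0, np.sin(t)],
              [ 0,         1, 0        ],
              [-np.sin(t), 0, np.cos(t)]])
print(warp_point(rotation_homography(K, R), (320.0, 240.0)))
```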

On the Tightness of Semidefinite Relaxations for Rotation Estimation

no code implementations • 6 Jan 2021 • Lucas Brynte, Viktor Larsson, José Pedro Iglesias, Carl Olsson, Fredrik Kahl

In studying empirical performance, we note that few failure cases are reported in the literature, in particular for estimation problems with a single rotation, motivating us to seek further theoretical understanding.
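
For context, single-rotation registration, min over R in SO(3) of the sum of ||R a_i - b_i||^2, admits a classical closed-form SVD (Kabsch) solution; this is the kind of rotation estimation problem whose semidefinite relaxations the paper analyzes. The sketch below shows only that underlying problem, not the paper's relaxations, and its function name is hypothetical.

```python
import numpy as np

def kabsch(A, B):
    """Closed-form solution of min_{R in SO(3)} sum_i ||R a_i - b_i||^2.

    Classical SVD (Kabsch) solution, shown only to exhibit the underlying
    rotation-estimation problem; the paper studies when its semidefinite
    relaxations are tight, which is not implemented here.
    A, B: (N, 3) arrays of corresponding 3D vectors.
    """
    U, _, Vt = np.linalg.svd(B.T @ A)
    # Fix a possible reflection so that det(R) = +1.
    d = np.sign(np.linalg.det(U @ Vt))
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# Sanity check: recover a random rotation from noiseless correspondences.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true = Q * np.sign(np.linalg.det(Q))   # force det = +1
A = rng.normal(size=(20, 3))
B = A @ R_true.T                         # b_i = R_true a_i
print(np.allclose(kabsch(A, B), R_true))  # True
```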

Pose Proposal Critic: Robust Pose Refinement by Learning Reprojection Errors

no code implementations • BMVC 2020 • Lucas Brynte, Fredrik Kahl

In recent years, considerable progress has been made on the task of rigid object pose estimation from a single RGB image, but achieving robustness to partial occlusions remains a challenging problem.

6D Pose Estimation using RGB
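
The quantity in the title is the reprojection error of a pose hypothesis. The paper trains a CNN to predict this error from images; the sketch below only computes its geometric definition in NumPy, with a hypothetical function name and example values, as an illustration.

```python
import numpy as np

def reprojection_error(K, R, t, X, x_obs):
    """Mean reprojection error of a pose hypothesis (R, t).

    Projects 3D model points X (N, 3) with the pinhole model
    x ~ K (R X + t) and compares against observed pixels x_obs (N, 2).
    The paper learns to predict this quantity from images; here we
    only evaluate its geometric definition.
    """
    Xc = X @ R.T + t              # points in camera coordinates, (N, 3)
    p = Xc @ K.T                  # homogeneous pixel coordinates, (N, 3)
    proj = p[:, :2] / p[:, 2:3]   # dehomogenize
    return np.linalg.norm(proj - x_obs, axis=1).mean()

# Toy usage: the true pose reprojects exactly, so the error is zero.
K = np.array([[500.0, 0, 320.0], [0, 500.0, 240.0], [0, 0, 1.0]])
X = np.array([[0.1, -0.05, 2.0], [-0.2, 0.1, 2.5], [0.0, 0.0, 3.0]])
R, t = np.eye(3), np.zeros(3)
x_obs = (X @ K.T)[:, :2] / (X @ K.T)[:, 2:3]
print(reprojection_error(K, R, t, X, x_obs))  # 0.0
```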

Semantic Match Consistency for Long-Term Visual Localization

no code implementations • ECCV 2018 • Carl Toft, Erik Stenborg, Lars Hammarstrand, Lucas Brynte, Marc Pollefeys, Torsten Sattler, Fredrik Kahl

Robust and accurate visual localization across large appearance variations, caused by changes in time of day, season, or the environment, is a challenging problem of importance to application areas such as autonomous robot navigation.

Visual Localization
