Perceptual Loss for Robust Unsupervised Homography Estimation

20 Apr 2021 · Daniel Koguciuk, Elahe Arani, Bahram Zonooz

Homography estimation is often an indispensable step in many computer vision tasks. Existing approaches, however, are not robust to illumination and/or large viewpoint changes. In this paper, we propose the bidirectional implicit Homography Estimation (biHomE) loss for unsupervised homography estimation. biHomE minimizes the distance in feature space between the warped image from the source viewpoint and the corresponding image from the target viewpoint. Since we use a fixed pre-trained feature extractor and the only learnable component of our framework is the homography network, we effectively decouple homography estimation from representation learning. We add a photometric distortion step to the synthetic COCO dataset generation to better represent the illumination variation of real-world scenarios. We show that biHomE achieves state-of-the-art performance on the synthetic COCO dataset, comparable to or better than supervised approaches. Furthermore, empirical results demonstrate that our approach is more robust to illumination variation than existing methods.
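
The core of the loss is straightforward to sketch: warp the source image with the predicted homography, pass both the warped image and the target image through a frozen feature extractor, and penalize the feature-space distance in both directions. The PyTorch sketch below is a minimal illustration under stated assumptions: a torchvision ResNet-18 stem stands in for the paper's fixed pre-trained extractor, an L1 feature distance stands in for the paper's exact distance, and the masking of pixels warped out of frame is omitted.

```python
import torch
import torch.nn.functional as F
import torchvision

# Frozen feature extractor. The paper fixes a pre-trained network; a
# torchvision ResNet-18 stem is used here as a stand-in choice.
_resnet = torchvision.models.resnet18(weights="IMAGENET1K_V1")
feature_extractor = torch.nn.Sequential(
    _resnet.conv1, _resnet.bn1, _resnet.relu, _resnet.maxpool, _resnet.layer1
).eval()
for p in feature_extractor.parameters():
    p.requires_grad_(False)


def warp_with_homography(img, H):
    """Backward-warp a batch of images with 3x3 homographies via grid_sample."""
    b, _, h, w = img.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=img.dtype, device=img.device),
        torch.arange(w, dtype=img.dtype, device=img.device),
        indexing="ij",
    )
    # Homogeneous pixel coordinates of the target view, shape (h*w, 3).
    grid = torch.stack([xs, ys, torch.ones_like(xs)], dim=-1).reshape(-1, 3)
    # Backward warping: sample the source at H^-1 applied to target coords.
    coords = grid @ torch.inverse(H).transpose(1, 2)          # (b, h*w, 3)
    xy = coords[..., :2] / coords[..., 2:].clamp(min=1e-8)    # guard div-by-0
    x = 2 * xy[..., 0] / (w - 1) - 1                          # normalize to
    y = 2 * xy[..., 1] / (h - 1) - 1                          # [-1, 1]
    sample_grid = torch.stack([x, y], dim=-1).reshape(b, h, w, 2)
    return F.grid_sample(img, sample_grid, align_corners=True)


def bihome_loss(img_a, img_b, H_ab, H_ba):
    """Bidirectional feature-space distance between warped and target images.

    Gradients flow through the warp into the homography network that
    produced H_ab and H_ba; the feature extractor itself stays fixed.
    """
    f_a, f_b = feature_extractor(img_a), feature_extractor(img_b)
    f_ab = feature_extractor(warp_with_homography(img_a, H_ab))  # a -> b view
    f_ba = feature_extractor(warp_with_homography(img_b, H_ba))  # b -> a view
    return F.l1_loss(f_ab, f_b) + F.l1_loss(f_ba, f_a)
```

Keeping the extractor frozen is what decouples homography estimation from representation learning: the only parameters receiving gradient updates are those of the homography network.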


Datasets


Introduced in the Paper:

PDS-COCO (S-COCO with an added photometric distortion step; see the sketch after this list)

Used in the Paper:

MS COCO
S-COCO
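
PDS-COCO extends the S-COCO generation pipeline with a photometric distortion step so that the two views of each synthetic pair differ in illumination as well as geometry. Below is a minimal sketch of such a step using torchvision transforms; the chosen operations and parameter ranges are illustrative assumptions, not the paper's exact recipe.

```python
import torchvision.transforms as T

# Illustrative photometric distortion; the operations and parameter ranges
# here are assumptions for the sketch, not the paper's exact recipe.
photometric_distortion = T.Compose([
    T.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4, hue=0.1),
    T.GaussianBlur(kernel_size=3, sigma=(0.1, 1.0)),
])

# Applied to one view of each synthetic pair so the two views differ
# photometrically as well as geometrically, e.g.:
#   img_b = photometric_distortion(img_b)
```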

Results from the Paper


Task                   Dataset   Model         Metric  Value  Rank
Homography Estimation  PDS-COCO  PFNet+biHomE  MACE    2.11   #1
Homography Estimation  S-COCO    PFNet+biHomE  MACE    1.79   #2

MACE = Mean Average Corner Error (lower is better).

Methods


No methods listed for this paper.