As-projective-as-possible bias correction for illumination estimation algorithms

Illumination estimation is the key routine in a camera's onboard auto-white-balance (AWB) function. Illumination estimation algorithms estimate the color of the scene's illumination from an image as an R,G,B vector in the sensor's raw-RGB color space. While learning-based methods have demonstrated impressive performance for illumination estimation, cameras still rely on simple statistics-based algorithms that are less accurate but capable of executing quickly on the camera's hardware. An effective strategy to improve the accuracy of these fast statistics-based algorithms is to apply a post-estimate bias correction function that transforms the estimated R,G,B vector so that it lies closer to the correct solution. Recent work by Finlayson (Interface Focus, 2018) showed that a bias correction function can be formulated as a projective transform because the magnitude of the R,G,B illumination vector does not matter to the AWB procedure. This paper builds on that finding and shows that further improvements can be obtained with an as-projective-as-possible (APAP) projective transform that locally adapts the transform to the input R,G,B vector. We demonstrate the effectiveness of the proposed APAP bias correction on several well-known statistical illumination estimation methods. We also describe a fast lookup method that allows the APAP transform to be performed with only a few lookup operations.
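
To make the idea concrete, the sketch below is a minimal illustration (not the authors' implementation) of how a post-estimate projective bias correction might be applied to an estimated illuminant, and how an APAP-style variant could blend several locally fitted projective transforms weighted by how close the input illuminant is to each transform's anchor. The anchor chromaticities, per-anchor transforms, and the Gaussian bandwidth `sigma` are hypothetical placeholders rather than quantities from the paper.

```python
import numpy as np

def apply_projective_correction(rgb_est, H):
    """Apply a 3x3 projective bias correction to an estimated illuminant.
    Only the direction (chromaticity) of the illuminant matters for AWB,
    so the corrected vector is renormalized to unit length."""
    corrected = H @ rgb_est
    return corrected / np.linalg.norm(corrected)

def apap_correction(rgb_est, anchors, local_Hs, sigma=0.25):
    """APAP-style locally adaptive correction (illustrative only): blend
    several local projective transforms, weighting each by the proximity
    of the input illuminant to that transform's anchor chromaticity.
    `anchors` (N x 3) and `local_Hs` (N x 3 x 3) are assumed to come from
    some offline fitting step; they are hypothetical here."""
    u = rgb_est / np.linalg.norm(rgb_est)          # work with the unit-norm estimate
    dists = np.linalg.norm(anchors - u, axis=1)    # distance to each anchor
    w = np.exp(-(dists ** 2) / (2 * sigma ** 2))   # Gaussian proximity weights
    w /= w.sum()
    H_local = np.tensordot(w, local_Hs, axes=1)    # weighted blend of 3x3 transforms
    return apply_projective_correction(u, H_local)

# Example usage with made-up numbers:
rgb_est = np.array([0.8, 1.0, 0.6])                # raw estimate from, e.g., gray-world
anchors = np.random.rand(16, 3)
anchors /= np.linalg.norm(anchors, axis=1, keepdims=True)
local_Hs = np.tile(np.eye(3), (16, 1, 1))          # identity transforms as placeholders
print(apap_correction(rgb_est, anchors, local_Hs))
```

The abstract only states that the APAP transform can be evaluated with a few lookup operations; one plausible realization of that (an assumption, not confirmed by the source) would be to precompute the blended transform over a quantized chromaticity grid so that correction at run time reduces to indexing the table nearest to the input estimate.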
