HardGAN: A Haze-Aware Representation Distillation GAN for Single Image Dehazing

In this paper, we present a Haze-Aware Representation Distillation Generative Adversarial Network, named HardGAN, for single-image dehazing. Unlike previous studies that jointly model the transmission map and global atmospheric light to restore a clear image, we solve this regression problem with a multi-scale neural network embedded with our proposed Haze-Aware Representation Distillation (HARD) layer. Moreover, we use the normalization layer more carefully rather than stacking it directly after the convolution layer as before, so that useful information is not washed away, an issue noted in many image quality enhancement studies. Extensive experiments on several synthetic benchmark datasets, as well as on the NTIRE 2020 real-world images, show that our proposed multi-scale GAN-based network with HARD performs favorably against state-of-the-art methods in terms of PSNR, SSIM, LPIPS, and human subjective evaluation.
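
The abstract does not spell out the internals of the HARD layer, but the stated idea of blending normalization with convolutional features so that useful signal is not washed away can be illustrated with a minimal, hypothetical sketch. The block below is an assumption for illustration only (the class name, gating design, and use of instance normalization are not taken from the paper): a learned per-pixel gate mixes normalized and unnormalized features inside a residual block.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HazeAwareNormBlock(nn.Module):
    """Hypothetical sketch of a haze-aware feature block (not the paper's code).

    Instead of stacking convolution + normalization directly, a learned,
    spatially varying gate blends normalized and unnormalized features so
    that normalization cannot wash away haze-relevant information.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.norm = nn.InstanceNorm2d(channels, affine=True)
        # Per-pixel gate deciding how much of the normalized signal to keep.
        self.gate = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feat = self.conv(x)
        g = torch.sigmoid(self.gate(feat))           # spatial gate in [0, 1]
        mixed = g * self.norm(feat) + (1.0 - g) * feat
        return F.relu(mixed) + x                     # residual connection


if __name__ == "__main__":
    block = HazeAwareNormBlock(channels=64)
    out = block(torch.randn(1, 64, 128, 128))
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```

Such a block could be repeated at several scales of an encoder-decoder generator; the actual HARD layer and multi-scale structure are defined in the paper itself.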
