Residual learning based densely connected deep dilated network for joint deblocking and super resolution

In many practical situations, images are not only down-sampled but also compressed for efficient transmission and storage. JPEG and MPEG-2 compression often introduces blocking artifacts because the data are processed in 8 × 8 blocks. Many existing super-resolution (SR) methods assume that the low-resolution (LR) image is simply a down-sampled version of the high-resolution (HR) image and neglect the degradation caused by compression. This exacerbates artifacts in the SR image and degrades the user experience. To address joint deblocking and SR (DbSR), this paper proposes a novel deep network with dense skip connections and dilated convolutions, named DenseDbSR. Recently, many researchers have proposed deeper networks and achieved improved SR performance. However, training deeper networks is challenging because of vanishing gradients, and simply increasing the depth of the network incurs a cumbersome computational cost. To enlarge the field-of-view (FOV) without increasing the computational cost, dilated convolutions are used: they expand the FOV exponentially and help exploit contextual information efficiently. Moreover, the dense skip connections create short paths through which gradients can be back-propagated efficiently, alleviating the vanishing-gradient problem. Furthermore, the network is relieved of part of its training burden by learning the residual of the SR image instead of the raw image. In extensive experiments, the proposed DenseDbSR network achieved better PSNR and SSIM than the compared state-of-the-art methods.
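The FOV claim above can be illustrated with a minimal sketch (not the paper's code): stacking small dilated convolutions with dilations 1, 2, 4, ... grows the receptive field roughly exponentially in depth while each layer keeps the same number of taps. The 1-D setting, kernel size, and dilation schedule below are illustrative assumptions.

```python
# Illustrative sketch only: 1-D dilated convolution and the receptive
# field of a stack of such layers. Names and parameters are assumptions,
# not the DenseDbSR implementation.

def dilated_conv1d(signal, kernel, dilation):
    """'Valid' 1-D convolution whose taps are spaced `dilation` apart."""
    k = len(kernel)
    span = (k - 1) * dilation  # extent covered by the dilated kernel, minus 1
    return [
        sum(kernel[j] * signal[i + j * dilation] for j in range(k))
        for i in range(len(signal) - span)
    ]

def receptive_field(kernel_size, dilations):
    """FOV of stacked stride-1 dilated conv layers with the given dilations."""
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d
    return rf

# 3-tap kernels with dilations 1, 2, 4: the FOV grows 3 -> 7 -> 15,
# while each layer still uses only 3 weights.
print(receptive_field(3, [1]))        # 3
print(receptive_field(3, [1, 2]))     # 7
print(receptive_field(3, [1, 2, 4]))  # 15
```

With dilation 1 this reduces to an ordinary convolution, so the dilated variant adds FOV at zero extra parameter or multiply cost per layer.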
