Enhance Convolutional Neural Networks with Noise Incentive Block

9 Dec 2020 · Menghan Xia, Yi Wang, Chu Han, Tien-Tsin Wong

As a generic modeling tool, Convolutional Neural Networks (CNNs) have been widely employed in image generation and translation tasks. However, when fed with a flat input, current CNN models may fail to generate vivid results because the convolution kernels are spatially shared. We call this the flatness degradation of CNNs. Such degradation is the greatest obstacle to generating a spatially variant output from a flat input, yet it has barely been discussed in the previous literature. To tackle this problem, we propose a model-agnostic solution, the Noise Incentive Block (NIB), which serves as a generic plug-in for any CNN generation model. The key idea is to break the flat input condition while keeping the original information intact. Specifically, the NIB perturbs the input data symmetrically with a noise map and reassembles the perturbed copies in the feature domain, as driven by the objective function. Extensive experiments show that existing CNN models equipped with the NIB overcome the flatness degradation and generate visually better results with richer details in image generation tasks given flat inputs, e.g. semantic image synthesis, data-hidden image generation, and deep neural dithering.
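The page carries no implementation, so the following PyTorch sketch is only one plausible reading of the description above: the input is perturbed symmetrically with one shared noise map (x + n and x - n), both copies are encoded by shared convolutions, and a learned fusion layer reassembles them in the feature domain. All module names, layer choices, and hyperparameters here are illustrative assumptions, not the authors' code.

    import torch
    import torch.nn as nn

    class NoiseIncentiveBlock(nn.Module):
        """Hypothetical NIB sketch: symmetric noise perturbation plus
        feature-domain fusion. An illustrative reading of the abstract,
        not the paper's implementation."""

        def __init__(self, in_channels: int, feat_channels: int = 64,
                     noise_std: float = 1.0):
            super().__init__()
            self.noise_std = noise_std
            # Shared encoder applied to both perturbed copies of the input.
            self.encode = nn.Sequential(
                nn.Conv2d(in_channels, feat_channels, 3, padding=1),
                nn.ReLU(inplace=True),
            )
            # 1x1 fusion reassembles the two feature maps; training can drive
            # it to cancel the noise and keep the original information intact.
            self.fuse = nn.Conv2d(2 * feat_channels, feat_channels, 1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # One noise map shared by both branches ("symmetric" perturbation).
            noise = (torch.randn_like(x[:, :1]) * self.noise_std).expand_as(x)
            f_pos = self.encode(x + noise)
            f_neg = self.encode(x - noise)
            return self.fuse(torch.cat([f_pos, f_neg], dim=1))

    if __name__ == "__main__":
        torch.manual_seed(0)
        flat = torch.zeros(1, 3, 64, 64)  # a perfectly flat input

        # Flatness degradation: a spatially shared kernel maps a flat input
        # to a flat output, so the per-channel spatial std is exactly zero.
        conv = nn.Conv2d(3, 8, 3)  # no padding: every position sees the same patch
        with torch.no_grad():
            print(conv(flat).std(dim=(2, 3)).max())  # tensor(0.)
            # With the NIB in front, features vary spatially even for a flat input.
            print(NoiseIncentiveBlock(3)(flat).std(dim=(2, 3)).max())  # > 0

One design note on this reading: averaging the two perturbed inputs recovers the original exactly (the noise cancels), which is what would let a learned fusion preserve the input content while the noise breaks spatial uniformity. Concatenating a noise channel instead of adding it symmetrically would be another plausible variant.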
