Recently, revolutionary neural style transfer methods have made it possible to synthesize credible paintings automatically from content images and style images.
The system directly maps a grayscale image, along with sparse, local user "hints", to an output colorization using a Convolutional Neural Network (CNN).
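One common way to realize such a hint-driven system is to stack the grayscale (lightness) channel with sparse chrominance-hint channels and a binary mask before feeding the CNN. The sketch below illustrates this input assembly only; the function name, hint format, and channel layout are assumptions for illustration, not the specific system's API.

```python
import numpy as np

def build_hint_input(gray, hints):
    """Stack the grayscale channel with sparse user-hint channels.

    gray  : (H, W) float array, the lightness channel in [0, 1].
    hints : list of (row, col, a, b) tuples -- user clicks with the
            desired ab chrominance at that pixel (hypothetical format).

    Returns a (4, H, W) array: [L, hint_a, hint_b, hint_mask], the
    kind of input a user-guided colorization CNN could consume.
    """
    h, w = gray.shape
    hint_a = np.zeros((h, w), dtype=np.float32)
    hint_b = np.zeros((h, w), dtype=np.float32)
    mask = np.zeros((h, w), dtype=np.float32)
    for r, c, a, b in hints:
        hint_a[r, c] = a
        hint_b[r, c] = b
        mask[r, c] = 1.0  # marks pixels where the user gave a hint
    return np.stack([gray.astype(np.float32), hint_a, hint_b, mask])

x = build_hint_input(np.random.rand(8, 8), [(2, 3, 0.4, -0.2)])
print(x.shape)  # (4, 8, 8)
```

The mask channel lets the network distinguish "no hint here" from "hint with value zero", which is why it accompanies the sparse ab channels.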
We review some of the most recent approaches to colorizing grayscale images using deep learning methods.
This intermediate output can be used to automatically generate a color image, or further manipulated prior to image formation.
A GAN-based line art colorization approach, called Tag2Pix, is proposed; it takes as input a grayscale line art together with color tag information and produces a high-quality colored image.
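A standard way to inject tag conditioning into such a generator is to pool the embeddings of the active tags, broadcast the result spatially, and concatenate it with the line-art feature map. The snippet below is a minimal sketch of that fusion step under assumed shapes; it is not Tag2Pix's actual architecture, and all names are hypothetical.

```python
import numpy as np

def condition_on_tags(lineart_feat, tag_multi_hot, embed):
    """Fuse color-tag conditioning into line-art features.

    lineart_feat : (C, H, W) feature map from a line-art encoder.
    tag_multi_hot: (T,) multi-hot vector marking the active color tags.
    embed        : (T, D) learned tag-embedding matrix (hypothetical).

    Returns a (C + D, H, W) map: the pooled tag embedding is spatially
    broadcast and concatenated, a common conditioning scheme for
    tag-guided GAN generators.
    """
    d = tag_multi_hot @ embed            # (D,) pooled tag embedding
    _, h, w = lineart_feat.shape
    tag_map = np.broadcast_to(d[:, None, None], (d.shape[0], h, w))
    return np.concatenate([lineart_feat, tag_map], axis=0)

feat = np.random.rand(16, 4, 4)
tags = np.zeros(30)
tags[[3, 7]] = 1.0                       # two active color tags
y = condition_on_tags(feat, tags, np.random.rand(30, 8))
print(y.shape)  # (24, 4, 4)
```

Broadcasting the tag vector over all spatial positions lets every location in the generator see the same global color instruction while local line-art structure varies.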
Over the last decade, the process of automatic image colorization has been of significant interest for several application areas including restoration of aged or degraded images.
We demonstrate these properties for the tasks of MNIST digit generation and image colorization.
By feeding the color features to our deep colorization network, we accomplish colorization of the entire manga using the desired colors for each panel.