295 papers with code • 0 benchmarks • 16 datasets
Style transfer is the task of changing the style of an image in one domain to the style of an image in another domain.
(Image credit: A Neural Algorithm of Artistic Style)
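The credited work, A Neural Algorithm of Artistic Style, represents style as the channel-wise correlations (Gram matrices) of convolutional feature maps and matches them between a generated image and a style image. A minimal NumPy sketch of that style loss follows; the function names and the simple (c, h, w) feature layout are illustrative assumptions, not an exact reproduction of any implementation:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature tensor of shape (channels, height, width):
    channel-wise inner products that summarize texture/style statistics."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)  # normalize by tensor size

def style_loss(gen_features, style_features):
    """Squared Frobenius distance between the two Gram matrices."""
    diff = gram_matrix(gen_features) - gram_matrix(style_features)
    return np.sum(diff ** 2)
```

In the full algorithm this loss is summed over several network layers and combined with a content loss on raw feature activations.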
Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs.
Ranked #1 on Image-to-Image Translation on photo2vangogh (Fréchet Inception Distance metric)
In this paper, we present a method which combines the flexibility of the neural algorithm of artistic style with the speed of fast style transfer networks to allow real-time stylization using any content/style image pair.
Recently, with revolutionary neural style transfer methods, credible paintings can be synthesized automatically from content images and style images.
In this paper we revisit the fast stylization method introduced in Ulyanov et al.
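The key change that revisit proposed was replacing batch normalization with instance normalization, so each image's feature maps are normalized with their own statistics. A small NumPy sketch of the operation, assuming an (N, C, H, W) tensor layout:

```python
import numpy as np

def instance_norm(x, eps=1e-5):
    """Instance normalization: each channel of each image is normalized
    with its own spatial mean and variance, discarding per-image contrast.
    x has shape (batch, channels, height, width)."""
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)
```

Because the normalization statistics are per-instance, they are identical at train and test time, which is part of why the change helps fast stylization networks.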
Inspired by the common painting process of drawing a draft and revising the details, we introduce a novel feed-forward method named Laplacian Pyramid Network (LapStyle).
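A Laplacian pyramid of the kind LapStyle's name refers to decomposes an image into a coarse base plus per-level high-frequency residuals, so a network can stylize a draft at low resolution and then revise details level by level. A minimal NumPy sketch of the decomposition and its exact reconstruction, using 2x2 average pooling and nearest-neighbour upsampling as stand-ins for the usual blur-based resampling (an assumption for brevity, not the paper's exact operators):

```python
import numpy as np

def downsample(img):
    # 2x2 average pooling (stands in for Gaussian blur + subsample)
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # nearest-neighbour upsampling back to double resolution
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    """Decompose img into `levels` high-frequency residuals + a coarse base."""
    pyramid, current = [], img
    for _ in range(levels):
        low = downsample(current)
        pyramid.append(current - upsample(low))  # detail lost by downsampling
        current = low
    pyramid.append(current)  # coarsest level (the "draft")
    return pyramid

def reconstruct(pyramid):
    """Invert the decomposition: upsample the base and add residuals back."""
    img = pyramid[-1]
    for residual in reversed(pyramid[:-1]):
        img = upsample(img) + residual
    return img
```

With these paired operators the reconstruction is exact for power-of-two image sizes, mirroring the draft-then-revise structure the method is built on.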