Style Transfer

644 papers with code • 2 benchmarks • 17 datasets

Style Transfer is a technique in computer vision and graphics that generates a new image by combining the content of one image with the style of another. The goal is to produce a result that preserves the content of the first image while applying the visual style of the second.

(Image credit: A Neural Algorithm of Artistic Style)
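
As a concrete illustration, the snippet below sketches the optimization-based formulation popularized by Gatys et al.: a content loss on deep features plus a style loss on Gram matrices, minimized directly over the pixels. It assumes PyTorch and torchvision's pretrained VGG-19; the layer indices, weights, and optimizer settings are illustrative choices, not taken from any particular repository listed here.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19

# Pretrained VGG-19 used as a fixed feature extractor.
vgg = vgg19(weights="IMAGENET1K_V1").features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYER = 22                  # relu4_2
STYLE_LAYERS = {1, 6, 11, 20, 29}   # relu1_1 .. relu5_1

def features(x):
    """Collect the activations used by the content and style losses."""
    feats = {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i == CONTENT_LAYER or i in STYLE_LAYERS:
            feats[i] = x
    return feats

def gram(f):
    """Gram matrix: channel-wise feature correlations, normalized by size."""
    b, c, h, w = f.shape
    f = f.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_transfer(content, style, steps=300, style_weight=1e6):
    """Optimize a copy of the content image toward the style statistics."""
    with torch.no_grad():
        target_content = features(content)[CONTENT_LAYER]
        target_grams = {i: gram(f) for i, f in features(style).items()
                        if i in STYLE_LAYERS}
    img = content.clone().requires_grad_(True)
    opt = torch.optim.Adam([img], lr=0.02)
    for _ in range(steps):
        opt.zero_grad()
        feats = features(img)
        content_loss = F.mse_loss(feats[CONTENT_LAYER], target_content)
        style_loss = sum(F.mse_loss(gram(feats[i]), target_grams[i])
                         for i in STYLE_LAYERS)
        (content_loss + style_weight * style_loss).backward()
        opt.step()
    return img.detach()
```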

Most implemented papers

A Neural Algorithm of Artistic Style

jcjohnson/neural-style 26 Aug 2015

In fine art, especially painting, humans have mastered the skill to create unique visual experiences through composing a complex interplay between the content and style of an image.

Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks

junyanz/pytorch-CycleGAN-and-pix2pix ICCV 2017

Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be available.
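
For the unpaired setting, the key addition is a cycle-consistency loss: two generators G: X→Y and F: Y→X are trained so that F(G(x)) ≈ x and G(F(y)) ≈ y alongside the usual adversarial terms. A minimal sketch in PyTorch, with toy stand-ins for the generators (the repository itself uses ResNet generators and PatchGAN discriminators):

```python
import torch
import torch.nn as nn

# Toy stand-ins for the two generators; the real models are far deeper.
G_xy = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))  # maps domain X -> Y
G_yx = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1))  # maps domain Y -> X

l1 = nn.L1Loss()

def cycle_consistency_loss(x, y, lam=10.0):
    """L1 penalty enforcing G_yx(G_xy(x)) ~ x and G_xy(G_yx(y)) ~ y."""
    forward_cycle = l1(G_yx(G_xy(x)), x)   # x -> fake y -> reconstructed x
    backward_cycle = l1(G_xy(G_yx(y)), y)  # y -> fake x -> reconstructed y
    return lam * (forward_cycle + backward_cycle)

# Usage with random tensors standing in for unpaired image batches;
# the adversarial losses for the two discriminators are omitted here.
x = torch.randn(2, 3, 64, 64)
y = torch.randn(2, 3, 64, 64)
cycle_consistency_loss(x, y).backward()
```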

Perceptual Losses for Real-Time Style Transfer and Super-Resolution

alexjc/neural-enhance 27 Mar 2016

We consider image transformation problems, where an input image is transformed into an output image. Recent methods for such problems typically train feed-forward convolutional networks with a per-pixel loss between the output and ground-truth images; this work instead trains them with perceptual loss functions based on high-level features extracted from pretrained networks.
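
A perceptual (feature reconstruction) loss compares images in the feature space of a fixed pretrained network rather than pixel by pixel. A minimal sketch, assuming PyTorch and torchvision's VGG-16; the cutoff layer (relu2_2) is an illustrative choice:

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

# Fixed feature extractor, truncated at relu2_2 (index 8); the exact layer
# used for the feature reconstruction term is an assumption here.
extractor = vgg16(weights="IMAGENET1K_V1").features[:9].eval()
for p in extractor.parameters():
    p.requires_grad_(False)

def perceptual_loss(output, target):
    """MSE between deep features rather than raw pixels."""
    return F.mse_loss(extractor(output), extractor(target))

# Usage: compare a transformed image against its ground truth.
out = torch.randn(1, 3, 256, 256, requires_grad=True)
gt = torch.randn(1, 3, 256, 256)
perceptual_loss(out, gt).backward()
```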

Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization

xunhuang1995/AdaIN-style ICCV 2017

Gatys et al. recently introduced a neural algorithm that renders a content image in the style of another image, achieving so-called style transfer.
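
The core operation, adaptive instance normalization (AdaIN), aligns the per-channel mean and standard deviation of the content features with those of the style features. A minimal sketch for (N, C, H, W) feature maps in PyTorch:

```python
import torch

def adain(content_feat, style_feat, eps=1e-5):
    """Adaptive Instance Normalization: normalize the content features,
    then re-scale and re-shift them with the style features' statistics."""
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

# Usage on dummy encoder features.
content = torch.randn(1, 512, 32, 32)
style = torch.randn(1, 512, 32, 32)
stylized = adain(content, style)
```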

Instance Normalization: The Missing Ingredient for Fast Stylization

DmitryUlyanov/texture_nets 27 Jul 2016

In this paper we revisit the fast stylization method introduced in Ulyanov et al. (2016). We show how a small change in the stylization architecture results in a significant qualitative improvement in the generated images.
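
Instance normalization replaces batch statistics with per-sample, per-channel statistics, which in a PyTorch generator amounts to swapping BatchNorm2d for InstanceNorm2d. The sketch below spells out the operation and checks it against the built-in layer:

```python
import torch
import torch.nn as nn

def instance_norm(x, eps=1e-5):
    """Normalize each (sample, channel) feature map by its own mean/variance."""
    mean = x.mean(dim=(2, 3), keepdim=True)
    var = x.var(dim=(2, 3), keepdim=True, unbiased=False)
    return (x - mean) / torch.sqrt(var + eps)

x = torch.randn(4, 64, 32, 32)
# Matches the built-in layer (affine=False) up to numerical precision.
assert torch.allclose(instance_norm(x), nn.InstanceNorm2d(64)(x), atol=1e-5)
```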

Deep Photo Style Transfer

luanfujun/deep-photo-styletransfer CVPR 2017

This paper introduces a deep-learning approach to photographic style transfer that handles a large variety of image content while faithfully transferring the reference style.

Exploring the structure of a real-time, arbitrary neural artistic stylization network

magenta/magenta 18 May 2017

In this paper, we present a method which combines the flexibility of the neural algorithm of artistic style with the speed of fast style transfer networks to allow real-time stylization using any content/style image pair.
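
One way this combination is realized is with a style prediction network that maps the style image to an embedding, which then supplies the scale and shift parameters of conditional instance normalization layers inside a feed-forward stylization network. A hedged sketch of such a conditional layer in PyTorch; the embedding dimension and the placeholder predictor are assumptions, not the repository's values:

```python
import torch
import torch.nn as nn

class ConditionalInstanceNorm(nn.Module):
    """Instance norm whose scale/shift are predicted from a style embedding."""
    def __init__(self, channels, style_dim):
        super().__init__()
        self.norm = nn.InstanceNorm2d(channels, affine=False)
        self.to_params = nn.Linear(style_dim, channels * 2)

    def forward(self, x, style_embedding):
        gamma, beta = self.to_params(style_embedding).chunk(2, dim=1)
        return gamma[..., None, None] * self.norm(x) + beta[..., None, None]

# Usage: a style prediction network (not shown) would map the style image to
# this embedding; here a random vector of assumed dimension 100 stands in.
style_embedding = torch.randn(1, 100)
features = torch.randn(1, 64, 128, 128)
layer = ConditionalInstanceNorm(64, style_dim=100)
out = layer(features, style_embedding)
```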

Universal Style Transfer via Feature Transforms

Yijunmaverick/UniversalStyleTransfer NeurIPS 2017

The whitening and coloring transforms reflect a direct matching of feature covariance of the content image to a given style image, which shares similar spirits with the optimization of Gram matrix based cost in neural style transfer.
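
A hedged sketch of one whitening-and-coloring transform (WCT) step in PyTorch: the content features are whitened to strip their own covariance and then colored with the style covariance. The eigendecomposition route via torch.linalg.eigh and the epsilon regularization are implementation assumptions:

```python
import torch

def wct(content_feat, style_feat, eps=1e-5):
    """Whitening-coloring transform on (C, H, W) feature maps."""
    c, h, w = content_feat.shape
    cf = content_feat.reshape(c, -1)
    sf = style_feat.reshape(c, -1)

    # Center both feature sets.
    c_mean = cf.mean(dim=1, keepdim=True)
    s_mean = sf.mean(dim=1, keepdim=True)
    cf = cf - c_mean
    sf = sf - s_mean

    # Whitening: remove the content covariance.
    c_cov = cf @ cf.t() / (cf.shape[1] - 1) + eps * torch.eye(c)
    evals_c, evecs_c = torch.linalg.eigh(c_cov)
    whiten = evecs_c @ torch.diag(evals_c.clamp(min=eps).rsqrt()) @ evecs_c.t()

    # Coloring: impose the style covariance, then add back the style mean.
    s_cov = sf @ sf.t() / (sf.shape[1] - 1) + eps * torch.eye(c)
    evals_s, evecs_s = torch.linalg.eigh(s_cov)
    color = evecs_s @ torch.diag(evals_s.clamp(min=eps).sqrt()) @ evecs_s.t()

    out = color @ (whiten @ cf) + s_mean
    return out.reshape(c, h, w)

# Usage on dummy encoder features.
stylized = wct(torch.randn(256, 32, 32), torch.randn(256, 32, 32))
```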

Deep Feature Consistent Variational Autoencoder

AntixK/PyTorch-VAE 2 Oct 2016

We present a novel method for constructing a Variational Autoencoder (VAE).

Style Transfer from Non-Parallel Text by Cross-Alignment

shentianxiao/language-style-transfer NeurIPS 2017

We demonstrate the effectiveness of this cross-alignment method on three tasks: sentiment modification, decipherment of word substitution ciphers, and recovery of word order.