Learning from Multi-domain Artistic Images for Arbitrary Style Transfer

We propose a fast feed-forward network for arbitrary style transfer, which can generate stylized images for previously unseen content and style image pairs. Besides the traditional content and style representations based on deep features and texture statistics, we use adversarial networks to regularize the generation of stylized images...
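The "deep features and statistics for textures" objective mentioned above is commonly formulated as a content loss on feature maps plus a Gram-matrix style loss. A minimal NumPy sketch of that standard formulation is below; the feature maps, loss weights, and the omitted adversarial term are illustrative placeholders, not the paper's exact model:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (C, H, W) feature map: channel-wise correlations."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(gen_feats, content_feats):
    """MSE between deep features of the generated and content images."""
    return np.mean((gen_feats - content_feats) ** 2)

def style_loss(gen_feats, style_feats):
    """MSE between Gram matrices of the generated and style features."""
    return np.mean((gram_matrix(gen_feats) - gram_matrix(style_feats)) ** 2)

def total_loss(gen_feats, content_feats, style_feats, alpha=1.0, beta=10.0):
    # Weighted sum of content and style terms; the paper additionally
    # regularizes generation with an adversarial (discriminator) loss,
    # which is omitted in this sketch.
    return (alpha * content_loss(gen_feats, content_feats)
            + beta * style_loss(gen_feats, style_feats))
```

In practice the feature maps would come from a pretrained encoder (e.g. VGG activations), and the loss would be backpropagated through the feed-forward stylization network.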
