Video Style Transfer

6 papers with code • 0 benchmarks • 0 datasets

Video style transfer applies the visual style of a reference image (or artist) to every frame of a video while keeping the result temporally consistent, so that the stylization does not flicker between frames.

Most implemented papers

A Style-Aware Content Loss for Real-time HD Style Transfer

CompVis/adaptive-style-transfer ECCV 2018

These and our qualitative results ranging from small image patches to megapixel stylistic images and videos show that our approach better captures the subtle manner in which a style affects content.
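
The core idea named in the title is to measure content preservation in the latent space of a jointly trained, style-specific encoder rather than in fixed pretrained features. A minimal sketch of such a loss, assuming a hypothetical encoder module; names and shapes are illustrative, not the authors' implementation:

```python
import torch
import torch.nn as nn

class StyleAwareContentLoss(nn.Module):
    """Content loss measured in the latent space of a style-specific
    encoder trained jointly with the decoder (hypothetical sketch)."""

    def __init__(self, encoder: nn.Module):
        super().__init__()
        self.encoder = encoder  # learned, style-aware encoder (assumption)

    def forward(self, content_img, stylized_img):
        # Compare latent codes instead of raw pixels or fixed VGG features.
        z_content = self.encoder(content_img)
        z_stylized = self.encoder(stylized_img)
        return torch.mean((z_content - z_stylized) ** 2)

# Usage with a toy encoder (illustrative only):
encoder = nn.Sequential(nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU())
loss_fn = StyleAwareContentLoss(encoder)
x = torch.rand(1, 3, 64, 64)
y = torch.rand(1, 3, 64, 64)
print(loss_fn(x, y).item())
```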

ReCoNet: Real-time Coherent Video Style Transfer Network

safwankdb/ReCoNet-PyTorch 3 Jul 2018

Image style transfer models based on convolutional neural networks usually suffer from high temporal inconsistency when applied to videos.
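
Temporal inconsistency is commonly quantified by warping the previous stylized frame to the current one with optical flow and penalizing differences in non-occluded regions. A minimal sketch of such a temporal loss, assuming precomputed flow and an occlusion mask (a generic formulation, not ReCoNet's exact loss):

```python
import torch
import torch.nn.functional as F

def warp(frame, flow):
    """Backward-warp a frame (N, C, H, W) with optical flow (N, 2, H, W)
    given in pixel units, using grid_sample."""
    n, _, h, w = frame.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=frame.dtype),
        torch.arange(w, dtype=frame.dtype),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).unsqueeze(0) + flow  # absolute positions
    # Normalize to [-1, 1]; grid_sample expects a (N, H, W, 2) sampling grid.
    grid_x = 2.0 * grid[:, 0] / (w - 1) - 1.0
    grid_y = 2.0 * grid[:, 1] / (h - 1) - 1.0
    grid = torch.stack((grid_x, grid_y), dim=-1)
    return F.grid_sample(frame, grid, align_corners=True)

def temporal_loss(stylized_t, stylized_prev, flow, occlusion_mask):
    """Penalize changes between the current stylized frame and the
    flow-warped previous stylized frame in non-occluded regions."""
    warped_prev = warp(stylized_prev, flow)
    return torch.mean(occlusion_mask * (stylized_t - warped_prev) ** 2)
```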

AdaAttN: Revisit Attention Mechanism in Arbitrary Neural Style Transfer

huage001/adaattn ICCV 2021

Finally, the content features are normalized so that they exhibit the same local feature statistics as the calculated per-point weighted style feature statistics.
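
This describes an adaptive normalization: content features are instance-normalized and then re-scaled and shifted by attention-weighted, spatially varying style statistics. A simplified sketch of that step, assuming the attention weights are given (not the official AdaAttN module):

```python
import torch

def adaattn_like_transform(content_feat, style_feat, attn, eps=1e-8):
    """Sketch of the normalization step described above (illustrative only).
    content_feat: (N, C, Hc, Wc); style_feat flattened to (N, HsWs, C);
    attn: (N, HcWc, HsWs) with rows summing to 1."""
    n, c, h, w = content_feat.shape
    # Per-point weighted style statistics at every content location.
    mean = torch.bmm(attn, style_feat)                 # (N, HcWc, C)
    sq_mean = torch.bmm(attn, style_feat ** 2)         # weighted E[x^2]
    std = torch.sqrt(torch.clamp(sq_mean - mean ** 2, min=eps))
    mean = mean.transpose(1, 2).reshape(n, c, h, w)
    std = std.transpose(1, 2).reshape(n, c, h, w)
    # Instance-normalize the content feature, then match the style statistics.
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + 1e-5
    normalized = (content_feat - c_mean) / c_std
    return normalized * std + mean
```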

Layered Neural Atlases for Consistent Video Editing

ykasten/layered-neural-atlases 23 Sep 2021

We present a method that decomposes, or "unwraps", an input video into a set of layered 2D atlases, each providing a unified representation of the appearance of an object (or background) over the video.
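
Conceptually, each video pixel (x, y, t) is mapped to a coordinate in a per-layer 2D atlas, the atlas is sampled for color, and the layers are composited with per-pixel opacities, so an edit made on an atlas propagates to all frames. A minimal sketch of this decomposition with hypothetical coordinate-based MLPs (heavily simplified relative to the paper):

```python
import torch
import torch.nn as nn

def mlp(in_dim, out_dim, hidden=64):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )

class LayeredAtlasModel(nn.Module):
    """Two-layer (foreground/background) sketch: map (x, y, t) to per-layer
    atlas coordinates, look up atlas colors, and alpha-composite."""

    def __init__(self):
        super().__init__()
        self.map_fg = mlp(3, 2)    # (x, y, t) -> atlas UV for the foreground
        self.map_bg = mlp(3, 2)    # (x, y, t) -> atlas UV for the background
        self.alpha = mlp(3, 1)     # (x, y, t) -> foreground opacity
        self.atlas_fg = mlp(2, 3)  # atlas UV -> RGB (an editable "image")
        self.atlas_bg = mlp(2, 3)

    def forward(self, xyt):
        a = torch.sigmoid(self.alpha(xyt))
        rgb_fg = torch.sigmoid(self.atlas_fg(self.map_fg(xyt)))
        rgb_bg = torch.sigmoid(self.atlas_bg(self.map_bg(xyt)))
        return a * rgb_fg + (1 - a) * rgb_bg

# Reconstruct colors for a batch of (x, y, t) samples (illustrative only).
model = LayeredAtlasModel()
print(model(torch.rand(8, 3)).shape)  # torch.Size([8, 3])
```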

Creative Flow+ Dataset

creativefloworg/creativeflow CVPR 2019

We present the Creative Flow+ Dataset, the first diverse multi-style artistic video dataset richly labeled with per-pixel optical flow, occlusions, correspondences, segmentation labels, normals, and depth.
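
The per-pixel annotations listed above can be thought of as a set of aligned arrays per frame. A hypothetical container illustrating the modalities; field names and shapes are assumptions for an H x W frame, not the dataset's actual file format:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CreativeFlowFrameAnnotations:
    """Per-frame, per-pixel labels as described in the dataset paper."""
    optical_flow: np.ndarray     # (H, W, 2) forward flow in pixels
    occlusions: np.ndarray       # (H, W) boolean occlusion mask
    correspondences: np.ndarray  # (H, W, 2) cross-frame correspondences
    segmentation: np.ndarray     # (H, W) integer segmentation labels
    normals: np.ndarray          # (H, W, 3) surface normals
    depth: np.ndarray            # (H, W) depth values
```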

Consistent Video Style Transfer via Relaxation and Regularization

daooshee/ReReVST-Code 23 Sep 2020

In this article, we address the problem by jointly considering the intrinsic properties of stylization and temporal consistency.