Search Results for author: You Xie

Found 7 papers, 4 papers with code

TemporalUV: Capturing Loose Clothing with Temporally Coherent UV Coordinates

no code implementations • CVPR 2022 • You Xie, Huiqi Mao, Angela Yao, Nils Thuerey

We propose a novel approach to generate temporally coherent UV coordinates for loose clothing.

Reviving Autoencoder Pretraining

no code implementations • 1 Jan 2021 • You Xie, Nils Thuerey

The pressing need for pretraining algorithms has been diminished by numerous advances in terms of regularization, architectures, and optimizers.

Data-driven Regularization via Racecar Training for Generalizing Neural Networks

1 code implementation • 30 Jun 2020 • You Xie, Nils Thuerey

We propose a novel training approach for improving the generalization in neural networks.

Learning General and Reusable Features via Racecar-Training

no code implementations • 25 Sep 2019 • You Xie, Nils Thuerey

We propose a novel training approach for improving the learning of generalizing features in neural networks.

A Multi-Pass GAN for Fluid Flow Super-Resolution

1 code implementation • 4 Jun 2019 • Maximilian Werhahn, You Xie, Mengyu Chu, Nils Thuerey

We propose a novel method to up-sample volumetric functions with generative neural networks using several orthogonal passes.

Tasks: Super-Resolution
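As a toy illustration of the orthogonal-pass idea named in the summary above, here is a minimal sketch in PyTorch. The helpers make_sr_net and multi_pass_upsample are hypothetical, and the small 2D networks are untrained stand-ins for the paper's GAN generators; only the data flow (upsampling a volume axis by axis with 2D networks) reflects the stated technique.

```python
import torch
import torch.nn as nn

def make_sr_net(scale):
    # Toy 2D super-resolution network; a stand-in for a trained generator.
    return nn.Sequential(
        nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 3, padding=1),
    )

def multi_pass_upsample(volume, net_xy, net_z):
    """Upsample a (D, H, W) volume 2x per axis via two orthogonal 2D passes."""
    # Pass 1: treat the volume as a batch of xy-slices, upsample H and W.
    x = net_xy(volume.unsqueeze(1)).squeeze(1)   # (D, 2H, 2W)
    # Pass 2: re-slice along W, upsample the remaining depth axis only.
    x = x.permute(2, 0, 1).unsqueeze(1)          # (2W, 1, D, 2H)
    x = net_z(x).squeeze(1)                      # (2W, 2D, 2H)
    return x.permute(1, 2, 0)                    # (2D, 2H, 2W)

net_xy = make_sr_net(2)           # upsamples both in-plane axes
net_z = make_sr_net((2, 1))       # upsamples the depth axis only
low_res = torch.rand(16, 16, 16)  # toy density volume
with torch.no_grad():
    high_res = multi_pass_upsample(low_res, net_xy, net_z)
print(high_res.shape)             # torch.Size([32, 32, 32])
```

Working on 2D slices keeps memory and training cost far below a full 3D network, which is the motivation the title's "multi-pass" design suggests.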

Learning Temporal Coherence via Self-Supervision for GAN-based Video Generation

13 code implementations • 23 Nov 2018 • Mengyu Chu, You Xie, Jonas Mayer, Laura Leal-Taixé, Nils Thuerey

Additionally, we propose a first set of metrics to quantitatively evaluate the accuracy as well as the perceptual quality of the temporal evolution.

Tasks: Image Super-Resolution, Motion Compensation, +3
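The snippet above only names the metrics; as a hedged sketch of the general idea, the hypothetical NumPy function temporal_lp_gap below compares frame-to-frame change in a generated video against that of a reference video. The paper's actual temporal metrics build on optical-flow warping and learned perceptual distances, which this toy version replaces with a plain pixel-wise L^p difference.

```python
import numpy as np

def temporal_lp_gap(gen_frames, ref_frames, p=1):
    """Toy temporal-coherence score: mean gap between the generated
    video's frame-to-frame change and the reference's. A video that
    flickers (or over-smooths) relative to the reference scores worse."""
    gaps = []
    for t in range(1, len(gen_frames)):
        d_gen = np.mean(np.abs(gen_frames[t] - gen_frames[t - 1]) ** p)
        d_ref = np.mean(np.abs(ref_frames[t] - ref_frames[t - 1]) ** p)
        gaps.append(abs(d_gen - d_ref))
    return float(np.mean(gaps))

# Toy usage: two random "videos" of 8 grayscale frames each.
rng = np.random.default_rng(0)
gen = rng.random((8, 64, 64))
ref = rng.random((8, 64, 64))
print(temporal_lp_gap(gen, ref))
```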

tempoGAN: A Temporally Coherent, Volumetric GAN for Super-resolution Fluid Flow

1 code implementation • 29 Jan 2018 • You Xie, Erik Franz, Mengyu Chu, Nils Thuerey

We propose a temporally coherent generative model addressing the super-resolution problem for fluid flows.

Tasks: Data Augmentation, Super-Resolution
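To illustrate one ingredient the title points at, temporal coherence enforced adversarially, here is a minimal PyTorch sketch of a temporal discriminator that scores triplets of consecutive frames stacked as channels, so a generator it trains against is penalized for flickering. The class name and layer sizes are hypothetical, the sketch is 2D rather than volumetric, and the paper's additional steps (conditioning on physical inputs, advecting frames with the velocity field before stacking) are omitted.

```python
import torch
import torch.nn as nn

class TemporalDiscriminator(nn.Module):
    """Toy discriminator over three consecutive frames (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, 1),  # real/fake score for the whole triplet
        )

    def forward(self, f_prev, f_curr, f_next):
        # Stack the three consecutive frames along the channel axis, so the
        # network can only score the triplet by looking at its evolution.
        return self.net(torch.cat([f_prev, f_curr, f_next], dim=1))

d_t = TemporalDiscriminator()
frames = [torch.rand(4, 1, 64, 64) for _ in range(3)]  # toy batch of triplets
print(d_t(*frames).shape)  # torch.Size([4, 1])
```

Because the discriminator sees sequences rather than single frames, it can reject outputs that look sharp per frame but jitter over time, which is what a purely spatial discriminator cannot detect.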
