Search Results for author: Zhichun Huang

Found 8 papers, 2 papers with code

Variational Sampling of Temporal Trajectories

no code implementations 18 Mar 2024 Jurijs Nazarovs, Zhichun Huang, Xingjian Zhen, Sourav Pal, Rudrasis Chakraborty, Vikas Singh

In this work, we introduce a mechanism to learn the distribution of trajectories by parameterizing the transition function $f$ explicitly as an element in a function space.

Out-of-Distribution Detection
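
The abstract treats the ODE transition function $f$ as a random element of a function space. As a loose illustration of that idea (not the paper's implementation; the one-hidden-layer form, Gaussian weight posterior, and Euler integrator are all assumptions), the sketch below draws the weights of $f$ from a learned posterior and integrates each draw into one trajectory sample:

```python
# Illustrative sketch only: sample a transition function f from a learned
# Gaussian posterior over its weights, then Euler-integrate x' = f(x).
import torch
import torch.nn as nn

class VariationalTransition(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        n_w = dim * hidden + hidden * dim
        # Variational parameters over the weights of a one-hidden-layer f.
        self.mu = nn.Parameter(torch.zeros(n_w))
        self.log_sigma = nn.Parameter(torch.full((n_w,), -3.0))
        self.dim, self.hidden = dim, hidden

    def sample_f(self):
        # Reparameterized draw of a single transition function.
        w = self.mu + self.log_sigma.exp() * torch.randn_like(self.mu)
        w1 = w[: self.dim * self.hidden].view(self.hidden, self.dim)
        w2 = w[self.dim * self.hidden:].view(self.dim, self.hidden)
        return lambda x: torch.tanh(x @ w1.T) @ w2.T

def sample_trajectory(model, x0, steps=100, dt=0.01):
    # Each call samples one f, hence one trajectory from the distribution.
    f = model.sample_f()
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * f(xs[-1]))
    return torch.stack(xs)
```

Sampling many trajectories then amounts to repeated calls with fresh draws of $f$, which is what makes the distribution over trajectories explicit.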

Image2Gif: Generating Continuous Realistic Animations with Warping NODEs

1 code implementation 9 May 2022 Jurijs Nazarovs, Zhichun Huang

Generating smooth animations from a limited number of sequential observations has a number of applications in vision.

Generative Adversarial Network, Video Frame Interpolation
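
Warping-based animation methods build on one primitive: resampling a frame along a dense flow field. The sketch below shows that primitive with torch.nn.functional.grid_sample; it is a generic illustration, not the paper's code, and the pixel-displacement convention for the flow is an assumption:

```python
# Generic backward-warping primitive: resample `frame` at locations shifted
# by `flow` (per-pixel displacements in pixels), via bilinear interpolation.
import torch
import torch.nn.functional as F

def warp(frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """frame: (B, C, H, W); flow: (B, 2, H, W) pixel displacements."""
    B, _, H, W = frame.shape
    # Base sampling grid in grid_sample's normalized [-1, 1] coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, H, device=frame.device),
        torch.linspace(-1, 1, W, device=frame.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=-1).expand(B, H, W, 2)
    # Convert pixel displacements to normalized offsets, then sample.
    offset = torch.stack(
        (flow[:, 0] * 2 / (W - 1), flow[:, 1] * 2 / (H - 1)), dim=-1
    )
    return F.grid_sample(frame, base + offset, align_corners=True)
```

In a NODE-style animation, a network would predict the flow as a function of continuous time, and the warp could then be evaluated at any intermediate timestamp.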

Understanding Uncertainty Maps in Vision With Statistical Testing

no code implementations CVPR 2022 Jurijs Nazarovs, Zhichun Huang, Songwong Tasneeyapant, Rudrasis Chakraborty, Vikas Singh

Quantitative descriptions of confidence intervals and uncertainties of the predictions of a model are needed in many applications in vision and machine learning.
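
As a rough point of reference for how per-pixel uncertainty maps are commonly produced (generic Monte Carlo dropout; the paper's statistical-testing machinery is not reproduced here):

```python
# Generic MC-dropout sketch: repeated stochastic forward passes give a
# per-pixel predictive mean and a standard-deviation "uncertainty map".
import torch

def mc_uncertainty(model, x, n_samples=30):
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)  # std(0) is the uncertainty map
```

A statistical-testing view, as the abstract suggests, would then ask which regions of such a map are significantly uncertain rather than reading the map off by eye.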

Forward Operator Estimation in Generative Models with Kernel Transfer Operators

no code implementations 1 Dec 2021 Zhichun Huang, Rudrasis Chakraborty, Vikas Singh

Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g., Gaussian, to the unknown input distribution.
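
As a loose stand-in for estimating a forward operator between a known distribution and data, the sketch below fits a regularized kernel regression from Gaussian samples to data samples. This conveys the flavor of kernel operator estimation only; it is not the paper's transfer-operator estimator, and the paired-sample setup is an assumption:

```python
# Illustrative kernel ridge "forward map" from source samples z ~ N(0, I)
# to target samples x, expressed in the RKHS spanned by the z samples.
import numpy as np

def rbf_gram(a, b, gamma=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_forward_map(z, x, lam=1e-3):
    """z, x: (n, d) sample arrays. Returns coefficients C so that new
    source points map to rbf_gram(z_new, z) @ C."""
    K = rbf_gram(z, z)
    # Regularized least squares: (K + lam*I) C = x.
    return np.linalg.solve(K + lam * np.eye(len(z)), x)

def push_forward(z_new, z, coeff, gamma=1.0):
    return rbf_gram(z_new, z, gamma) @ coeff
```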

$(\textrm{Implicit})^2$: Implicit Layers for Implicit Representations

no code implementations NeurIPS 2021 Zhichun Huang, Shaojie Bai, J. Zico Kolter

Recent research in deep learning has investigated two very different forms of "implicitness". Implicit representations model high-frequency data such as images or 3D shapes directly via a low-dimensional neural network (often using, e.g., sinusoidal bases or nonlinearities). Implicit layers, in contrast, refer to techniques where the forward pass of a network is computed via non-linear dynamical systems, such as fixed-point or differential equation solutions, with the backward pass computed via the implicit function theorem.
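
A minimal sketch of the implicit-layer half of this picture, in the style of deep equilibrium models (illustrative, not the paper's architecture): the forward pass iterates to a fixed point without storing intermediate steps, and the backward pass applies the implicit function theorem, approximated here with a Neumann series:

```python
# Fixed-point layer: forward solves z = f(z, x); backward differentiates
# through the fixed point (implicit function theorem) instead of unrolling.
import torch
import torch.nn as nn

class FixedPointLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def f(self, z, x):
        return torch.tanh(self.lin(z) + x)

    def forward(self, x, iters=50):
        # Forward pass: plain iteration, no autograd graph kept.
        with torch.no_grad():
            z = torch.zeros_like(x)
            for _ in range(iters):
                z = self.f(z, x)
        # Re-engage autograd with one step at the fixed point.
        z = self.f(z, x)
        # Backward pass: approximate (I - J^T)^{-1} grad with a Neumann
        # series, where J = df/dz at the fixed point.
        z0 = z.detach().requires_grad_()
        f0 = self.f(z0, x)

        def hook(grad):
            u = grad
            for _ in range(iters):
                u = torch.autograd.grad(f0, z0, u, retain_graph=True)[0] + grad
            return u

        z.register_hook(hook)
        return z
```

The appeal for implicit representations, per the abstract, is pairing this constant-memory layer with low-dimensional coordinate networks.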

Distribution Matching in Deep Generative Models with Kernel Transfer Operators

no code implementations 29 Sep 2021 Zhichun Huang, Rudrasis Chakraborty, Vikas Singh

Generative models which use explicit density modeling (e.g., variational autoencoders, flow-based generative models) involve finding a mapping from a known distribution, e.g., Gaussian, to the unknown input distribution.

Can Kernel Transfer Operators Help Flow based Generative Models?

no code implementations 1 Jan 2021 Zhichun Huang, Rudrasis Chakraborty, Xingjian Zhen, Vikas Singh

Flow-based generative models are deep generative models with tractable likelihoods and offer several attractive properties, including efficient density estimation and sampling.

Density Estimation
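
The tractable likelihood the abstract refers to comes from the change-of-variables formula: for an invertible $f$ with $z = f(x)$, $\log p(x) = \log p_Z(f(x)) + \log \lvert \det \partial f / \partial x \rvert$. A single-layer illustration (an elementwise affine flow; a sketch only, not one of the paper's models):

```python
# Minimal one-layer flow: invertible elementwise affine map with an exact
# log-likelihood via the change-of-variables formula.
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def log_prob(self, x):
        z = (x - self.shift) * torch.exp(-self.log_scale)
        base = torch.distributions.Normal(0.0, 1.0)
        # Elementwise Jacobian of x -> z is diag(exp(-log_scale)).
        return base.log_prob(z).sum(-1) - self.log_scale.sum()

    def sample(self, n):
        z = torch.randn(n, self.shift.numel())
        return z * torch.exp(self.log_scale) + self.shift
```

Real flows stack many such invertible layers (couplings, invertible 1x1 convolutions, etc.), but the likelihood bookkeeping is the same.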
