no code implementations • 19 Sep 2022 • Zhisheng Xiao, Tian Han
Instead, we propose to use noise contrastive estimation (NCE) to discriminatively learn the EBM through density ratio estimation between the latent prior density and latent posterior density.
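The core of NCE here is that a classifier trained to distinguish samples from two densities recovers their log density ratio as its logit. A minimal numeric sketch of that idea, with two 1-D Gaussians standing in for the latent prior and latent posterior (the quadratic-feature logistic regression and all names are illustrative, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: "prior" p(z) = N(0, 1), "posterior" q(z) = N(1, 0.5^2).
z_prior = rng.normal(0.0, 1.0, size=5000)
z_post = rng.normal(1.0, 0.5, size=5000)

# NCE trains a classifier to tell the two sample sets apart; the logit of
# the optimal classifier equals the log density ratio log q(z)/p(z).
# Quadratic features suffice because both densities are Gaussian here.
def feats(z):
    return np.stack([np.ones_like(z), z, z ** 2], axis=1)

X = np.concatenate([feats(z_post), feats(z_prior)])
y = np.concatenate([np.ones(len(z_post)), np.zeros(len(z_prior))])

w = np.zeros(3)
for _ in range(5000):  # plain gradient ascent on the average log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - p) / len(y)

def log_ratio(z):
    """Estimated log q(z)/p(z), read off from the learned logit."""
    return feats(np.atleast_1d(z)) @ w

def true_log_ratio(z):
    """Analytic log N(z; 1, 0.25) - log N(z; 0, 1) for comparison."""
    return (-0.5 * (z - 1.0) ** 2 / 0.25 - np.log(0.5)) - (-0.5 * z ** 2)
```

In the paper the EBM correction on the latent space plays the role of this logit; the toy above only shows why discriminating prior from posterior samples estimates the ratio the EBM needs.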
5 code implementations • ICLR 2022 • Zhisheng Xiao, Karsten Kreis, Arash Vahdat
To the best of our knowledge, denoising diffusion GAN is the first model that reduces sampling cost in diffusion models to an extent that allows them to be applied to real-world applications inexpensively.
Ranked #8 on Image Generation on CelebA-HQ 256x256
no code implementations • 19 May 2021 • Zhisheng Xiao, Qing Yan, Yali Amit
Unsupervised outlier detection, which predicts whether a test sample is an outlier using only information from unlabelled inlier data, is an important but challenging task.
no code implementations • ICLR Workshop EBM 2021 • Zhisheng Xiao, Qing Yan, Yali Amit
Doing so allows us to study the density induced by the dynamics (when the dynamics are invertible) and to connect with GANs by treating the dynamics as a generator, the initial values as latent variables, and the loss as optimizing a critic defined by the very same energy that determines the generator through its gradient.
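A minimal numeric sketch of this viewpoint: finite-step Langevin dynamics driven by an energy gradient map initial noise (the latent variable) to samples, so the dynamics act as a generator. The quadratic energy, step size, and step count below are all illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative energy E(x) = 0.5 * (x - 3)^2, so exp(-E) is N(3, 1).
def grad_E(x):
    return x - 3.0

def generator(z, n_steps=100, step=0.1):
    """Finite-step Langevin dynamics viewed as a generator g(z):
    the initial noise z plays the role of a latent variable."""
    x = z.copy()
    for _ in range(n_steps):
        noise = rng.normal(size=x.shape)
        x = x - 0.5 * step * grad_E(x) + np.sqrt(step) * noise
    return x

z = rng.normal(size=10000)   # latent initial values
samples = generator(z)       # dynamics output, distributed near exp(-E)
```

After enough steps the samples approximately follow the density exp(-E), which is what lets one treat the same energy as both the critic and, through its gradient, the generator.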
4 code implementations • 31 Oct 2020 • Huajie Shao, Zhisheng Xiao, Shuochao Yao, Aston Zhang, Shengzhong Liu, Tarek Abdelzaher
ControlVAE is a new variational autoencoder (VAE) framework that combines automatic control theory with the basic VAE to stabilize the KL-divergence of VAE models at a specified value.
1 code implementation • ICLR 2021 • Zhisheng Xiao, Karsten Kreis, Jan Kautz, Arash Vahdat
VAEBM captures the overall mode structure of the data distribution using a state-of-the-art VAE, and relies on its EBM component to explicitly exclude non-data-like regions from the model and to refine the image samples.
Ranked #1 on Image Generation on Stacked MNIST
no code implementations • 15 Jun 2020 • Zhisheng Xiao, Qing Yan, Yali Amit
In this paper, we present a general method that can improve the sample quality of pre-trained likelihood based generative models.
3 code implementations • NeurIPS 2020 • Zhisheng Xiao, Qing Yan, Yali Amit
An important application of generative modeling should be the ability to detect out-of-distribution (OOD) samples by setting a threshold on the likelihood.
Out-of-Distribution Detection
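The likelihood-thresholding recipe the sentence describes is simple to state in code. Below is that naive baseline with a fitted Gaussian standing in for a deep generative model (the paper's point is precisely that this recipe can fail for deep models, which motivates its improved OOD score); all names and the 1%-quantile threshold are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Inlier data: a 2-D Gaussian blob, with a fitted Gaussian standing in
# for a trained likelihood-based generative model.
inliers = rng.normal(0.0, 1.0, size=(5000, 2))
mu, cov = inliers.mean(axis=0), np.cov(inliers, rowvar=False)
cov_inv, logdet = np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

def log_likelihood(x):
    """Log-density of the fitted 2-D Gaussian at each row of x."""
    d = x - mu
    return -0.5 * (np.sum(d @ cov_inv * d, axis=1) + logdet + 2.0 * np.log(2.0 * np.pi))

# Threshold at the 1st percentile of inlier likelihoods: anything below
# it is declared out-of-distribution.
tau = np.percentile(log_likelihood(inliers), 1.0)

def is_ood(x):
    return log_likelihood(np.atleast_2d(x)) < tau
```

A point far from the inlier blob falls below the threshold and is flagged; a point near the blob's center is not. For deep generative models, the paper shows this ordering can break down, e.g. OOD inputs receiving higher likelihood than inliers.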
no code implementations • 5 Nov 2019 • Zhisheng Xiao, Qing Yan, Yali Amit
In particular, we use our proposed method to analyze inverse problems with invertible neural networks by maximizing the posterior likelihood.
no code implementations • 10 Oct 2019 • Peijun Xiao, Zhisheng Xiao, Ruoyu Sun
Recently, Coordinate Descent (CD) with cyclic order was shown to be $O(n^2)$ times slower than randomized versions in the worst case.
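The two orderings being compared differ only in which coordinate gets the exact 1-D minimization at each step. A toy implementation of both on a strongly convex quadratic (the matrix, sizes, and epoch count are arbitrary; on a well-conditioned problem like this both orders converge quickly, and the $O(n^2)$ gap only appears on worst-case instances):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)       # symmetric positive definite
b = rng.normal(size=n)
x_star = np.linalg.solve(A, b)    # minimizer of f(x) = 0.5 x'Ax - b'x

def cd(order_fn, n_epochs=200):
    """Coordinate descent with exact 1-D minimization per coordinate."""
    x = np.zeros(n)
    for _ in range(n_epochs):
        for i in order_fn():
            # Minimize f over coordinate i with all others held fixed.
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

x_cyclic = cd(lambda: range(n))                     # fixed cyclic order
x_random = cd(lambda: rng.integers(0, n, size=n))   # sampled with replacement
```

Both runs reach the same minimizer here; the result quoted above concerns how many such coordinate updates the cyclic order can need relative to the randomized order in the worst case.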
1 code implementation • 24 May 2019 • Zhisheng Xiao, Qing Yan, Yali Amit
In this work, we propose the Generative Latent Flow (GLF), an algorithm for generative modeling of the data distribution.
Ranked #1 on Image Generation on Fashion-MNIST
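GLF's two-stage structure, an autoencoder plus a normalizing flow fitted to the latent codes, can be sketched with drastic simplifications: a PCA projection standing in for the trained encoder/decoder, and a single elementwise affine map as the simplest possible flow. Everything below is an illustrative stand-in, not the paper's deep-network model:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: 10-D points living near a 2-D linear subspace.
latent_true = rng.normal(size=(2000, 2)) * np.array([3.0, 1.0])
W_true = rng.normal(size=(2, 10))
data = latent_true @ W_true + 0.05 * rng.normal(size=(2000, 10))

# Stage 1 -- "autoencoder": a PCA projection stands in for GLF's trained
# encoder/decoder pair (the real model uses deep networks).
mean = data.mean(axis=0)
_, _, Vt = np.linalg.svd(data - mean, full_matrices=False)
V = Vt[:2].T                        # 10 -> 2 encoder; its transpose decodes
encode = lambda x: (x - mean) @ V
decode = lambda z: z @ V.T + mean

# Stage 2 -- "normalizing flow" on the latent codes: here the simplest
# possible flow, one elementwise affine map z = mu + sigma * eps, whose
# maximum-likelihood fit is just the latent mean and std.
z = encode(data)
mu, sigma = z.mean(axis=0), z.std(axis=0)

# Generation: sample noise, push it through the flow, then decode.
eps = rng.normal(size=(1000, 2))
samples = decode(mu + sigma * eps)
```

The point of the sketch is only the pipeline: new samples come from pushing Gaussian noise through the flow and then through the decoder, which is the generative path GLF uses with far richer components at both stages.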