Search Results for author: Chongxuan Li

Found 35 papers, 25 papers with code

Understanding and Stabilizing GANs' Training Dynamics Using Control Theory

no code implementations • ICML 2020 • Kun Xu, Chongxuan Li, Jun Zhu, Bo Zhang

There are existing efforts that model the training dynamics of GANs in the parameter space, but the analysis cannot directly motivate practically effective stabilizing methods.

DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic Models

1 code implementation • 2 Nov 2022 • Cheng Lu, Yuhao Zhou, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu

The commonly used fast sampler for guided sampling is DDIM, a first-order diffusion ODE solver that generally needs 100 to 250 steps for high-quality samples.

Text to image generation • Text-to-Image Generation
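
For context, a minimal sketch of the kind of first-order update DDIM performs (deterministic, eta = 0); the coefficient names are the usual DDPM ones, not code from the paper:

```python
import math

def ddim_step(x_t, eps_pred, alpha_bar_t, alpha_bar_prev):
    """One deterministic DDIM update, i.e. a first-order diffusion ODE step.
    eps_pred is the model's noise prediction at the current step."""
    # Clean sample implied by the current noise estimate.
    x0_pred = (x_t - math.sqrt(1 - alpha_bar_t) * eps_pred) / math.sqrt(alpha_bar_t)
    # Move deterministically to the previous (less noisy) level.
    return math.sqrt(alpha_bar_prev) * x0_pred + math.sqrt(1 - alpha_bar_prev) * eps_pred
```

DPM-Solver++ replaces this first-order step with higher-order updates, so far fewer steps suffice.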

Equivariant Energy-Guided SDE for Inverse Molecular Design

1 code implementation • 30 Sep 2022 • Fan Bao, Min Zhao, Zhongkai Hao, Peiyao Li, Chongxuan Li, Jun Zhu

Inverse molecular design is critical in material science and drug discovery, where the generated molecules should satisfy certain desirable properties.

Drug Discovery

All are Worth Words: A ViT Backbone for Diffusion Models

2 code implementations • 25 Sep 2022 • Fan Bao, Shen Nie, Kaiwen Xue, Yue Cao, Chongxuan Li, Hang Su, Jun Zhu

In particular, a latent diffusion model with a small U-ViT achieves a record-breaking FID of 5.48 in text-to-image generation on MS-COCO, among methods without accessing large external datasets during the training of generative models.

Conditional Image Generation • Text to image generation +1
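
A toy sketch of the U-ViT idea (sizes and module names are illustrative assumptions, not the paper's code): timestep, condition, and image patches all enter a plain transformer as tokens, with a long skip connection fusing shallow and deep layers:

```python
import torch
import torch.nn as nn

class MiniUViT(nn.Module):
    """Toy U-ViT: time, class condition, and image patches are all tokens of
    one transformer; a long skip connection fuses shallow and deep states."""

    def __init__(self, dim=256, n_classes=10, heads=4):
        super().__init__()
        self.time_embed = nn.Linear(1, dim)             # timestep -> one token
        self.cond_embed = nn.Embedding(n_classes, dim)  # condition -> one token
        make = lambda: nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.shallow = nn.TransformerEncoder(make(), num_layers=2)
        self.deep = nn.TransformerEncoder(make(), num_layers=2)
        self.fuse = nn.Linear(2 * dim, dim)             # long skip connection

    def forward(self, patch_tokens, t, y):
        # "All are worth words": prepend time and condition tokens to patches.
        t_tok = self.time_embed(t.float().view(-1, 1, 1))
        y_tok = self.cond_embed(y).unsqueeze(1)
        h = torch.cat([t_tok, y_tok, patch_tokens], dim=1)
        h_shallow = self.shallow(h)
        h_deep = self.deep(h_shallow)
        h = self.fuse(torch.cat([h_shallow, h_deep], dim=-1))
        return h[:, 2:]  # per-patch outputs (e.g., predicted noise)
```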

Deep Generative Modeling on Limited Data with Regularization by Nontransferable Pre-trained Models

no code implementations • 30 Aug 2022 • Yong Zhong, Hongtao Liu, Xiaodong Liu, Fan Bao, Weiran Shen, Chongxuan Li

Deep generative models (DGMs) are data-hungry because learning a complex model on limited data suffers from a large variance and easily overfits.

EGSDE: Unpaired Image-to-Image Translation via Energy-Guided Stochastic Differential Equations

1 code implementation • 14 Jul 2022 • Min Zhao, Fan Bao, Chongxuan Li, Jun Zhu

Further, we provide an alternative explanation of the EGSDE as a product of experts, where each of the three experts (corresponding to the SDE and two feature extractors) solely contributes to faithfulness or realism.

Image-to-Image Translation • Translation
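
A hedged sketch of that product-of-experts view: the log-density of a product of experts is a sum, so the guided score is the pretrained SDE expert's score minus the gradients of the two energy experts (all function names here are hypothetical stand-ins):

```python
import torch

def egsde_score(x, y, t, sde_score, realism_energy, faithful_energy,
                lam_r=1.0, lam_f=1.0):
    """Combined score of three experts: the pretrained SDE, an energy pushing
    toward realism in the target domain, and an energy pushing toward
    faithfulness to the source image y."""
    x = x.detach().requires_grad_(True)
    energy = lam_r * realism_energy(x, t) + lam_f * faithful_energy(x, y, t)
    grad = torch.autograd.grad(energy.sum(), x)[0]
    # Experts are exp(-E), so their scores enter with a minus sign.
    return sde_score(x, t) - grad
```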

Fast Lossless Neural Compression with Integer-Only Discrete Flows

1 code implementation • 17 Jun 2022 • Siyu Wang, Jianfei Chen, Chongxuan Li, Jun Zhu, Bo Zhang

In this work, we propose Integer-only Discrete Flows (IODF), an efficient neural compressor with integer-only arithmetic.

Quantization
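
A minimal sketch of the additive integer coupling that discrete flows build on, assuming a hypothetical `net`; IODF's contribution is making `net` itself integer-only, which this sketch does not show:

```python
import numpy as np

def int_coupling_forward(x_a, x_b, net):
    """Additive integer coupling: y_b = x_b + round(net(x_a)) is exactly
    invertible over the integers, so no information is lost and lossless
    entropy coding is possible."""
    shift = np.round(net(x_a)).astype(np.int64)
    return x_a, x_b + shift

def int_coupling_inverse(y_a, y_b, net):
    # The inverse recomputes the same shift from the unchanged half.
    shift = np.round(net(y_a)).astype(np.int64)
    return y_a, y_b - shift
```

Usage: split an integer image tensor into halves (x_a, x_b) and chain several couplings; the inverse reconstructs x_b exactly.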

Maximum Likelihood Training for Score-Based Diffusion ODEs by High-Order Denoising Score Matching

1 code implementation • 16 Jun 2022 • Cheng Lu, Kaiwen Zheng, Fan Bao, Jianfei Chen, Chongxuan Li, Jun Zhu

To fill this gap, we show that the negative likelihood of the ODE can be bounded by controlling the first, second, and third-order score matching errors; and we further present a novel high-order denoising score matching method to enable maximum likelihood training of score-based diffusion ODEs.

Denoising
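
For reference, a sketch of the first-order term only (names are illustrative): the Gaussian perturbation kernel makes the conditional score an explicit regression target.

```python
import torch

def first_order_dsm_loss(score_model, x0, alpha_t, sigma_t, t):
    """First-order denoising score matching. For the perturbation
    x_t = alpha_t * x0 + sigma_t * eps, the conditional score is
    grad_{x_t} log q(x_t | x0) = -eps / sigma_t, a tractable target."""
    eps = torch.randn_like(x0)
    x_t = alpha_t * x0 + sigma_t * eps
    target = -eps / sigma_t
    return ((score_model(x_t, t) - target) ** 2).flatten(1).sum(dim=1).mean()
```

The paper's point is that bounding the ODE's likelihood also requires controlling the analogous second- and third-order errors, which this first-order sketch omits.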

Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models

1 code implementation • 15 Jun 2022 • Fan Bao, Chongxuan Li, Jiacheng Sun, Jun Zhu, Bo Zhang

Thus, the generation performance on a subset of timesteps is crucial, and it is greatly influenced by the covariance design in DPMs.

Memory Replay with Data Compression for Continual Learning

1 code implementation • ICLR 2022 • Liyuan Wang, Xingxing Zhang, Kuo Yang, Longhui Yu, Chongxuan Li, Lanqing Hong, Shifeng Zhang, Zhenguo Li, Yi Zhong, Jun Zhu

In this work, we propose memory replay with data compression (MRDC) to reduce the storage cost of old training samples and thus increase the number that can be stored in the memory buffer.

Autonomous Driving • Class-Incremental Learning +5
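
A minimal sketch of the storage trick, assuming PIL images; the quality setting trades bytes for fidelity, and picking it well is what the paper studies:

```python
import io
from PIL import Image

def compress_exemplar(img: Image.Image, quality: int = 50) -> bytes:
    """Store a replay exemplar as JPEG bytes instead of raw pixels, so a
    fixed-size memory buffer holds more old samples."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getvalue()

def load_exemplar(data: bytes) -> Image.Image:
    # Decompress on demand when the sample is replayed.
    return Image.open(io.BytesIO(data))
```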

Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models

2 code implementations • ICLR 2022 • Fan Bao, Chongxuan Li, Jun Zhu, Bo Zhang

In this work, we present a surprising result that both the optimal reverse variance and the corresponding optimal KL divergence of a DPM have analytic forms w.r.t. its score function.
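
The analytic forms leave one model-dependent quantity to estimate: the expected squared score norm at each timestep. A hedged Monte Carlo sketch (the sampler and names are hypothetical; the closed-form variance expression itself is in the paper and not reproduced here):

```python
import torch

@torch.no_grad()
def expected_score_norm(score_model, sample_xt, t, n_samples=64):
    """Monte Carlo estimate of Gamma_t = E[ ||s_theta(x_t, t)||^2 ] / d;
    `sample_xt` draws x_t from the forward process at timestep t."""
    total = 0.0
    for _ in range(n_samples):
        s = score_model(sample_xt(t), t)
        total += (s ** 2).mean().item()   # mean over dims = ||s||^2 / d
    return total / n_samples
```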

Stability and Generalization of Bilevel Programming in Hyperparameter Optimization

1 code implementation • NeurIPS 2021 • Fan Bao, Guoqiang Wu, Chongxuan Li, Jun Zhu, Bo Zhang

Our results can explain some mysterious behaviours of the bilevel programming in practice, for instance, overfitting to the validation set.

Hyperparameter Optimization

Rethinking and Reweighting the Univariate Losses for Multi-Label Ranking: Consistency and Generalization

no code implementations • NeurIPS 2021 • Guoqiang Wu, Chongxuan Li, Kun Xu, Jun Zhu

Our results show that learning algorithms with the consistent univariate loss have an error bound of $O(c)$ ($c$ is the number of labels), while algorithms with the inconsistent pairwise loss depend on $O(\sqrt{c})$ as shown in prior work.

Multi-Label Classification
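
To make the univariate-versus-pairwise distinction concrete, a sketch of the two surrogate families for a single instance with score vector `scores` and binary label vector `labels` (illustrative only, not the paper's reweighted losses):

```python
import torch
import torch.nn.functional as F

def univariate_loss(scores, labels):
    """Label-wise (univariate) surrogate: one binary logistic term per label;
    the number of terms grows linearly in the number of labels c."""
    return F.binary_cross_entropy_with_logits(scores, labels.float())

def pairwise_loss(scores, labels):
    """Pairwise ranking surrogate: a logistic term for every (relevant,
    irrelevant) label pair, asking relevant labels to outrank irrelevant ones."""
    pos, neg = scores[labels.bool()], scores[~labels.bool()]
    margins = pos.unsqueeze(1) - neg.unsqueeze(0)   # all (pos, neg) differences
    return F.softplus(-margins).mean()
```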

MiCE: Mixture of Contrastive Experts for Unsupervised Image Clustering

1 code implementation • ICLR 2021 • Tsung Wei Tsai, Chongxuan Li, Jun Zhu

We present Mixture of Contrastive Experts (MiCE), a unified probabilistic clustering framework that simultaneously exploits the discriminative representations learned by contrastive learning and the semantic structures captured by a latent mixture model.

Contrastive Learning • Image Clustering

Implicit Normalizing Flows

1 code implementation • ICLR 2021 • Cheng Lu, Jianfei Chen, Chongxuan Li, Qiuhao Wang, Jun Zhu

Through theoretical analysis, we show that the function space of ImpFlow is strictly richer than that of ResFlows.
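
For intuition: a ResFlow block y = x + g(x) with Lip(g) < 1 is inverted by fixed-point iteration, while ImpFlow generalizes such blocks to maps defined implicitly by an equation F(x, z) = 0. A minimal sketch of the simplest such solve:

```python
import torch

def invert_residual_block(y, g, n_iters=50):
    """Invert y = x + g(x) by Banach fixed-point iteration x <- y - g(x),
    which converges when g is a contraction (Lip(g) < 1). Implicit flows
    replace this special case with a general root-finding solve."""
    x = y.clone()
    for _ in range(n_iters):
        x = y - g(x)
    return x
```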

Relaxed Conditional Image Transfer for Semi-supervised Domain Adaptation

no code implementations • 5 Jan 2021 • Qijun Luo, Zhili Liu, Lanqing Hong, Chongxuan Li, Kuo Yang, Liyuan Wang, Fengwei Zhou, Guilin Li, Zhenguo Li, Jun Zhu

Semi-supervised domain adaptation (SSDA), which aims to learn models in a partially labeled target domain with the assistance of the fully labeled source domain, has attracted increasing attention in recent years.

Domain Adaptation

Variational (Gradient) Estimate of the Score Function in Energy-based Latent Variable Models

1 code implementation • NeurIPS Workshop ICBINB 2020 • Fan Bao, Kun Xu, Chongxuan Li, Lanqing Hong, Jun Zhu, Bo Zhang

The learning and evaluation of energy-based latent variable models (EBLVMs) without any structural assumptions are highly challenging, because the true posteriors and the partition functions in such models are generally intractable.

Bi-level Score Matching for Learning Energy-based Latent Variable Models

1 code implementation • NeurIPS 2020 • Fan Bao, Chongxuan Li, Kun Xu, Hang Su, Jun Zhu, Bo Zhang

This paper presents a bi-level score matching (BiSM) method to learn EBLVMs with general structures by reformulating SM as a bi-level optimization problem.

Stochastic Optimization

Efficient Learning of Generative Models via Finite-Difference Score Matching

1 code implementation • NeurIPS 2020 • Tianyu Pang, Kun Xu, Chongxuan Li, Yang Song, Stefano Ermon, Jun Zhu

Several machine learning applications involve the optimization of higher-order derivatives (e.g., gradients of gradients) during training, which can be expensive with respect to memory and computation even with automatic differentiation.
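
The finite-difference trick in isolation: a directional derivative is approximated with two extra function evaluations instead of a second pass of automatic differentiation (a sketch of the idea, not the paper's full estimator):

```python
def fd_directional_derivative(f, x, v, eps=1e-3):
    """Central finite difference: for scalar-valued f, approximates the
    directional derivative v . grad f(x) without differentiating through
    an existing gradient computation."""
    return (f(x + eps * v) - f(x - eps * v)) / (2 * eps)
```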

Triple Generative Adversarial Networks

1 code implementation • 20 Dec 2019 • Chongxuan Li, Kun Xu, Jiashuo Liu, Jun Zhu, Bo Zhang

It is formulated as a three-player minimax game consisting of a generator, a classifier and a discriminator, and therefore is referred to as Triple Generative Adversarial Network (Triple-GAN).

Classification • Conditional Image Generation +3

Understanding and Stabilizing GANs' Training Dynamics with Control Theory

1 code implementation • 29 Sep 2019 • Kun Xu, Chongxuan Li, Jun Zhu, Bo Zhang

There are existing efforts that model the training dynamics of GANs in the parameter space, but the analysis cannot directly motivate practically effective stabilizing methods.

Ranked #32 on Image Generation on CIFAR-10 (Inception score metric)

Image Generation • L2 Regularization

Countering Noisy Labels By Learning From Auxiliary Clean Labels

no code implementations • 23 May 2019 • Tsung Wei Tsai, Chongxuan Li, Jun Zhu

We consider the learning from noisy labels (NL) problem, which emerges in many real-world applications.

To Relieve Your Headache of Training an MRF, Take AdVIL

no code implementations • ICLR 2020 • Chongxuan Li, Chao Du, Kun Xu, Max Welling, Jun Zhu, Bo Zhang

We propose a black-box algorithm called Adversarial Variational Inference and Learning (AdVIL) to perform inference and learning on a general Markov random field (MRF).

Variational Inference

Graphical Generative Adversarial Networks

1 code implementation • NeurIPS 2018 • Chongxuan Li, Max Welling, Jun Zhu, Bo Zhang

We propose Graphical Generative Adversarial Networks (Graphical-GAN) to model structured data.

Population Matching Discrepancy and Applications in Deep Learning

no code implementations • NeurIPS 2017 • Jianfei Chen, Chongxuan Li, Yizhong Ru, Jun Zhu

In this paper, we propose population matching discrepancy (PMD) for estimating the distribution distance based on samples, as well as an algorithm to learn the parameters of the distributions using PMD as an objective.

Domain Adaptation
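
A hedged sketch of a PMD-style estimate: match the two sample populations with a minimum weight bipartite matching and average the matched costs (assumes equal sample sizes; L2 is one choice of ground cost):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def population_matching_discrepancy(X, Y):
    """Sample-based distance between two populations X, Y of shape (n, d):
    the average cost of a minimum weight matching between the two sets."""
    # Pairwise L2 costs between every x_i and y_j.
    cost = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)   # Hungarian matching
    return cost[rows, cols].mean()
```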

Triple Generative Adversarial Nets

1 code implementation • NeurIPS 2017 • Chongxuan Li, Kun Xu, Jun Zhu, Bo Zhang

Generative Adversarial Nets (GANs) have shown promise in image generation and semi-supervised learning (SSL).

Image Generation

Max-Margin Deep Generative Models for (Semi-)Supervised Learning

1 code implementation • 22 Nov 2016 • Chongxuan Li, Jun Zhu, Bo Zhang

Deep generative models (DGMs) are effective on learning multilayered representations of complex data and performing inference of input data by exploring the generative ability.

Towards Better Analysis of Deep Convolutional Neural Networks

no code implementations • 24 Apr 2016 • Mengchen Liu, Jiaxin Shi, Zhen Li, Chongxuan Li, Jun Zhu, Shixia Liu

Deep convolutional neural networks (CNNs) have achieved breakthrough performance in many pattern recognition tasks such as image classification.

Image Classification

Learning to Generate with Memory

1 code implementation • 24 Feb 2016 • Chongxuan Li, Jun Zhu, Bo Zhang

Memory units have been widely used to enrich the capabilities of deep networks on capturing long-term dependencies in reasoning and prediction tasks, but little investigation exists on deep generative models (DGMs) which are good at inferring high-level invariant representations from unlabeled data.

Density Estimation • Image Generation +2
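
A toy sketch of the kind of soft attention read over an external memory that such a model can add to its decoder (sizes and addressing scheme are illustrative assumptions, not the paper's exact design):

```python
import torch
import torch.nn as nn

class MemoryRead(nn.Module):
    """Soft attention read over a learnable external memory: the query
    retrieves a convex combination of memory slots."""

    def __init__(self, n_slots=128, dim=64):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(n_slots, dim))

    def forward(self, query):                             # (batch, dim)
        attn = (query @ self.memory.t()).softmax(dim=-1)  # (batch, n_slots)
        return attn @ self.memory                         # (batch, dim)
```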

Max-margin Deep Generative Models

2 code implementations • NeurIPS 2015 • Chongxuan Li, Jun Zhu, Tianlin Shi, Bo Zhang

Deep generative models (DGMs) are effective on learning multilayered representations of complex data and performing inference of input data by exploring the generative ability.
