Search Results for author: Qing Yan

Found 8 papers, 4 papers with code

MoMA: Multimodal LLM Adapter for Fast Personalized Image Generation

1 code implementation • 8 Apr 2024 • Kunpeng Song, Yizhe Zhu, Bingchen Liu, Qing Yan, Ahmed Elgammal, Xiao Yang

This approach effectively synergizes reference image and text prompt information to produce valuable image features, facilitating an image diffusion model.

Image-to-Image Translation • Language Modelling • +1

Do We Really Need to Learn Representations from In-domain Data for Outlier Detection?

no code implementations • 19 May 2021 • Zhisheng Xiao, Qing Yan, Yali Amit

Unsupervised outlier detection, which predicts if a test sample is an outlier or not using only the information from unlabelled inlier data, is an important but challenging task.

Outlier Detection • Representation Learning

EBMs Trained with Maximum Likelihood are Generator Models Trained with a Self-adversarial Loss

no code implementations • ICLR Workshop EBM 2021 • Zhisheng Xiao, Qing Yan, Yali Amit

Doing so allows us to study the density induced by the dynamics (when the dynamics are invertible) and to connect with GANs: the dynamics act as a generator model, the initial values as latent variables, and the loss optimizes a critic defined by the very same energy that determines the generator through its gradient.
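
The generator view described here can be illustrated with short-run Langevin dynamics on a toy quadratic energy; all names below are hypothetical and the energy is a stand-in, not the paper's learned EBM:

```python
import numpy as np

def energy_grad(x):
    # Gradient of the toy energy E(x) = 0.5 * ||x - 2||^2,
    # whose Gibbs density exp(-E) is a Gaussian centred at 2.
    return x - 2.0

def langevin_generator(z0, n_steps=200, step=0.1, rng=None):
    # Treat Langevin dynamics as a generator: the initial noise z0
    # plays the role of a latent variable; the final state is the sample.
    rng = np.random.default_rng(0) if rng is None else rng
    x = z0.copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * energy_grad(x) + np.sqrt(2 * step) * noise
    return x

rng = np.random.default_rng(42)
z0 = rng.standard_normal((5000, 2))      # latent "inputs" to the generator
samples = langevin_generator(z0, rng=rng)
print(samples.mean(axis=0))              # concentrates near the mode at (2, 2)
```

With the energy fixed, the map from z0 to the final sample is exactly the "generator" the abstract refers to; training the energy then plays the role of training a critic.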

Exponential Tilting of Generative Models: Improving Sample Quality by Training and Sampling from Latent Energy

no code implementations • 15 Jun 2020 • Zhisheng Xiao, Qing Yan, Yali Amit

In this paper, we present a general method that can improve the sample quality of pre-trained likelihood based generative models.
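
The core idea of tilting the latent distribution with an energy can be sketched with self-normalized importance resampling; the decoder, the energy, and the target region below are toy stand-ins, not the paper's trained models:

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z):
    # Stand-in for a pretrained decoder; identity keeps the toy transparent.
    return z

def latent_energy(z):
    # Hypothetical learned latent energy, low near z = 1, so the tilted
    # prior N(0,1) * exp(-E(z)) favors that region of latent space.
    return 0.5 * np.sum((z - 1.0) ** 2, axis=-1)

# 1) Draw proposals from the original prior of the pretrained model.
z = rng.standard_normal((20000, 1))

# 2) Tilt: resample latents with weights proportional to exp(-E(z)).
logw = -latent_energy(z)
w = np.exp(logw - logw.max())
w /= w.sum()
idx = rng.choice(len(z), size=5000, replace=True, p=w)

samples = generator(z[idx])
print(samples.mean())   # shifts from 0 toward the low-energy region
```

For this pair of Gaussians the tilted latent distribution is again Gaussian with mean 0.5, so the sample mean moving there is a quick sanity check; the paper instead samples the tilted latent density with MCMC.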

Likelihood Regret: An Out-of-Distribution Detection Score For Variational Auto-encoder

2 code implementations • NeurIPS 2020 • Zhisheng Xiao, Qing Yan, Yali Amit

An important application of generative modeling should be the ability to detect out-of-distribution (OOD) samples by setting a threshold on the likelihood.

Out-of-Distribution Detection
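
The likelihood-regret score contrasts a sample's likelihood under the trained model with its likelihood after a per-sample refit (in the paper, fine-tuning the VAE encoder on the single test input). A Gaussian stand-in keeps that quantity in closed form; everything below is a toy illustration, not the paper's VAE:

```python
import numpy as np

def gaussian_loglik(x, mu, sigma2):
    return -0.5 * (np.log(2 * np.pi * sigma2) + (x - mu) ** 2 / sigma2)

# "Trained model": a Gaussian fit to inlier data.
rng = np.random.default_rng(0)
inliers = rng.normal(0.0, 1.0, size=10000)
mu_hat, var_hat = inliers.mean(), inliers.var()

def likelihood_regret(x):
    # Regret = (log-likelihood after refitting the sample-specific
    # parameter, here just the mean) - (model log-likelihood).
    per_sample_opt = gaussian_loglik(x, x, var_hat)  # mean refit to x itself
    model_ll = gaussian_loglik(x, mu_hat, var_hat)
    return per_sample_opt - model_ll

print(likelihood_regret(0.1))   # small: inlier-like
print(likelihood_regret(6.0))   # large: OOD-like
```

An inlier gains little from the per-sample refit, while an OOD input gains a lot, which is exactly why raw likelihood alone can be a poor OOD score while the regret is informative.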

A Method to Model Conditional Distributions with Normalizing Flows

no code implementations • 5 Nov 2019 • Zhisheng Xiao, Qing Yan, Yali Amit

In particular, we use our proposed method to analyze inverse problems with invertible neural networks by maximizing the posterior likelihood.
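
A conditional normalizing flow in its simplest form makes the flow's parameters functions of the condition; the one-dimensional affine "flow" and the fixed conditioner below are illustrative assumptions, not the paper's architecture:

```python
import numpy as np

def cond_params(c):
    # Hypothetical conditioner network; a fixed affine map of the
    # condition c keeps the example closed-form.
    mu = 2.0 * c
    log_sigma = 0.1 * c
    return mu, log_sigma

def cond_flow_logpdf(x, c):
    # One affine transform conditioned on c: z = (x - mu(c)) / sigma(c).
    # log p(x|c) = log N(z; 0, 1) + log|dz/dx| = log N(z) - log sigma(c).
    mu, log_sigma = cond_params(c)
    z = (x - mu) * np.exp(-log_sigma)
    return -0.5 * (np.log(2 * np.pi) + z ** 2) - log_sigma

def cond_flow_sample(c, rng):
    # Sampling runs the flow forward from base noise.
    mu, log_sigma = cond_params(c)
    return mu + np.exp(log_sigma) * rng.standard_normal()

rng = np.random.default_rng(0)
x = cond_flow_sample(1.0, rng)
print(cond_flow_logpdf(x, 1.0))   # exact conditional log-density
```

Because the transform is invertible for every c, the conditional likelihood is exact, which is what makes maximizing the posterior likelihood in inverse problems tractable.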

Generative Latent Flow

1 code implementation • 24 May 2019 • Zhisheng Xiao, Qing Yan, Yali Amit

In this work, we propose the Generative Latent Flow (GLF), an algorithm for generative modeling of the data distribution.

Image Generation
