Search Results for author: Yilun Xu

Found 20 papers, 12 papers with code

QuantumLeak: Stealing Quantum Neural Networks from Cloud-based NISQ Machines

no code implementations16 Mar 2024 Zhenxiao Fu, Min Yang, Cheng Chu, Yilun Xu, Gang Huang, Fan Chen

Variational quantum circuits (VQCs) have become a powerful tool for implementing Quantum Neural Networks (QNNs), addressing a wide range of complex problems.

New Facts and Data about Professors and their Research

no code implementations3 Dec 2023 Kyle R. Myers, Wei Yang Tham, Jerry Thursby, Marie Thursby, Nina Cohodes, Karim Lakhani, Rachel Mural, Yilun Xu

We introduce a new survey of professors at roughly 150 of the most research-intensive institutions of higher education in the US.

Particle Guidance: non-I.I.D. Diverse Sampling with Diffusion Models

1 code implementation19 Oct 2023 Gabriele Corso, Yilun Xu, Valentin De Bortoli, Regina Barzilay, Tommi Jaakkola

In light of the widespread success of generative models, a significant amount of research has gone into speeding up their sampling time.

Conditional Image Generation

Restart Sampling for Improving Generative Processes

1 code implementation NeurIPS 2023 Yilun Xu, Mingyang Deng, Xiang Cheng, Yonglong Tian, Ziming Liu, Tommi Jaakkola

Restart not only outperforms the previous best SDE results, but also accelerates the sampling speed by 10-fold / 2-fold on CIFAR-10 / ImageNet $64 \times 64$.

Attribute
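
A minimal sketch of the restart idea behind the paper above, assuming a black-box probability-flow ODE integrator `ode_solve(x, t_from, t_to)` and a placeholder noise schedule `sigma(t)`; the paper's actual schedules and restart intervals are more elaborate.

```python
import numpy as np

def sigma(t):
    # Placeholder noise scale; the paper uses solver-specific schedules.
    return t

def restart_sample(ode_solve, x_T, T, t_min, t_max, n_restarts, rng):
    """Alternate deterministic ODE integration with forward noise injections.

    ode_solve(x, t_from, t_to) is assumed given: it integrates the
    probability-flow ODE from time t_from down to t_to.
    """
    # Initial deterministic pass from T down to t_min.
    x = ode_solve(x_T, T, t_min)
    for _ in range(n_restarts):
        # Restart forward: jump back to t_max by adding fresh noise.
        extra_std = np.sqrt(sigma(t_max) ** 2 - sigma(t_min) ** 2)
        x = x + extra_std * rng.standard_normal(x.shape)
        # Deterministic backward pass down to t_min again.
        x = ode_solve(x, t_max, t_min)
    # Final pass to t = 0.
    return ode_solve(x, t_min, 0.0)
```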

GenPhys: From Physical Processes to Generative Models

no code implementations5 Apr 2023 Ziming Liu, Di Luo, Yilun Xu, Tommi Jaakkola, Max Tegmark

We introduce a general family, Generative Models from Physical Processes (GenPhys), where we translate partial differential equations (PDEs) describing physical processes to generative models.
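
Two anchor cases make the PDE-to-generative-model translation concrete (an illustrative pairing on my part, consistent with the abstract's scope): variance-exploding diffusion models correspond to the heat equation, and Poisson flow to the Poisson equation.

```latex
\begin{align}
  \frac{\partial p(x,t)}{\partial t}
    &= \tfrac{1}{2}\, g(t)^2 \, \nabla^2 p(x,t)
    && \text{(heat equation; variance-exploding diffusion)} \\
  \nabla^2 \varphi(x) &= -\rho(x)
    && \text{(Poisson equation; Poisson flow, with data density } \rho\text{)}
\end{align}
```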

Stable Target Field for Reduced Variance Score Estimation in Diffusion Models

1 code implementation1 Feb 2023 Yilun Xu, Shangyuan Tong, Tommi Jaakkola

We show that the procedure indeed helps in the challenging intermediate regime by reducing (the trace of) the covariance of training targets.

Denoising Image Generation
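
A minimal sketch of the reduced-variance target, assuming a Gaussian perturbation kernel $N(x_t; x_0, \sigma_t^2 I)$: the single-sample denoising target is replaced by a self-normalized weighted average over a reference batch. The paper additionally keeps the true source sample in the reference batch and analyzes the resulting bias-variance trade-off, which this sketch omits.

```python
import numpy as np

def stable_target(x_t, ref_batch, sigma_t):
    """Reduced-variance score target via a reference batch (sketch).

    Instead of the single-sample target -(x_t - x_0) / sigma_t**2,
    average the per-sample targets with weights proportional to the
    Gaussian kernel N(x_t; x_0_j, sigma_t^2 I).
    """
    diffs = x_t[None, :] - ref_batch                  # (n, d)
    logw = -np.sum(diffs ** 2, axis=1) / (2 * sigma_t ** 2)
    w = np.exp(logw - logw.max())
    w = w / w.sum()                                   # posterior weights
    targets = -diffs / sigma_t ** 2                   # per-sample targets
    return (w[:, None] * targets).sum(axis=0)         # weighted average
```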

Poisson Flow Generative Models

1 code implementation22 Sep 2022 Yilun Xu, Ziming Liu, Max Tegmark, Tommi Jaakkola

We interpret the data points as electrical charges on the $z=0$ hyperplane in a space augmented with an additional dimension $z$, generating a high-dimensional electric field (the gradient of the solution to the Poisson equation).

Image Generation
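
A minimal sketch of the empirical field described above, treating the data as point charges on the $z=0$ plane of the augmented space; normalization and the learned approximation of the field are omitted.

```python
import numpy as np

def poisson_field(x_aug, data, eps=1e-8):
    """Empirical electric field in the augmented space (sketch).

    x_aug : point in the augmented space R^{D+1} (extra z dimension).
    data  : (n, D) data points, treated as charges on the z = 0 plane.
    Returns the unnormalized field, i.e. the summed gradient of the
    Green's function of the Poisson equation over all charges.
    """
    n, d = data.shape
    charges = np.concatenate([data, np.zeros((n, 1))], axis=1)  # lift to z=0
    diffs = x_aug[None, :] - charges                            # (n, D+1)
    dists = np.linalg.norm(diffs, axis=1, keepdims=True) + eps
    return (diffs / dists ** (d + 1)).mean(axis=0)              # ~1/r^D decay
```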

A Survey on Generative Diffusion Model

1 code implementation6 Sep 2022 Hanqun Cao, Cheng Tan, Zhangyang Gao, Yilun Xu, Guangyong Chen, Pheng-Ann Heng, Stan Z. Li

Deep generative models are a prominent approach for data generation, and have been used to produce high-quality samples in various domains.

Dimensionality Reduction

Controlling Directions Orthogonal to a Classifier

1 code implementation ICLR 2022 Yilun Xu, Hao He, Tianxiao Shen, Tommi Jaakkola

We propose to identify directions invariant to a given classifier so that these directions can be controlled in tasks such as style transfer.

Domain Adaptation Fairness +1
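
For the linear case (my simplification; the paper treats general classifiers), directions invariant to a classifier with weight vector w are exactly those orthogonal to w, so an edit direction can be made invariant by projecting out its component along w.

```python
import numpy as np

def orthogonal_component(direction, w):
    """Project an edit direction onto the subspace the classifier ignores.

    For a linear classifier f(x) = sign(w @ x + b), moving along any
    direction orthogonal to w leaves the logit unchanged.
    """
    w = w / np.linalg.norm(w)
    return direction - (direction @ w) * w  # remove the component along w
```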

Novel Intensity Mapping Functions: Weighted Histogram Averaging

no code implementations14 Nov 2021 Yilun Xu, Zhengguo Li, Weihai Chen, Changyun Wen

It is challenging to align the brightness distributions of images with different exposures due to possible color distortion and loss of detail in the brightest and darkest regions of the input images.
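
For reference, a classic intensity mapping function based on histogram (CDF) matching is sketched below; WHA refines this idea with a weighted averaging scheme that is not reproduced here.

```python
import numpy as np

def intensity_mapping(src, ref):
    """Map src intensities so their histogram matches ref's (CDF matching).

    src, ref : uint8-valued image arrays. Builds a 256-entry lookup
    table sending each source level to the reference level with the
    closest cumulative frequency.
    """
    src_hist, _ = np.histogram(src, bins=256, range=(0, 255))
    ref_hist, _ = np.histogram(ref, bins=256, range=(0, 255))
    src_cdf = np.cumsum(src_hist) / src.size
    ref_cdf = np.cumsum(ref_hist) / ref.size
    lut = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
    return lut[src.astype(np.uint8)]
```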

Deep Joint Demosaicing and High Dynamic Range Imaging within a Single Shot

no code implementations14 Nov 2021 Yilun Xu, Ziyang Liu, Xingming Wu, Weihai Chen, Changyun Wen, Zhengguo Li

For the former challenge, a spatially varying convolution (SVC) is designed to process Bayer images captured with varying exposures.

Demosaicking
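
One way to realize a spatially varying convolution is sketched below (an illustration of the general idea only; the paper's SVC design details are not given in this snippet): each pixel selects a kernel from a bank according to its position in the Bayer/exposure mosaic.

```python
import numpy as np

def spatially_varying_conv(img, kernels, pattern):
    """Convolution whose kernel depends on pixel position (sketch).

    img     : (H, W) single-channel Bayer image.
    kernels : (K, k, k) bank of kernels, k odd.
    pattern : (H, W) integer map assigning each pixel a kernel index,
              e.g. derived from the Bayer/exposure mosaic.
    """
    _, k, _ = kernels.shape
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = padded[i:i + k, j:j + k]
            out[i, j] = np.sum(patch * kernels[pattern[i, j]])
    return out
```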

Learning Representations that Support Robust Transfer of Predictors

2 code implementations19 Oct 2021 Yilun Xu, Tommi Jaakkola

We further demonstrate the impact of optimizing such transfer risk on two controlled settings, each representing a different pattern of environment shift, as well as on two real-world datasets.

Domain Generalization Out-of-Distribution Generalization

Can Subnetwork Structure be the Key to Out-of-Distribution Generalization?

no code implementations5 Jun 2021 Dinghuai Zhang, Kartik Ahuja, Yilun Xu, Yisen Wang, Aaron Courville

Can models with particular structure avoid being biased towards spurious correlation in out-of-distribution (OOD) generalization?

Out-of-Distribution Generalization

Anytime Sampling for Autoregressive Models via Ordered Autoencoding

1 code implementation ICLR 2021 Yilun Xu, Yang Song, Sahaj Garg, Linyuan Gong, Rui Shu, Aditya Grover, Stefano Ermon

Experimentally, we demonstrate in several image and audio generation tasks that sample quality degrades gracefully as we reduce the computational budget for sampling.

Audio Generation Computational Efficiency
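
The ordered-autoencoding ingredient can be sketched with nested-dropout-style masking (my sketch, using the standard nested-dropout trick): during training, a random suffix of the latent code is zeroed out, so earlier dimensions carry the most information and a truncated prefix still decodes to a coherent sample under a reduced budget.

```python
import numpy as np

def nested_mask(z, rng):
    """Zero a random suffix of the latent code (ordered autoencoding sketch).

    Keeping only a prefix of the code forces the encoder to pack the
    most important information into the earliest dimensions, which is
    what lets autoregressive sampling stop early at reduced budgets.
    """
    d = z.shape[-1]
    keep = rng.integers(1, d + 1)   # keep a random-length prefix
    mask = np.zeros(d)
    mask[:keep] = 1.0
    return z * mask
```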

TCGM: An Information-Theoretic Framework for Semi-Supervised Multi-Modality Learning

no code implementations ECCV 2020 Xinwei Sun, Yilun Xu, Peng Cao, Yuqing Kong, Lingjing Hu, Shanghang Zhang, Yizhou Wang

In this paper, we propose a novel information-theoretic approach, namely Total Correlation Gain Maximization (TCGM), for semi-supervised multi-modal learning, which has two promising properties: (i) it can effectively utilize the information across different modalities of unlabeled data points to facilitate training classifiers for each modality; (ii) it has a theoretical guarantee to identify Bayesian classifiers, i.e., the ground-truth posteriors of all modalities.

Disease Prediction Emotion Recognition +1
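
For background, total correlation (the quantity whose gain TCGM maximizes) measures the overall statistical dependence among the modalities; the standard definition is below, while the paper's precise objective is not reproduced here.

```latex
\begin{equation}
  \mathrm{TC}(X_1, \dots, X_K)
  = \sum_{i=1}^{K} H(X_i) - H(X_1, \dots, X_K)
  = D_{\mathrm{KL}}\!\left( p(x_1, \dots, x_K) \,\middle\|\, \prod_{i=1}^{K} p(x_i) \right)
\end{equation}
```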

L_DMI: A Novel Information-theoretic Loss Function for Training Deep Nets Robust to Label Noise

no code implementations NeurIPS 2019 Yilun Xu, Peng Cao, Yuqing Kong, Yizhou Wang

To the best of our knowledge, L_DMI is the first loss function that is provably robust to instance-independent label noise, regardless of noise pattern, and it can be applied to any existing classification neural network straightforwardly without any auxiliary information.

Ranked #35 on Image Classification on Clothing1M (using extra training data)

Learning with noisy labels

L_DMI: An Information-theoretic Noise-robust Loss Function

2 code implementations8 Sep 2019 Yilun Xu, Peng Cao, Yuqing Kong, Yizhou Wang

To the best of our knowledge, $\mathcal{L}_{DMI}$ is the first loss function that is provably robust to instance-independent label noise, regardless of noise pattern, and it can be applied to any existing classification neural network straightforwardly without any auxiliary information.

Learning with noisy labels
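
A minimal sketch of a determinant-based mutual information loss over a batch, assuming C classes, one-hot labels, and softmax outputs; the invariance of |det| to class-conditional noise transition matrices is what underlies the robustness claim.

```python
import numpy as np

def dmi_loss(probs, labels_onehot):
    """L_DMI = -log |det(U)|, with U the empirical joint over a batch.

    probs         : (N, C) softmax outputs of the classifier.
    labels_onehot : (N, C) one-hot (possibly noisy) labels.
    """
    n = probs.shape[0]
    U = probs.T @ labels_onehot / n          # (C, C) empirical joint matrix
    return -np.log(np.abs(np.linalg.det(U)) + 1e-12)
```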

Max-MIG: an Information Theoretic Approach for Joint Learning from Crowds

1 code implementation ICLR 2019 Peng Cao, Yilun Xu, Yuqing Kong, Yizhou Wang

Furthermore, we devise an accurate data-crowds forecaster that employs both the data and the crowdsourced labels to forecast the ground truth.
