no code implementations • ICCV 2023 • Zhijie Deng, Yucen Luo
Unsupervised semantic segmentation is a long-standing and significant challenge in computer vision.
1 code implementation • 31 Oct 2022 • Zeju Qiu, Weiyang Liu, Tim Z. Xiao, Zhen Liu, Umang Bhatt, Yucen Luo, Adrian Weller, Bernhard Schölkopf
We consider the problem of iterative machine teaching, where a teacher sequentially provides examples based on the status of a learner under a discrete input space (i.e., a pool of finite samples), which greatly limits the teacher's capability.
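A minimal sketch of the pool-based setting described above, under assumptions not taken from the paper: a linear-regression learner, a target parameter `w_star` known to the teacher, and a greedy teacher that at each round picks the pool example whose single gradient step moves the learner closest to `w_star`.

```python
# Illustrative pool-based iterative teaching (all modelling choices assumed):
# the teacher may only show examples from a fixed finite pool.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # the finite pool of candidate examples
w_star = rng.normal(size=5)                    # the target model the teacher knows
y = X @ w_star

w = np.zeros(5)                                # the learner's current parameters
lr = 0.1
for t in range(30):
    # hypothetically evaluate one squared-loss gradient step for every candidate
    grads = (X @ w - y)[:, None] * X           # per-example gradients
    candidates = w - lr * grads                # learner state after teaching each example
    best = np.argmin(np.linalg.norm(candidates - w_star, axis=1))
    w = candidates[best]                       # teach the greedily chosen example

print("distance to target:", np.linalg.norm(w - w_star))
```

The discrete pool is what constrains the teacher here: the ideal teaching example at a given round may simply not exist among the finite candidates.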
no code implementations • 29 Oct 2022 • Ziyu Wang, Yucen Luo, Yueru Li, Jun Zhu, Bernhard Schölkopf
For nonparametric conditional moment models, efficient estimation often relies on preimposed conditions on various measures of ill-posedness of the hypothesis space, which are hard to validate when flexible models are used.
2 code implementations • 20 Jul 2022 • Francesco Quinzan, Cecilia Casolo, Krikamol Muandet, Yucen Luo, Niki Kilbertus
Notions of counterfactual invariance (CI) have proven essential for predictors that are fair, robust, and generalizable in the real world.
no code implementations • ICLR 2020 • Yucen Luo, Alex Beatson, Mohammad Norouzi, Jun Zhu, David Duvenaud, Ryan P. Adams, Ricky T. Q. Chen
Standard variational lower bounds used to train latent variable models produce biased estimates of most quantities of interest.
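To make the bias concrete, here is a toy numerical sketch (the Gaussian model and the prior proposal are assumptions for illustration, not the paper's estimator): the K-sample importance-weighted bound is a lower bound on log p(x) by Jensen's inequality, so its expectation underestimates the true log marginal, with the gap shrinking as K grows.

```python
# Toy model: z ~ N(0,1), x|z ~ N(z,1), so log p(x) = log N(x; 0, 2) in closed form.
# With the prior as proposal, E[log (1/K) sum_k p(x|z_k)] is a biased (downward)
# estimate of log p(x); the bias shrinks as K grows.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = 1.5
true_logpx = norm.logpdf(x, loc=0.0, scale=np.sqrt(2.0))

def iwae_bound(K, n_runs=5000):
    z = rng.standard_normal((n_runs, K))          # z_k ~ q(z) = N(0, 1) (the prior)
    log_w = norm.logpdf(x, loc=z, scale=1.0)      # log w_k = log p(x | z_k)
    # log (1/K) sum_k w_k, computed stably via log-sum-exp
    log_avg_w = np.logaddexp.reduce(log_w, axis=1) - np.log(K)
    return log_avg_w.mean()

for K in [1, 10, 100, 1000]:
    print(f"K={K:4d}  bound={iwae_bound(K):.4f}  true log p(x)={true_logpx:.4f}")
```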
1 code implementation • 22 Nov 2019 • Zhijie Deng, Yucen Luo, Jun Zhu, Bo Zhang
Bayesian neural networks (BNNs) augment deep networks with uncertainty quantification by Bayesian treatment of the network weights.
1 code implementation • 25 Sep 2019 • Zhijie Deng, Yucen Luo, Jun Zhu, Bo Zhang
Bayesian neural networks (BNNs) introduce uncertainty estimation to deep networks by performing Bayesian inference on network weights.
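As an illustration of the Bayesian treatment of network weights referred to above (a toy sketch, not the paper's method): each weight of a small network carries an independent Gaussian posterior, and predictions are formed by Monte Carlo averaging over sampled weights, with the spread across samples serving as the uncertainty estimate.

```python
# Minimal mean-field BNN sketch: Gaussian posterior per weight, MC prediction.
import numpy as np

rng = np.random.default_rng(0)

class BayesianDense:
    """Dense layer with an independent Gaussian posterior over each weight."""
    def __init__(self, n_in, n_out):
        self.mu = rng.normal(0.0, 0.1, size=(n_in, n_out))   # posterior means
        self.log_std = np.full((n_in, n_out), -2.0)           # posterior log-stds
    def sample(self, x):
        w = self.mu + np.exp(self.log_std) * rng.standard_normal(self.mu.shape)
        return x @ w

layers = [BayesianDense(1, 32), BayesianDense(32, 1)]

def predict(x, n_samples=100):
    """Monte Carlo predictive mean and std under the weight posterior."""
    outs = []
    for _ in range(n_samples):
        h = np.tanh(layers[0].sample(x))
        outs.append(layers[1].sample(h))
    outs = np.stack(outs)                 # (n_samples, n_points, 1)
    return outs.mean(0), outs.std(0)      # spread across weight draws = uncertainty

x_test = np.linspace(-3, 3, 5).reshape(-1, 1)
mean, std = predict(x_test)
print(np.hstack([x_test, mean, std]))
```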
no code implementations • 20 Sep 2019 • Yucen Luo, Jun Zhu, Tomas Pfister
Deep neural networks have recently been shown to memorize training data, even with noisy labels, which hurts generalization performance.
1 code implementation • ICCV 2019 • Zhijie Deng, Yucen Luo, Jun Zhu
Deep learning methods have shown promise in unsupervised domain adaptation, which aims to leverage a labeled source domain to learn a classifier for the unlabeled target domain with a different distribution.
Ranked #3 on Domain Adaptation on SVHN-to-MNIST
1 code implementation • NeurIPS 2018 • Yucen Luo, Tian Tian, Jiaxin Shi, Jun Zhu, Bo Zhang
We propose a new approach that includes a deep generative model (DGM) to characterize low-level features of the data, and a statistical relational model for noisy pairwise annotations on a subset of the data.
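A minimal sketch of the kind of noisy pairwise-annotation likelihood such a relational model is built on (the reliability parameter `alpha` and the symmetric noise model are illustrative assumptions, not the paper's specification): an annotator marks a pair as "same cluster" correctly with probability `alpha`, given latent cluster assignments.

```python
# Illustrative noisy pairwise-annotation log-likelihood.
import numpy as np

def pairwise_log_likelihood(z, pairs, labels, alpha=0.8):
    """z: (N,) cluster ids; pairs: (M, 2) index pairs; labels: (M,) 1 = 'same'."""
    same = (z[pairs[:, 0]] == z[pairs[:, 1]]).astype(float)
    # P(label=1 | same cluster) = alpha, P(label=1 | different clusters) = 1 - alpha
    p_same_label = alpha * same + (1.0 - alpha) * (1.0 - same)
    p = np.where(labels == 1, p_same_label, 1.0 - p_same_label)
    return np.log(p).sum()

z = np.array([0, 0, 1, 1])
pairs = np.array([[0, 1], [0, 2], [2, 3]])
labels = np.array([1, 1, 1])              # the second annotation is noisy
print(pairwise_log_likelihood(z, pairs, labels))
```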
1 code implementation • CVPR 2018 • Yucen Luo, Jun Zhu, Mengxi Li, Yong Ren, Bo Zhang
In SNTG (Smooth Neighbors on Teacher Graphs), a graph is constructed based on the predictions of the teacher model, i.e., the implicit self-ensemble of models.
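The following sketch illustrates the idea on a single batch (the contrastive form and margin value are assumptions for illustration, not the paper's exact loss): edges connect samples the teacher assigns to the same class, and the student's features are encouraged to be smooth over that graph.

```python
# SNTG-style graph smoothness sketch: teacher predictions define the edges,
# a contrastive penalty acts on the student's features.
import numpy as np

def sntg_style_loss(features, teacher_logits, margin=1.0):
    """features: (B, D) student features; teacher_logits: (B, C) teacher outputs."""
    teacher_labels = teacher_logits.argmax(axis=1)
    same_class = teacher_labels[:, None] == teacher_labels[None, :]   # graph edges
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    # pull connected pairs together, push disconnected pairs past the margin
    loss = np.where(same_class, dist ** 2, np.maximum(0.0, margin - dist) ** 2)
    mask = ~np.eye(len(features), dtype=bool)     # exclude self-pairs
    return loss[mask].mean()

rng = np.random.default_rng(0)
f = rng.normal(size=(8, 16))
logits = rng.normal(size=(8, 3))
print(sntg_style_loss(f, logits))
```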
1 code implementation • 18 Sep 2017 • Jiaxin Shi, Jianfei Chen, Jun Zhu, Shengyang Sun, Yucen Luo, Yihong Gu, Yuhao Zhou
In this paper we introduce ZhuSuan, a Python probabilistic programming library for Bayesian deep learning, which conjoins the complementary advantages of Bayesian methods and deep learning.
no code implementations • NeurIPS 2016 • Yong Ren, Jialian Li, Yucen Luo, Jun Zhu
Maximum mean discrepancy (MMD) has been successfully applied to learn deep generative models for characterizing a joint distribution of variables via kernel mean embedding.
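For reference, a minimal sketch of the squared-MMD estimate under a Gaussian RBF kernel (the standard biased V-statistic form, not the paper's joint-distribution variant):

```python
# Empirical squared MMD between samples X ~ P and Y ~ Q with an RBF kernel.
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    kxx = rbf_kernel(x, x, bandwidth)
    kyy = rbf_kernel(y, y, bandwidth)
    kxy = rbf_kernel(x, y, bandwidth)
    return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(0.5, 1.0, size=(500, 2))      # shifted distribution: larger MMD
print(mmd2(x, y), mmd2(x, rng.normal(0.0, 1.0, size=(500, 2))))
```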