1 code implementation • 28 Oct 2024 • Yuqi Gu, Zhongyuan Lyu, Kaizheng Wang
We propose a general transfer learning framework for clustering given a main dataset and an auxiliary one about the same subjects.
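As a rough illustration of the transfer idea (a minimal sketch under toy assumptions, not the paper's algorithm): one simple baseline is to cluster the auxiliary dataset first and warm-start clustering on the main dataset from those centers.

```python
# Hypothetical sketch: warm-start clustering on a main dataset with
# centers learned from an auxiliary dataset about the same subjects.
# Data, cluster count, and the warm-start scheme are all assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two well-separated groups in both datasets (toy data).
aux = rng.normal(size=(500, 2)) + rng.choice([-3.0, 3.0], size=(500, 1))
main = rng.normal(size=(100, 2)) + rng.choice([-3.0, 3.0], size=(100, 1))

# Step 1: cluster the (larger) auxiliary dataset.
aux_centers = KMeans(n_clusters=2, n_init=10, random_state=0).fit(aux).cluster_centers_

# Step 2: use those centers to initialize clustering on the main dataset.
km = KMeans(n_clusters=2, init=aux_centers, n_init=1).fit(main)
print(km.cluster_centers_)
```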
1 code implementation • 10 Jun 2024 • Elise Han, Chengpiao Huang, Kaizheng Wang
Distribution-free prediction sets play a pivotal role in uncertainty quantification for complex statistical models.
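For context, the standard split-conformal recipe behind distribution-free prediction sets is sketched below; this is the textbook construction, not necessarily the variant studied in the paper.

```python
# Minimal split-conformal sketch: a distribution-free prediction interval
# around any fitted regression model. Data and model are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(400, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.3, size=400)

# Split into a fitting half and a calibration half.
X_fit, y_fit = X[:200], y[:200]
X_cal, y_cal = X[200:], y[200:]

model = LinearRegression().fit(X_fit, y_fit)
scores = np.abs(y_cal - model.predict(X_cal))      # calibration residuals

alpha = 0.1                                        # target miscoverage
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))  # conformal quantile index
q = np.sort(scores)[k - 1]

x_new = np.array([[1.0]])
pred = model.predict(x_new)[0]
print(f"90% prediction interval: [{pred - q:.2f}, {pred + q:.2f}]")
```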
no code implementations • 23 May 2024 • Kaizheng Wang, Fabio Cuzzolin, Keivan Shariatmadar, David Moens, Hans Hallez
This paper presents an innovative approach, called credal wrapper, to formulating a credal set representation of model averaging for Bayesian neural networks (BNNs) and deep ensembles, capable of improving uncertainty estimation in classification tasks.
Out-of-Distribution (OOD) Detection
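A hedged sketch of the core idea as described: collapse an ensemble's per-member class probabilities into per-class probability intervals, i.e., a credal set representation. The interval construction in the paper may differ from this simple envelope.

```python
# Sketch (assumption): lower/upper probability envelopes over ensemble
# members as a credal set representation of model averaging.
import numpy as np

# Toy ensemble output: 5 members x 3 classes, each row sums to 1.
probs = np.array([
    [0.70, 0.20, 0.10],
    [0.60, 0.30, 0.10],
    [0.80, 0.15, 0.05],
    [0.55, 0.35, 0.10],
    [0.65, 0.25, 0.10],
])

lower = probs.min(axis=0)   # lower envelope per class
upper = probs.max(axis=0)   # upper envelope per class
for c, (lo, hi) in enumerate(zip(lower, upper)):
    print(f"class {c}: P in [{lo:.2f}, {hi:.2f}]")

# A wide interval for the predicted class signals epistemic uncertainty,
# which is the kind of quantity used for OOD detection.
```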
1 code implementation • 13 Feb 2024 • Elise Han, Chengpiao Huang, Kaizheng Wang
We investigate model assessment and selection in a changing environment, by synthesizing datasets from both the current time period and historical epochs.
no code implementations • 10 Jan 2024 • Kaizheng Wang, Keivan Shariatmadar, Shireen Kudukkil Manchingal, Fabio Cuzzolin, David Moens, Hans Hallez
Uncertainty estimation is increasingly attractive for improving the reliability of neural networks.
no code implementations • 27 Oct 2023 • Chengpiao Huang, Kaizheng Wang
We develop a versatile framework for statistical learning in non-stationary environments.
no code implementations • 11 Jul 2023 • Shireen Kudukkil Manchingal, Muhammad Mubashar, Kaizheng Wang, Keivan Shariatmadar, Fabio Cuzzolin
RS-NN encodes the 'epistemic' uncertainty induced in machine learning by limited training sets via the size of the credal sets associated with the predicted belief functions.
1 code implementation • 20 Feb 2023 • Kaizheng Wang
We develop and analyze a principled approach to kernel ridge regression under covariate shift.
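A minimal sketch of importance-weighted kernel ridge regression, one standard way to handle covariate shift. The assumption of a known density ratio and the unclipped weights are simplifications, not the paper's estimator.

```python
# Sketch: weighted KRR solving min_f (1/n) sum_i w_i (y_i - f(x_i))^2 + lam ||f||^2,
# whose solution is alpha = (W K + n*lam*I)^{-1} W y. Density ratio assumed known.
import numpy as np

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))                  # source covariates ~ N(0, 1)
y = np.sin(2 * X[:, 0]) + rng.normal(scale=0.1, size=200)

# Assumed known density ratio w(x) = target N(1,1) over source N(0,1).
w = np.exp(-0.5 * (X[:, 0] - 1.0) ** 2) / np.exp(-0.5 * X[:, 0] ** 2)
W = np.diag(w)

lam = 0.1
K = rbf(X, X)
alpha = np.linalg.solve(W @ K + lam * len(X) * np.eye(len(X)), W @ y)

X_test = np.linspace(-2, 2, 5).reshape(-1, 1)
print(rbf(X_test, X) @ alpha)                  # predictions under the shift
```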
no code implementations • 17 Feb 2023 • Yang Yang, Kaixiong Xu, Kaizheng Wang
The cross-modal attention feature fusion module, in turn, mines the features of the Color and Thermal modalities so that each complements the other; the complemented features are then added element by element to construct global features, which are attention-weighted to achieve an effective fusion of the two modalities.
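A speculative sketch of that fusion pattern, with assumed shapes and a generic dot-product attention (the paper's module is certainly more elaborate): each modality is complemented via cross-modal attention, the results are summed element-wise, and the sum is attention-weighted into a global feature.

```python
# Hypothetical cross-modal fusion sketch; token counts, dimensions, and
# the attention form are all assumptions for illustration only.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attend(query_feats, context_feats):
    """Complement one modality's features by attending over the other."""
    attn = softmax(query_feats @ context_feats.T / np.sqrt(query_feats.shape[1]))
    return query_feats + attn @ context_feats

rng = np.random.default_rng(0)
color = rng.normal(size=(16, 64))     # 16 tokens of Color features
thermal = rng.normal(size=(16, 64))   # 16 tokens of Thermal features

fused = cross_attend(color, thermal) + cross_attend(thermal, color)  # element-wise sum
gate = softmax(fused.mean(axis=1))[:, None]                          # attention weights
global_feat = (gate * fused).sum(axis=0)                             # weighted global feature
print(global_feat.shape)              # (64,)
```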
no code implementations • 4 Jan 2023 • Yuling Yan, Kaizheng Wang, Philippe Rigollet
Gaussian mixture models form a flexible and expressive parametric family of distributions that has found use in a wide variety of applications.
no code implementations • 15 Dec 2022 • Kaizheng Wang, Xiao Xu, Xun Yu Zhou
We study a multi-factor block model for variable clustering and connect it to the regularized subspace clustering by formulating a distributionally robust version of the nodewise regression.
no code implementations • 1 Dec 2022 • Keivan Shariatmadar, Kaizheng Wang, Calvin R. Hubbard, Hans Hallez, David Moens
The goal of this survey paper is to briefly review the state of the art across a variety of methods and to refer the reader to other literature for more in-depth treatments of the topics discussed here.
no code implementations • 22 Oct 2022 • Henry Lam, Kaizheng Wang, Yuhang Wu, Yichen Zhang
We study the problem of multi-task non-smooth optimization that arises ubiquitously in statistical learning, decision-making and risk management.
1 code implementation • 10 Feb 2022 • Yaqi Duan, Kaizheng Wang
We study the multi-task learning problem that aims to simultaneously analyze multiple datasets collected from different sources and learn one model for each of them.
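One common strategy for this setting, sketched below under simple assumptions (a plain ridge-type coupling, which is not necessarily the paper's penalty), is to fit one linear model per task while shrinking all task coefficients toward a shared center.

```python
# Sketch: minimize sum_t ||y_t - X_t b_t||^2 + lam * sum_t ||b_t - c||^2
# by alternating between per-task ridge solves and updating the center c.
import numpy as np

rng = np.random.default_rng(0)
beta_true = rng.normal(size=3)
tasks = []
for _ in range(4):
    X = rng.normal(size=(50, 3))
    beta_t = beta_true + 0.1 * rng.normal(size=3)   # tasks are similar
    tasks.append((X, X @ beta_t + 0.1 * rng.normal(size=50)))

lam, center = 1.0, np.zeros(3)
for _ in range(20):                                  # alternate: tasks <-> center
    betas = [np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y + lam * center)
             for X, y in tasks]
    center = np.mean(betas, axis=0)                  # update the shared center
print(center.round(2), beta_true.round(2))
```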
no code implementations • 4 Oct 2021 • Damek Davis, Mateo Díaz, Kaizheng Wang
We investigate a clustering problem with data from a mixture of Gaussians that share a common but unknown, and potentially ill-conditioned, covariance matrix.
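To make the model concrete, here is a hedged Lloyd-type sketch that jointly re-estimates the shared covariance and clusters with the induced Mahalanobis metric; the paper's actual method and guarantees differ, this only illustrates the setting.

```python
# Sketch: alternate between (i) pooled covariance estimation from current
# assignments and (ii) reassignment by Mahalanobis distance to the means.
import numpy as np

rng = np.random.default_rng(0)
cov = np.array([[4.0, 1.5], [1.5, 1.0]])            # shared covariance
L = np.linalg.cholesky(cov)
X = np.vstack([rng.normal(size=(150, 2)) @ L.T + m
               for m in (np.array([2.0, 0.0]), np.array([-2.0, 0.0]))])

labels = (X[:, 0] < 0).astype(int)                  # crude initialization
for _ in range(10):
    mus = np.array([X[labels == k].mean(axis=0) for k in (0, 1)])
    resid = X - mus[labels]                          # within-cluster residuals
    P = np.linalg.inv(resid.T @ resid / len(X))      # pooled precision estimate
    d = np.stack([np.einsum('ij,jk,ik->i', X - m, P, X - m) for m in mus])
    labels = d.argmin(axis=0)                        # Mahalanobis reassignment
print(mus.round(2))
```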
no code implementations • 19 Mar 2021 • Shuchen Liu, Kaizheng Wang, Dirk Abel
In autonomous applications for mobility and transport, a high-rate and highly accurate vehicle-state estimation is achieved by fusing measurements of global navigation satellite systems (GNSS) and inertial sensors.
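The workhorse for this kind of fusion is a Kalman filter; below is a minimal linear sketch with a high-rate prediction step driven by inertial data and a low-rate update from GNSS position fixes. The model matrices and noise levels are illustrative assumptions, not the paper's configuration.

```python
# Minimal 1-D Kalman filter sketch for GNSS/INS-style fusion.
import numpy as np

dt = 0.01
F = np.array([[1, dt], [0, 1]])        # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])    # acceleration (IMU) input matrix
H = np.array([[1.0, 0.0]])             # GNSS measures position only
Q = 1e-4 * np.eye(2)                   # process noise (assumed)
R = np.array([[0.5]])                  # GNSS measurement noise (assumed)

x = np.zeros((2, 1))                   # state: [position, velocity]
P = np.eye(2)

def predict(x, P, accel):
    x = F @ x + B * accel              # propagate with the IMU reading
    return x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ (z - H @ x)
    return x, (np.eye(2) - K @ H) @ P

for k in range(100):                   # 100 Hz IMU, 10 Hz GNSS
    x, P = predict(x, P, accel=0.1)
    if k % 10 == 0:
        x, P = update(x, P, z=np.array([[0.0]]))
print(x.ravel())
```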
no code implementations • 24 Jun 2020 • Emmanuel Abbe, Jianqing Fan, Kaizheng Wang
Principal Component Analysis (PCA) is a powerful tool in statistics and machine learning.
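As a quick reference for the method under study (the paper itself analyzes the statistical behavior of PCA, not its implementation), the standard SVD-based computation is:

```python
# Standard PCA via SVD on centered data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated toy data

Xc = X - X.mean(axis=0)                  # center the columns
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:2]                      # top-2 principal directions
scores = Xc @ components.T               # projected data
explained = s[:2] ** 2 / (s ** 2).sum()  # explained variance ratio
print(explained)
```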
no code implementations • NeurIPS 2020 • Kaizheng Wang, Yuling Yan, Mateo Díaz
This paper considers a canonical clustering problem where one receives unlabeled samples drawn from a balanced mixture of two elliptical distributions and aims to learn a classifier that estimates the labels.
no code implementations • 31 Dec 2019 • Kaizheng Wang
This paper presents compact notations for concentration inequalities and convenient results to streamline probabilistic analysis.
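As a concrete instance of the kind of bound such notations are designed to compress, recall Hoeffding's inequality (a standard result, not one of the paper's contributions): for independent random variables $X_1, \dots, X_n$ taking values in $[a, b]$ with mean $\mu$,

```latex
\[
  \mathbb{P}\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| \ge t \right)
  \le 2 \exp\!\left( -\frac{2 n t^2}{(b-a)^2} \right).
\]
```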
no code implementations • 12 Jun 2019 • Jianqing Fan, Yongyi Guo, Kaizheng Wang
In addition, we give the conditions under which the one-step CEASE estimator is statistically efficient.
no code implementations • 12 Aug 2018 • Jianqing Fan, Kaizheng Wang, Yiqiao Zhong, Ziwei Zhu
Factor models are a class of powerful statistical models that have been widely used to deal with dependent measurements arising frequently in applications ranging from genomics and neuroscience to economics and finance.
no code implementations • ICML 2018 • Cong Ma, Kaizheng Wang, Yuejie Chi, Yuxin Chen
Focusing on two statistical estimation problems, i.e., solving random quadratic systems of equations and low-rank matrix completion, we establish that gradient descent achieves near-optimal statistical and computational guarantees without explicit regularization.
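For the first of these problems, a hedged sketch of the pipeline in question (spectral initialization followed by vanilla gradient descent on the quartic least-squares loss; step size and iteration count are assumptions):

```python
# Sketch: recover x from y_i = (a_i^T x)^2 by gradient descent on
# f(x) = (1/4m) sum_i ((a_i^T x)^2 - y_i)^2, with no explicit regularization.
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 200
x_star = rng.normal(size=n)
x_star /= np.linalg.norm(x_star)
A = rng.normal(size=(m, n))
y = (A @ x_star) ** 2

# Spectral initialization: top eigenvector of (1/m) sum_i y_i a_i a_i^T.
Y = (A * y[:, None]).T @ A / m
w, V = np.linalg.eigh(Y)
x = V[:, -1] * np.sqrt(y.mean())

eta = 0.05
for _ in range(1000):
    grad = ((A @ x) ** 2 - y) * (A @ x)    # per-sample gradient weights
    x -= eta * (A.T @ grad) / m            # vanilla gradient step

dist = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))  # sign ambiguity
print(f"estimation error: {dist:.2e}")
```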
no code implementations • ICML 2018 • Cong Ma, Kaizheng Wang, Yuejie Chi, Yuxin Chen
Recent years have seen a flurry of activities in designing provably efficient nonconvex procedures for solving statistical estimation problems.
no code implementations • 31 Jul 2017 • Yuxin Chen, Jianqing Fan, Cong Ma, Kaizheng Wang
This paper is concerned with the problem of top-$K$ ranking from pairwise comparisons.
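One spectral approach to this problem (a Rank Centrality-style sketch under a Bradley-Terry model; the comparison design and constants here are toy assumptions) builds a Markov chain from pairwise win fractions and ranks items by its stationary distribution:

```python
# Sketch: spectral top-K ranking from pairwise comparisons.
import numpy as np

rng = np.random.default_rng(0)
n, L = 6, 200
scores = np.exp(rng.normal(size=n))             # BTL preference scores

# Simulate L comparisons for every pair (i, j); wins[i, j] counts how
# often j beats i, i.e., transitions from i toward the winner j.
wins = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        p = scores[j] / (scores[i] + scores[j])
        w = rng.binomial(L, p)
        wins[i, j], wins[j, i] = w, L - w

P = wins / (L * n)                              # off-diagonal transition probs
np.fill_diagonal(P, 1 - P.sum(axis=1))          # lazy self-loops
w_, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmax(np.real(w_))])      # eigenvalue-1 eigenvector
pi = np.abs(pi) / np.abs(pi).sum()              # stationary distribution
print("true top-2:", np.argsort(scores)[-2:], "estimated:", np.argsort(pi)[-2:])
```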
1 code implementation • 27 Dec 2016 • Jianqing Fan, Yuan Ke, Kaizheng Wang
This paper studies model selection consistency for high dimensional sparse regression when data exhibits both cross-sectional and serial dependency.
Methodology
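A hedged sketch of one factor-adjustment idea for sparse regression with cross-sectionally dependent covariates: strip the top principal (factor) components out of the design, then run the lasso on the decorrelated residuals. The paper's exact procedure and theory are more refined than this illustration, and the factor count and penalty level below are assumptions.

```python
# Sketch: factor-adjusted sparse regression on a toy factor model
# X = F B' + U, followed by lasso support recovery on the adjusted data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 200, 50, 2
F = rng.normal(size=(n, k))                   # latent factors
B = rng.normal(size=(p, k))                   # factor loadings
X = F @ B.T + rng.normal(size=(n, p))         # cross-sectionally dependent X
beta = np.zeros(p); beta[:3] = 2.0            # sparse signal
y = X @ beta + rng.normal(size=n)

# Estimate factors by PCA on X and project them out of X and y.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
F_hat = U[:, :k] * s[:k]
proj = np.eye(n) - F_hat @ np.linalg.pinv(F_hat)
X_adj, y_adj = proj @ X, proj @ y             # idiosyncratic parts

model = Lasso(alpha=0.1).fit(X_adj, y_adj)
print(np.nonzero(model.coef_)[0])             # recovered support
```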