1 code implementation • 30 Mar 2024 • Tao Li, Qinghua Tao, Weihao Yan, Zehao Lei, Yingwen Wu, Kun Fang, Mingzhen He, Xiaolin Huang
Improving the generalization ability of modern deep neural networks (DNNs) is a fundamental challenge in machine learning.
no code implementations • 5 Feb 2024 • Kun Fang, Qinghua Tao, Kexin Lv, Mingzhen He, Xiaolin Huang, Jie Yang
Out-of-Distribution (OoD) detection is vital for the reliability of Deep Neural Networks (DNNs).
1 code implementation • 8 Oct 2023 • Fan He, Mingzhen He, Lei Shi, Xiaolin Huang, Johan A. K. Suykens
To enhance kernel flexibility, this paper introduces Locally-Adaptive-Bandwidths (LAB) as trainable parameters of the Radial Basis Function (RBF) kernel, giving rise to the LAB RBF kernel.
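The core idea, a bandwidth learned per data center rather than one global bandwidth, can be sketched as follows. This is a minimal NumPy illustration of per-center bandwidths, not the paper's implementation; the function name and shapes are assumptions for the example.

```python
import numpy as np

def lab_rbf_kernel(X, centers, bandwidths):
    """RBF kernel with one trainable bandwidth per center (illustrative sketch):
    K[j, i] = exp(-bandwidths[i] * ||X[j] - centers[i]||^2)."""
    # Squared Euclidean distance between every point and every center.
    sq_dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    # Each column i is scaled by its own bandwidth theta_i.
    return np.exp(-bandwidths[None, :] * sq_dists)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))          # 5 query points in R^3
centers = rng.normal(size=(4, 3))    # 4 kernel centers
theta = rng.uniform(0.1, 1.0, size=4)  # one (trainable) bandwidth per center
K = lab_rbf_kernel(X, centers, theta)
print(K.shape)  # (5, 4)
```

Because each center carries its own bandwidth, the resulting kernel matrix is generally asymmetric in its two arguments, which is what makes the bandwidths locally adaptive.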
no code implementations • 18 Sep 2022 • Mingzhen He, Fan He, Fanghui Liu, Xiaolin Huang
The theoretical foundation of RFFs is based on the Bochner theorem that relates symmetric, positive definite (PD) functions to probability measures.
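For the symmetric PD case, the Bochner construction behind RFFs can be sketched concretely: for the Gaussian kernel, the spectral measure is itself Gaussian, so sampling frequencies from it yields cosine features whose inner products approximate the kernel. A minimal sketch, with parameter names assumed for the example:

```python
import numpy as np

def rff_features(X, D, sigma=1.0, rng=None):
    """Random Fourier features for the Gaussian kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    By Bochner's theorem its spectral measure is N(0, sigma^{-2} I),
    so we sample frequencies W from it and random phases b."""
    rng = rng or np.random.default_rng(0)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))
Z = rff_features(X, D=5000, sigma=1.0, rng=rng)
K_approx = Z @ Z.T                                   # feature inner products
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K_exact = np.exp(-sq / 2.0)                          # exact Gaussian kernel
print(np.abs(K_approx - K_exact).max())              # small approximation error
```

The approximation error shrinks at rate O(1/sqrt(D)); the paper's point is that this construction relies on the kernel being symmetric and PD, which is exactly what fails for asymmetric kernels.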
no code implementations • 3 Feb 2022 • Mingzhen He, Fan He, Lei Shi, Xiaolin Huang, Johan A. K. Suykens
Asymmetric kernels naturally exist in real life, e.g., in conditional probabilities and directed graphs.
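The conditional-probability example can be made concrete: treating k(x, y) = p(y | x) as a kernel gives a matrix that is generally not symmetric, since p(y | x) differs from p(x | y). A tiny illustration with a made-up joint distribution:

```python
import numpy as np

# An arbitrary joint distribution p(x, y) over 3 states each
# (rows index x, columns index y); entries sum to 1.
P = np.array([[0.10, 0.05, 0.05],
              [0.02, 0.30, 0.08],
              [0.05, 0.05, 0.30]])

# "Kernel" K[x, y] = p(y | x): each row of the joint, normalized.
K = P / P.sum(axis=1, keepdims=True)
print(np.allclose(K, K.T))  # False: conditional probability is asymmetric
```

Standard kernel machines assume K = K.T, which is why such kernels need a dedicated treatment.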