no code implementations • 2 Feb 2024 • Yingyi Chen, Qinghua Tao, Francesco Tonin, Johan A. K. Suykens
In this work, we propose Kernel-Eigen Pair Sparse Variational Gaussian Processes (KEP-SVGP) for building uncertainty-aware self-attention, where the asymmetry of attention kernels is tackled by Kernel SVD (KSVD) and reduced complexity is achieved.
no code implementations • 12 Jun 2023 • Qinghua Tao, Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
We describe a nonlinear extension of the matrix Singular Value Decomposition through asymmetric kernels, namely KSVD.
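The core object here can be illustrated with a minimal numerical sketch: build a Gram matrix from an asymmetric kernel (so the matrix itself is not symmetric) and take its singular value decomposition, which yields two distinct sets of singular vectors, one per kernel argument. The toy kernel and data below are assumptions for illustration, not the paper's KSVD formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))   # inputs indexing the rows of the Gram matrix
Z = rng.normal(size=(5, 3))   # inputs indexing the columns (possibly a different set)

# A toy asymmetric kernel: kappa(x, z) != kappa(z, x) in general.
def kappa(x, z):
    return np.exp(-np.sum((x - 2.0 * z) ** 2))

# Asymmetric Gram matrix K[i, j] = kappa(X[i], Z[j]).
K = np.array([[kappa(x, z) for z in Z] for x in X])

# SVD handles the asymmetry directly: U spans the "row" feature directions,
# Vt the "column" ones, unlike an eigendecomposition, which would require K symmetric.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
```

The point of the sketch is only that SVD, unlike eigendecomposition, applies to such asymmetric kernel matrices without symmetrization.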
no code implementations • 12 Jun 2023 • Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
In the context of deep learning with kernel machines, the deep Restricted Kernel Machine (DRKM) framework allows multiple levels of kernel PCA (KPCA) and Least-Squares Support Vector Machines (LSSVM) to be combined into a deep architecture using visible and hidden units.
1 code implementation • 9 Jun 2023 • Francesco Tonin, Alex Lambert, Panagiotis Patrinos, Johan A. K. Suykens
The goal of this paper is to revisit Kernel Principal Component Analysis (KPCA) through dualization of a difference of convex functions.
1 code implementation • NeurIPS 2023 • Yingyi Chen, Qinghua Tao, Francesco Tonin, Johan A. K. Suykens
To the best of our knowledge, this is the first work that provides a primal-dual representation for the asymmetric kernel in self-attention and successfully applies it to modeling and optimization.
1 code implementation • 22 Feb 2023 • Francesco Tonin, Qinghua Tao, Panagiotis Patrinos, Johan A. K. Suykens
Principal Component Analysis (PCA) and its nonlinear extension Kernel PCA (KPCA) are widely used across science and industry for data analysis and dimensionality reduction.
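For reference, textbook KPCA (the object both papers build on, not their specific contributions) amounts to an eigendecomposition of the double-centered kernel matrix; the RBF kernel and bandwidth below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
n = X.shape[0]

# RBF (Gaussian) kernel matrix.
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq / 2.0)

# Double-centering implements centering in feature space.
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H

# Eigendecomposition; leading eigenvectors give the nonlinear principal components.
evals, evecs = np.linalg.eigh(Kc)
evals, evecs = evals[::-1], evecs[:, ::-1]   # sort in descending order

# Project the data onto the first two kernel principal components.
alphas = evecs[:, :2] / np.sqrt(np.maximum(evals[:2], 1e-12))
scores = Kc @ alphas
```

Linear PCA is recovered when the kernel is the plain inner product `K = X @ X.T`.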
1 code implementation • 31 Jan 2023 • Sonny Achten, Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
We present a deep Graph Convolutional Kernel Machine (GCKM) for semi-supervised node classification in graphs.
1 code implementation • 23 Jul 2022 • Qinghua Tao, Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
In our method, the dual variables, playing the role of hidden features, are shared by all views to construct a common latent space, coupling the views by learning projections from view-specific spaces.
1 code implementation • 16 Feb 2021 • Francesco Tonin, Arun Pandey, Panagiotis Patrinos, Johan A. K. Suykens
Detecting out-of-distribution (OOD) samples is an essential requirement for the deployment of machine learning systems in the real world.
no code implementations • 25 Nov 2020 • Francesco Tonin, Panagiotis Patrinos, Johan A. K. Suykens
We introduce Constr-DRKM, a deep kernel method for the unsupervised learning of disentangled data representations.