Search Results for author: Kim Batselier

Found 18 papers, 7 papers with code

Tensor Network-Constrained Kernel Machines as Gaussian Processes

no code implementations · 28 Mar 2024 · Frederiek Wesel, Kim Batselier

We analyze the convergence of both CPD and TT-constrained models, and show that, for the same number of model parameters, TT yields models exhibiting more GP behavior than CPD.

Gaussian Processes · Tensor Networks
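
To make the parameter comparison concrete, a minimal counting sketch (assumed conventions: D input dimensions, M basis functions per dimension, rank R, TT boundary ranks equal to 1; not code from the paper):

```python
def cpd_params(D, M, R):
    # CPD: one M x R factor matrix per input dimension
    return D * M * R

def tt_params(D, M, R):
    # TT: two boundary cores of size M*R, D-2 interior cores of size R*M*R
    return 2 * M * R + (D - 2) * R * M * R

print(cpd_params(D=10, M=20, R=8))  # 1600
print(tt_params(D=10, M=20, R=4))   # 2720: interior TT cores scale as R^2
```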

A divide-and-conquer approach for sparse recovery of high dimensional signals

no code implementations · 7 Mar 2024 · Aron Bevelander, Kim Batselier, Nitin Jonathan Myers

Compressed sensing (CS) techniques demand significant storage and computational resources when recovering high-dimensional sparse signals.
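
For context, the baseline recovery problem behind the abstract is the usual l1-regularized least squares; a minimal ISTA sketch (an illustrative assumption, not the paper's divide-and-conquer method):

```python
import numpy as np

def ista(A, y, lam, n_iter=500):
    """Iterative soft thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x + A.T @ (y - A @ x) / L             # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400)) / 10          # wide sensing matrix
x_true = np.zeros(400)
x_true[rng.choice(400, 5, replace=False)] = 1.0   # 5-sparse signal
x_hat = ista(A, A @ x_true, lam=0.01)
```

The storage and compute cost of steps like A.T @ (y - A @ x) is what becomes prohibitive in high dimensions, which is the bottleneck the paper targets.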

Projecting basis functions with tensor networks for Gaussian process regression

no code implementations · 31 Oct 2023 · Clara Menzen, Eva Memmel, Kim Batselier, Manon Kok

The benefit of our approach comes from the projection onto a smaller subspace: it adapts the shape of the basis functions to the given data and allows for efficient computations in the smaller subspace.

Bayesian Inference · regression +1
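
A minimal sketch of the projection idea (illustrative: here the projection matrix is a random orthonormal basis, whereas the paper obtains it from a tensor network fitted to the data):

```python
import numpy as np

N, M, m = 1000, 256, 16                  # data points, basis functions, subspace size
rng = np.random.default_rng(0)
Phi = rng.standard_normal((N, M))        # stand-in for evaluated basis functions
V, _ = np.linalg.qr(rng.standard_normal((M, m)))  # orthonormal M x m projection

Phi_small = Phi @ V                      # projected features: N x m instead of N x M
# Downstream computations now run in the m-dimensional subspace, e.g. a
# regularized least-squares fit costing O(N m^2) instead of O(N M^2):
y = rng.standard_normal(N)
w = np.linalg.solve(Phi_small.T @ Phi_small + 1e-2 * np.eye(m), Phi_small.T @ y)
```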

Quantized Fourier and Polynomial Features for more Expressive Tensor Network Models

1 code implementation · 11 Sep 2023 · Frederiek Wesel, Kim Batselier

Unless one considers the dual formulation of the learning problem, which renders exact large-scale learning infeasible, the tensor-product structure of these features causes the number of model parameters to grow exponentially in the dimensionality of the data, prohibiting the treatment of high-dimensional problems.

Quantization

How Informative is the Approximation Error from Tensor Decomposition for Neural Network Compression?

no code implementations · 9 May 2023 · Jetze T. Schuurmans, Kim Batselier, Julian F. P. Kooij

While scaling the approximation error is commonly used to account for the different sizes of layers, the average correlation across layers is smaller than the correlation across all choices (i.e., layers, decompositions, and levels of compression) before fine-tuning.

Neural Network Compression · Tensor Decomposition
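
For concreteness, "scaling the approximation error" by layer size usually means a relative error; a minimal sketch of computing it and correlating it with accuracy drops (the numbers are hypothetical placeholders):

```python
import numpy as np

def relative_error(W, W_hat):
    # Frobenius approximation error, scaled by the layer's own norm
    return np.linalg.norm(W - W_hat) / np.linalg.norm(W)

# Hypothetical measurements over choices of layer, decomposition, compression level
rel_errors = np.array([0.10, 0.25, 0.40, 0.15, 0.33])
acc_drops = np.array([0.5, 2.1, 4.0, 0.9, 2.8])       # percentage points
print(np.corrcoef(rel_errors, acc_drops)[0, 1])       # correlation across all choices
```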

Tensor Network Kalman Filtering for Large-Scale LS-SVMs

no code implementations · 26 Oct 2021 · Maximilian Lucassen, Johan A. K. Suykens, Kim Batselier

Least squares support vector machines are a commonly used supervised learning method for nonlinear regression and classification.

regression · Tensor Networks
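
For reference, the LS-SVM dual problem reduces to a single linear system; a minimal dense sketch (the paper's point is that a tensor network Kalman filter can tackle this system when it is far too large to form explicitly):

```python
import numpy as np

def lssvm_fit(X, y, gamma=1.0, lengthscale=1.0):
    """Solve the LS-SVM dual system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * lengthscale ** 2))      # RBF kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                        # bias b, dual weights alpha
```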

Large-Scale Learning with Fourier Features and Tensor Decompositions

1 code implementation · NeurIPS 2021 · Frederiek Wesel, Kim Batselier

Random Fourier features provide a way to tackle large-scale machine learning problems with kernel methods.

Tensor Decomposition
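
A minimal sketch of the standard random Fourier feature construction the abstract builds on (the paper's contribution, constraining the feature weights with a tensor decomposition, is not shown here):

```python
import numpy as np

def rff(X, D=500, lengthscale=1.0, seed=0):
    """Map X (n x d) to features whose inner products approximate the RBF kernel."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], D)) / lengthscale  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, D)                    # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = np.random.default_rng(1).standard_normal((5, 3))
Z = rff(X)
print(Z @ Z.T)  # approximates the 5 x 5 RBF kernel matrix of X
```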

Alternating linear scheme in a Bayesian framework for low-rank tensor approximation

no code implementations · 21 Dec 2020 · Clara Menzen, Manon Kok, Kim Batselier

Multiway data often occurs naturally in a tensorial format, which can be approximately represented by a low-rank tensor decomposition.

Bayesian Inference · Tensor Decomposition +1
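
A minimal sketch of the classical (non-Bayesian) alternating scheme for a rank-R CP approximation of a 3-way tensor, i.e. the kind of linear update the paper wraps in a Bayesian framework:

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product, shape (I*J) x R
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(X, R, n_iter=50, seed=0):
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
    for _ in range(n_iter):
        # Each factor update is a linear least-squares solve with the others fixed
        A = X.reshape(I, -1) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X.transpose(1, 0, 2).reshape(J, -1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X.transpose(2, 0, 1).reshape(K, -1) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```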

Nonlinear system identification with regularized Tensor Network B-splines

1 code implementation · 17 Mar 2020 · Ridvan Karagoz, Kim Batselier

This article introduces the Tensor Network B-spline model for the regularized identification of nonlinear systems using a nonlinear autoregressive exogenous (NARX) approach.
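
For context, the regressors of a NARX model are lagged outputs and inputs; a minimal sketch of building them (illustrative; the paper's contribution is representing the B-spline weights over such regressors as a tensor network):

```python
import numpy as np

def narx_regressors(u, y, na=2, nb=2):
    """Build rows [y(t-1)..y(t-na), u(t-1)..u(t-nb)] with targets y(t)."""
    t0 = max(na, nb)
    Phi = np.array([
        np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
        for t in range(t0, len(y))
    ])
    return Phi, y[t0:]
```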

Kernelized Support Tensor Train Machines

no code implementations · 2 Jan 2020 · Cong Chen, Kim Batselier, Wenjian Yu, Ngai Wong

In this paper, we propose a tensor train (TT)-based kernel technique for the first time, and apply it to the conventional support vector machine (SVM) for image classification.

BIG-bench Machine Learning · Image Classification

Matrix Product Operator Restricted Boltzmann Machines

no code implementations · 12 Nov 2018 · Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong

This work presents the matrix product operator RBM (MPORBM), which utilizes a tensor network generalization of the matrix-variate and tensor-variate RBMs (MvRBM/TvRBM), preserves input formats in both the visible and hidden layers, and results in higher expressive power.

Denoising · Dimensionality Reduction +1
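
A minimal sketch of what the MPO representation means: the full weight matrix is recovered by contracting a chain of 4-way cores (the shapes here are illustrative assumptions):

```python
import numpy as np

def mpo_to_matrix(cores):
    """Contract MPO cores of shape (r_prev, m_k, n_k, r_next) into a full matrix."""
    W = cores[0]
    for G in cores[1:]:
        # Contract the shared bond index, then merge the row and column indices
        W = np.einsum('amnb,bpqc->ampnqc', W, G)
        r0, m1, m2, n1, n2, r1 = W.shape
        W = W.reshape(r0, m1 * m2, n1 * n2, r1)
    return W[0, :, :, 0]  # boundary bond dimensions are 1

rng = np.random.default_rng(0)
shapes = [(1, 4, 4, 3), (3, 4, 4, 3), (3, 4, 4, 1)]
W = mpo_to_matrix([rng.standard_normal(s) for s in shapes])
print(W.shape)  # (64, 64): 4096 entries from only 240 core parameters
```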

Deep Compression of Sum-Product Networks on Tensor Networks

no code implementations · 9 Nov 2018 · Ching-Yun Ko, Cong Chen, Yuke Zhang, Kim Batselier, Ngai Wong

Sum-product networks (SPNs) represent an emerging class of neural networks with clear probabilistic semantics and superior inference speed over graphical models.

Tensor Networks

A Support Tensor Train Machine

no code implementations · 17 Apr 2018 · Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong

There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms.

BIG-bench Machine Learning

Parallelized Tensor Train Learning of Polynomial Classifiers

1 code implementation · 20 Dec 2016 · Zhongming Chen, Kim Batselier, Johan A. K. Suykens, Ngai Wong

In pattern classification, polynomial classifiers are well-studied methods as they are capable of generating complex decision surfaces.

General Classification
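
For context, a minimal sketch of a polynomial classifier via an explicit monomial feature map (illustrative; the paper instead stores the exponentially many polynomial coefficients as a tensor train and learns them in parallel):

```python
import numpy as np
from itertools import combinations_with_replacement

def poly_features(X, degree=2):
    """Explicit monomial features up to the given degree (combinatorial growth)."""
    cols = [np.ones(len(X))]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), d):
            cols.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(cols)

# A linear classifier trained on poly_features(X) realizes a polynomial
# decision surface in the original input space.
```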

A Tensor Network Kalman filter with an application in recursive MIMO Volterra system identification

1 code implementation · 18 Oct 2016 · Kim Batselier, Zhongming Chen, Ngai Wong

This article introduces a Tensor Network Kalman filter, which can estimate state vectors that are exponentially large without ever having to explicitly construct them.

Systems and Control
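
For reference, a minimal dense sketch of the Kalman recursion the abstract refers to (the paper's contribution is keeping the state estimate and covariance in tensor network form so that exponentially large states never have to be constructed):

```python
import numpy as np

def kalman_step(x, P, A, C, Q, R, y):
    """One predict/update step of the standard Kalman filter."""
    x, P = A @ x, A @ P @ A.T + Q          # predict
    S = C @ P @ C.T + R                    # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (y - C @ x)                # update state estimate
    P = P - K @ C @ P                      # update covariance
    return x, P
```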

A Constructive Algorithm for Decomposing a Tensor into a Finite Sum of Orthonormal Rank-1 Terms

1 code implementation · 7 Jul 2014 · Kim Batselier, Haotian Liu, Ngai Wong

We propose a constructive algorithm that decomposes an arbitrary real tensor into a finite sum of orthonormal rank-1 outer products.

Numerical Analysis
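
A minimal sketch of the idea for a 3-way tensor: SVD one unfolding, then SVD each reshaped right singular vector, giving a finite, exact sum of orthonormal rank-1 outer products (the paper's algorithm generalizes this recursion to arbitrary order):

```python
import numpy as np

def orth_rank1_terms(X):
    """Decompose a 3-way tensor into weighted outer products of orthonormal vectors."""
    I, J, K = X.shape
    U, s, Vt = np.linalg.svd(X.reshape(I, J * K), full_matrices=False)
    terms = []
    for r in range(len(s)):
        # Each right singular vector is itself a J x K matrix: split it again
        A, t, Bt = np.linalg.svd(Vt[r].reshape(J, K), full_matrices=False)
        for q in range(len(t)):
            terms.append((s[r] * t[q], U[:, r], A[:, q], Bt[q]))
    return terms

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4, 5))
X_hat = sum(w * np.einsum('i,j,k->ijk', u, a, b) for w, u, a, b in orth_rank1_terms(X))
print(np.linalg.norm(X - X_hat))  # ~1e-14: the finite sum reconstructs X exactly
```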
