Search Results for author: Kim Batselier

Found 13 papers, 6 papers with code

Tensor Network Kalman Filtering for Large-Scale LS-SVMs

no code implementations • 26 Oct 2021 • Maximilian Lucassen, Johan A. K. Suykens, Kim Batselier

Least squares support vector machines are a commonly used supervised learning method for nonlinear regression and classification.

Tensor Networks
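
For reference, the sketch below shows the standard dense LS-SVM regression problem that this paper scales up: the dual solution follows from one linear system in the kernel matrix. The RBF kernel, the value gamma = 10, and the toy data are illustrative assumptions; the paper's actual contribution, a tensor network Kalman filter that avoids forming this system explicitly, is not shown.

```python
# Minimal LS-SVM regression sketch (standard dense solve, NOT the paper's
# tensor-network Kalman filter); kernel, gamma and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

def rbf_kernel(A, B, lengthscale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

gamma = 10.0                        # regularization parameter (illustrative)
N = X.shape[0]
K = rbf_kernel(X, X)

# LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A = np.zeros((N + 1, N + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(N) / gamma
rhs = np.concatenate(([0.0], y))
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

# Predict at a few new points
X_test = np.linspace(-3, 3, 5)[:, None]
y_pred = rbf_kernel(X_test, X) @ alpha + b
print(y_pred)
```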

Large-Scale Learning with Fourier Features and Tensor Decompositions

1 code implementation • NeurIPS 2021 • Frederiek Wesel, Kim Batselier

Random Fourier features provide a way to tackle large-scale machine learning problems with kernel methods.

Tensor Decomposition
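
A small sketch of the random Fourier feature idea mentioned in the abstract: sampled frequencies and phases turn the RBF kernel into an inner product of explicit features. The input dimension, lengthscale, and feature count below are illustrative assumptions, and the paper's tensor decomposition of the resulting model weights is not reproduced here.

```python
# Random Fourier features approximating an RBF kernel (illustrative only; the
# paper combines such features with a tensor decomposition, not shown here).
import numpy as np

rng = np.random.default_rng(0)
d, D, lengthscale = 5, 1000, 1.0            # input dim, feature count (assumed)

# Sample spectral frequencies and phases of the RBF kernel
W = rng.standard_normal((d, D)) / lengthscale
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(x):
    """Random Fourier feature map; rff(x) @ rff(y) ≈ k(x, y) for the RBF kernel."""
    return np.sqrt(2.0 / D) * np.cos(x @ W + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-0.5 * np.sum((x - y) ** 2) / lengthscale**2)
approx = rff(x) @ rff(y)
print(exact, approx)                        # the two values should be close
```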

Alternating linear scheme in a Bayesian framework for low-rank tensor approximation

no code implementations • 21 Dec 2020 • Clara Menzen, Manon Kok, Kim Batselier

Multiway data often occurs naturally in a tensorial format, which can be approximately represented by a low-rank tensor decomposition.

Bayesian Inference • Tensor Decomposition
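
As a rough analogue of an alternating linear scheme, the sketch below runs plain alternating least squares on a rank-R matrix factorization. The factor sizes, rank, and noise level are made up for illustration; the paper's Bayesian treatment and tensor-train format are not included.

```python
# Plain alternating least squares for a rank-R matrix factorization, as the
# simplest analogue of an alternating linear scheme; the paper's Bayesian
# framework and tensor-train setting are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
R = 3
A_true = rng.standard_normal((60, R)) @ rng.standard_normal((R, 40))
Y = A_true + 0.01 * rng.standard_normal(A_true.shape)

U = rng.standard_normal((60, R))
V = rng.standard_normal((40, R))
for _ in range(50):
    # Each subproblem is a linear least-squares solve with the other factor fixed
    U = np.linalg.lstsq(V, Y.T, rcond=None)[0].T
    V = np.linalg.lstsq(U, Y, rcond=None)[0].T

print(np.linalg.norm(Y - U @ V.T) / np.linalg.norm(Y))   # small relative error
```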

Nonlinear system identification with regularized Tensor Network B-splines

1 code implementation • 17 Mar 2020 • Ridvan Karagoz, Kim Batselier

This article introduces the Tensor Network B-spline model for the regularized identification of nonlinear systems using a nonlinear autoregressive exogenous (NARX) approach.
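
To make the NARX setting concrete, here is a minimal linear ARX baseline fitted by least squares on lagged inputs and outputs. The simulated second-order system and lag orders are illustrative assumptions; the paper's B-spline surface stored as a Tensor Network is not reproduced in this sketch.

```python
# Linear ARX baseline: least squares on lagged inputs/outputs (the paper
# replaces this linear map with a Tensor Network B-spline model, not shown).
import numpy as np

rng = np.random.default_rng(0)
T = 500
u = rng.standard_normal(T)
y = np.zeros(T)
for k in range(2, T):               # simulate a simple second-order system
    y[k] = 0.6 * y[k - 1] - 0.2 * y[k - 2] + 0.5 * u[k - 1] + 0.1 * u[k - 2]

# Regressor matrix of lagged outputs and inputs: [y[k-1], y[k-2], u[k-1], u[k-2]]
Phi = np.asarray([[y[k - 1], y[k - 2], u[k - 1], u[k - 2]] for k in range(2, T)])
theta = np.linalg.lstsq(Phi, y[2:], rcond=None)[0]
print(theta)                        # recovers [0.6, -0.2, 0.5, 0.1]
```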

Kernelized Support Tensor Train Machines

no code implementations • 2 Jan 2020 • Cong Chen, Kim Batselier, Wenjian Yu, Ngai Wong

In this paper, we propose, for the first time, a tensor train (TT)-based kernel technique and apply it to the conventional support vector machine (SVM) for image classification.

BIG-bench Machine Learning • Image Classification

Matrix Product Operator Restricted Boltzmann Machines

no code implementations • 12 Nov 2018 • Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong

This work presents the matrix product operator RBM (MPORBM), which utilizes a tensor network generalization of the matrix-variate and tensor-variate RBMs (MvRBM/TvRBM), preserves input formats in both the visible and hidden layers, and results in higher expressive power.

Denoising • Dimensionality Reduction • +1

Deep Compression of Sum-Product Networks on Tensor Networks

no code implementations • 9 Nov 2018 • Ching-Yun Ko, Cong Chen, Yuke Zhang, Kim Batselier, Ngai Wong

Sum-product networks (SPNs) represent an emerging class of neural networks with clear probabilistic semantics and superior inference speed over graphical models.

Tensor Networks

A Support Tensor Train Machine

no code implementations • 17 Apr 2018 • Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong

There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms.

BIG-bench Machine Learning

Parallelized Tensor Train Learning of Polynomial Classifiers

1 code implementation • 20 Dec 2016 • Zhongming Chen, Kim Batselier, Johan A. K. Suykens, Ngai Wong

In pattern classification, polynomial classifiers are well-studied methods as they are capable of generating complex decision surfaces.

General Classification
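
For contrast with the paper's approach, the sketch below builds a plain polynomial classifier by explicit feature expansion and a least-squares fit. The toy two-class problem and the degree are illustrative assumptions; the tensor-train representation the paper uses to sidestep the exponential number of monomial features is not shown.

```python
# Plain polynomial classifier via explicit feature expansion; the paper avoids
# this blow-up by storing the weight tensor as a tensor train (not shown here).
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 2))
y = np.sign(X[:, 0] ** 2 + X[:, 1] ** 2 - 1.0)    # circular decision boundary

def poly_features(X, degree=2):
    """All monomials of the inputs up to the given degree, plus a constant."""
    cols = [np.ones(X.shape[0])]
    for deg in range(1, degree + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), deg):
            cols.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(cols)

Phi = poly_features(X, degree=2)
w = np.linalg.lstsq(Phi, y, rcond=None)[0]        # least-squares classifier
acc = np.mean(np.sign(Phi @ w) == y)
print(f"training accuracy: {acc:.2f}")
```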

A Tensor Network Kalman filter with an application in recursive MIMO Volterra system identification

1 code implementation • 18 Oct 2016 • Kim Batselier, Zhongming Chen, Ngai Wong

This article introduces a Tensor Network Kalman filter, which can estimate state vectors that are exponentially large without ever having to explicitly construct them.

Systems and Control
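
For orientation, here is a textbook Kalman filter recursion on a small dense state-space model. The system matrices, noise levels, and horizon are made-up examples; the tensor network machinery that lets the paper handle exponentially large state vectors is not part of this sketch.

```python
# Textbook Kalman filter predict/update for a small linear state-space model;
# the paper represents huge states and covariances as tensor networks instead.
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 4, 2, 50
A = 0.95 * np.eye(n)                    # state transition (assumed)
C = rng.standard_normal((m, n))         # measurement matrix (assumed)
Q, R = 0.01 * np.eye(n), 0.1 * np.eye(m)

x_true = rng.standard_normal(n)
x_hat, P = np.zeros(n), np.eye(n)
for _ in range(T):
    # Simulate the true system and a noisy measurement
    x_true = A @ x_true + rng.multivariate_normal(np.zeros(n), Q)
    y = C @ x_true + rng.multivariate_normal(np.zeros(m), R)

    # Predict
    x_hat = A @ x_hat
    P = A @ P @ A.T + Q

    # Update
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (y - C @ x_hat)
    P = (np.eye(n) - K @ C) @ P

print(np.linalg.norm(x_true - x_hat))   # estimation error after T steps
```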

A Constructive Algorithm for Decomposing a Tensor into a Finite Sum of Orthonormal Rank-1 Terms

1 code implementation • 7 Jul 2014 • Kim Batselier, Haotian Liu, Ngai Wong

We propose a constructive algorithm that decomposes an arbitrary real tensor into a finite sum of orthonormal rank-1 outer products.

Numerical Analysis
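
In the matrix special case the statement is exactly the SVD: a matrix is a finite sum of orthonormal rank-1 outer products. The short check below illustrates only that special case; the paper's constructive extension to higher-order tensors is not reproduced.

```python
# Matrix special case: the SVD writes a matrix as a finite sum of orthonormal
# rank-1 outer products; the paper extends this constructively to tensors.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rebuild A as a sum of rank-1 terms sigma_i * u_i v_i^T
A_rebuilt = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A, A_rebuilt))                          # exact finite sum
print(np.allclose(U.T @ U, np.eye(4)), np.allclose(Vt @ Vt.T, np.eye(4)))
```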
