no code implementations • 17 Apr 2018 • Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong
There has been growing interest in extending traditional vector-based machine learning techniques to their tensor forms.
no code implementations • 9 Nov 2018 • Ching-Yun Ko, Cong Chen, Yuke Zhang, Kim Batselier, Ngai Wong
Sum-product networks (SPNs) represent an emerging class of neural networks with clear probabilistic semantics and superior inference speed over graphical models.
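As a generic illustration of the probabilistic semantics mentioned above (not the paper's model), here is a minimal sum-product network over two binary variables: product nodes multiply distributions over disjoint scopes, and a sum node mixes them with nonnegative weights that sum to one, so the whole network defines a normalized distribution that can be evaluated in a single bottom-up pass.

```python
# A tiny sum-product network over two binary variables X1, X2.
# Leaves are Bernoulli distributions; product nodes multiply
# independent scopes; the sum node mixes with weights summing to 1.

def leaf(p):
    # Bernoulli leaf: returns P(x) for x in {0, 1}
    return lambda x: p if x == 1 else 1.0 - p

# Two mixture components, each a product of independent leaves
comp1 = (leaf(0.8), leaf(0.3))   # P1(X1), P1(X2)
comp2 = (leaf(0.2), leaf(0.9))   # P2(X1), P2(X2)
weights = (0.6, 0.4)             # sum-node weights

def spn(x1, x2):
    # One bottom-up pass: products of leaves, then a weighted sum
    prods = [c[0](x1) * c[1](x2) for c in (comp1, comp2)]
    return sum(w * p for w, p in zip(weights, prods))

# A valid (complete and decomposable) SPN is normalized:
total = sum(spn(a, b) for a in (0, 1) for b in (0, 1))
```

Inference (any marginal or the full likelihood) costs one pass over the network, which is the speed advantage over general graphical models.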
no code implementations • 12 Nov 2018 • Cong Chen, Kim Batselier, Ching-Yun Ko, Ngai Wong
This work presents the matrix product operator RBM (MPORBM) that utilizes a tensor network generalization of Mv/TvRBM, preserves input formats in both the visible and hidden layers, and results in higher expressive power.
no code implementations • 2 Jan 2020 • Cong Chen, Kim Batselier, Wenjian Yu, Ngai Wong
In this paper, we propose a tensor train (TT)-based kernel technique for the first time, and apply it to the conventional support vector machine (SVM) for image classification.
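As a generic illustration of the tensor-product structure such kernels exploit (this is not the paper's TT construction), a product kernel factorizes over input dimensions, so its Gram matrix is the elementwise (Hadamard) product of one-dimensional Gram matrices; for Gaussian factors this recovers the usual multivariate RBF kernel:

```python
import numpy as np

def gram_1d(x, sigma=1.0):
    # Gaussian Gram matrix for a single feature dimension
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 4))

# Tensor-product kernel: Hadamard product of 1-D Gram matrices
K = np.ones((20, 20))
for d in range(X.shape[1]):
    K *= gram_1d(X[:, d])

# For Gaussian factors this equals the multivariate RBF kernel
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_ref = np.exp(-sq / 2.0)
```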
no code implementations • 21 Dec 2020 • Clara Menzen, Manon Kok, Kim Batselier
Multiway data often naturally occurs in a tensorial format which can be approximately represented by a low-rank tensor decomposition.
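One standard low-rank format for such multiway data is the tensor train (TT); as a hedged sketch (a textbook TT-SVD, not this paper's method), the decomposition is obtained by sweeping sequential truncated SVDs over reshapings of the tensor:

```python
import numpy as np

def tt_svd(T, eps=1e-10):
    """Decompose tensor T into tensor-train cores via sequential SVDs."""
    dims = T.shape
    cores, r = [], 1
    M = T.reshape(dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rank = max(1, int(np.sum(s > eps * s[0])))  # truncated TT-rank
        cores.append(U[:, :rank].reshape(r, dims[k], rank))
        M = (s[:rank, None] * Vt[:rank]).reshape(rank * dims[k + 1], -1)
        r = rank
    cores.append(M.reshape(r, dims[-1], 1))
    return cores

def tt_full(cores):
    """Contract TT cores back into the full tensor."""
    out = cores[0]
    for G in cores[1:]:
        out = np.tensordot(out, G, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

T = np.random.default_rng(0).standard_normal((3, 4, 5))
cores = tt_svd(T)
```

With no truncation (`eps` small) the reconstruction is exact; larger `eps` trades accuracy for lower TT-ranks and storage.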
no code implementations • 26 Oct 2021 • Maximilian Lucassen, Johan A. K. Suykens, Kim Batselier
Least squares support vector machines are a commonly used supervised learning method for nonlinear regression and classification.
no code implementations • 25 May 2022 • Eva Memmel, Clara Menzen, Jetze Schuurmans, Frederiek Wesel, Kim Batselier
Furthermore, we argue that better algorithms should be evaluated in terms of both accuracy and efficiency.
no code implementations • 9 May 2023 • Jetze T. Schuurmans, Kim Batselier, Julian F. P. Kooij
While scaling the approximation error is commonly used to account for the different sizes of layers, the average correlation across layers is smaller than across all choices (i.e., layers, decompositions, and levels of compression) before fine-tuning.
1 code implementation • 11 Sep 2023 • Frederiek Wesel, Kim Batselier
Unless one resorts to the dual formulation of the learning problem, which renders exact large-scale learning infeasible, the exponential increase of model parameters with the dimensionality of the data, caused by their tensor-product structure, prohibits tackling high-dimensional problems.
no code implementations • 31 Oct 2023 • Clara Menzen, Eva Memmel, Kim Batselier, Manon Kok
The benefit of our approach comes from the projection onto a smaller subspace: it adapts the shape of the basis functions to the given data, and it enables efficient computations in the smaller subspace.
no code implementations • 7 Mar 2024 • Aron Bevelander, Kim Batselier, Nitin Jonathan Myers
Compressed sensing (CS) techniques demand significant storage and computational resources when recovering high-dimensional sparse signals.
no code implementations • 28 Mar 2024 • Frederiek Wesel, Kim Batselier
We analyze the convergence of both CPD and TT-constrained models, and show how TT yields models exhibiting more GP behavior compared to CPD, for the same number of model parameters.
1 code implementation • 7 Jul 2014 • Kim Batselier, Haotian Liu, Ngai Wong
We propose a constructive algorithm that decomposes an arbitrary real tensor into a finite sum of orthonormal rank-1 outer products.
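For the matrix (order-2) special case, such a decomposition is exactly the SVD: any real matrix is a finite sum of rank-1 outer products with orthonormal factor vectors. A minimal sketch (the order-2 analogue, not the paper's tensor algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))

# SVD writes A as a finite sum of rank-1 outer products
# sigma_i * u_i v_i^T with orthonormal {u_i} and {v_i}.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
recon = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s)))
```

The constructive algorithm in the paper extends this idea to tensors of arbitrary order, where no single canonical SVD exists.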
Numerical Analysis
1 code implementation • 20 Dec 2016 • Zhongming Chen, Kim Batselier, Johan A. K. Suykens, Ngai Wong
In pattern classification, polynomial classifiers are well-studied methods as they are capable of generating complex decision surfaces.
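As a hedged illustration of why polynomial classifiers generate complex decision surfaces (a minimal degree-2 example, not the paper's tensor-based method): mapping 2-D inputs through degree-2 monomials and fitting a ridge least-squares model to ±1 labels yields a quadratic decision boundary, enough to separate a class enclosed by a circle:

```python
import numpy as np

def poly_features(X):
    # Degree-2 monomials of a 2-D input: [1, x1, x2, x1^2, x1*x2, x2^2]
    x1, x2 = X[:, 0], X[:, 1]
    return np.stack([np.ones_like(x1), x1, x2,
                     x1 ** 2, x1 * x2, x2 ** 2], axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(400, 2))
y = np.where(np.linalg.norm(X, axis=1) < 1.0, 1.0, -1.0)  # circular class

Phi = poly_features(X)
w = np.linalg.solve(Phi.T @ Phi + 1e-6 * np.eye(6), Phi.T @ y)  # ridge LS
acc = np.mean(np.sign(Phi @ w) == y)
```

A linear classifier on the raw inputs cannot represent this boundary; the quadratic feature map makes it linear in the expanded space.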
1 code implementation • 17 Apr 2018 • Ching-Yun Ko, Kim Batselier, Wenjian Yu, Ngai Wong
We propose a new tensor completion method based on tensor trains.
1 code implementation • 18 Oct 2016 • Kim Batselier, Zhongming Chen, Ngai Wong
This article introduces a Tensor Network Kalman filter, which can estimate state vectors that are exponentially large without ever having to explicitly construct them.
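For reference, a minimal sketch of the standard predict/update recursion that the Tensor Network Kalman filter compresses (in the paper, the state vector and covariance are stored in tensor-train form instead of explicitly; this dense version is only an illustration):

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update step of the standard (dense) Kalman filter."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with measurement y
    S = C @ P_pred @ C.T + R               # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# Track a static 2-D state from noisy measurements
rng = np.random.default_rng(0)
x_true = np.array([1.0, -1.0])
x, P = np.zeros(2), np.eye(2)
A, C = np.eye(2), np.eye(2)
Q, R = 1e-4 * np.eye(2), 0.1 * np.eye(2)
for _ in range(50):
    y = x_true + rng.normal(0.0, np.sqrt(0.1), 2)
    x, P = kalman_step(x, P, y, A, C, Q, R)
```

Every quantity here is linear-algebraic, which is what makes a tensor-network reformulation possible when the state dimension grows exponentially.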
Systems and Control
1 code implementation • 17 Mar 2020 • Ridvan Karagoz, Kim Batselier
This article introduces the Tensor Network B-spline model for the regularized identification of nonlinear systems using a nonlinear autoregressive exogenous (NARX) approach.
1 code implementation • NeurIPS 2021 • Frederiek Wesel, Kim Batselier
Random Fourier features provide a way to tackle large-scale machine learning problems with kernel methods.
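A minimal sketch of the random Fourier feature construction of Rahimi and Recht, which this paper builds on (the paper's own contribution is a different, tensor-based construction): sampling frequencies from the kernel's spectral density gives an explicit feature map whose inner products approximate the Gaussian kernel.

```python
import numpy as np

def rff(X, D, sigma, rng):
    """Random Fourier features approximating the Gaussian (RBF) kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / sigma, size=(d, D))  # spectral samples
    b = rng.uniform(0.0, 2 * np.pi, size=D)        # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5))
Z = rff(X, D=5000, sigma=1.0, rng=rng)

K_approx = Z @ Z.T  # linear model in Z approximates kernel machine
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq / 2.0)
max_err = np.abs(K_approx - K_exact).max()
```

The approximation error shrinks as O(1/sqrt(D)), so a linear method on `Z` scales to sample sizes where forming the exact Gram matrix is prohibitive.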