1 code implementation • 3 Dec 2018 • Cristian Rusu
We describe new results and algorithms for two different, but related, problems which deal with circulant matrices: learning shift-invariant components from training data and calculating the shift (or alignment) between two given signals.
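A standard way to estimate the (circular) shift between two signals is via the peak of the FFT-based cross-correlation; the snippet below is a minimal sketch of that idea, not the paper's algorithm.

```python
import numpy as np

def estimate_shift(x, y):
    """Estimate the circular shift k with y ≈ np.roll(x, k) from the
    peak of the FFT-based circular cross-correlation."""
    xcorr = np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x)))
    return int(np.argmax(np.abs(xcorr)))

x = np.random.randn(64)
assert estimate_shift(x, np.roll(x, 11)) == 11
```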
1 code implementation • 19 Nov 2018 • Cristian Rusu
The ability to decompose a signal in an orthonormal basis (a set of orthogonal components, each normalized to have unit length) using a fast numerical procedure rests at the heart of many signal processing methods and applications.
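For an orthonormal basis, analysis and synthesis are both plain matrix products: the coefficients are inner products with the basis vectors, and reconstruction is exact. A minimal numpy sketch, using a random orthonormal basis from a QR factorization purely for illustration:

```python
import numpy as np

n = 8
U, _ = np.linalg.qr(np.random.randn(n, n))  # a random orthonormal basis
x = np.random.randn(n)

c = U.T @ x     # analysis: coefficients are plain inner products
x_rec = U @ c   # synthesis: exact reconstruction, since U @ U.T = I
assert np.allclose(x, x_rec)
```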
1 code implementation • 9 Dec 2018 • Cristian Rusu
In this paper, we propose several dictionary learning algorithms for sparse representations that also impose specific structures on the learned dictionaries so that they are numerically efficient to apply: a reduced number of additions and multiplications, or even the avoidance of multiplications altogether.
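As a toy illustration of multiplication-free structure (an assumption for this sketch, not the paper's construction): if the dictionary entries are restricted to {-1, 0, +1}, applying the dictionary to a sparse code needs only signed additions.

```python
import numpy as np

# toy dictionary with entries restricted to {-1, 0, +1}: each active atom
# contributes +z_j or -z_j per row, so D @ z reduces to signed additions
D = np.random.choice([-1, 0, 1], size=(16, 32))
z = np.zeros(32)
z[[3, 17]] = [1.5, -0.7]   # a 2-sparse code
x = D @ z
```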
1 code implementation • 7 Jul 2020 • Cristian Rusu, Paul Irofti
Separable, or Kronecker product, dictionaries provide natural decompositions for 2D signals, such as images.
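The separability rests on the identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which lets a 2D dictionary act on an image without ever forming the large Kronecker matrix; a quick numerical check:

```python
import numpy as np

A, B = np.random.randn(4, 4), np.random.randn(3, 3)
X = np.random.randn(3, 4)   # a small 2D "signal"

# (A ⊗ B) vec(X) = vec(B X A^T), with column-major vec: the large
# Kronecker dictionary never has to be formed explicitly
lhs = np.kron(A, B) @ X.ravel(order="F")
rhs = (B @ X @ A.T).ravel(order="F")
assert np.allclose(lhs, rhs)
```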
1 code implementation • 11 Jan 2022 • Paul Irofti, Cristian Rusu, Andrei Pătraşcu
In this paper, we use a particular dictionary learning (DL) formulation that seeks a uniform sparse representation model to detect the underlying subspace of the majority of samples in a dataset, using a K-SVD-type algorithm.
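The underlying anomaly-scoring idea can be sketched with a truncated SVD standing in for the K-SVD-type dictionary: fit the majority subspace, then flag the samples with the largest residuals.

```python
import numpy as np

# stand-in sketch: fit the majority subspace by a truncated SVD (instead of
# the K-SVD-type algorithm) and score samples by their residual norms
Y = np.random.randn(20, 500)                 # 500 samples in R^20
U, _, _ = np.linalg.svd(Y, full_matrices=False)
P = U[:, :5] @ U[:, :5].T                    # projector onto a 5-dim subspace
residuals = np.linalg.norm(Y - P @ Y, axis=0)
anomalies = np.argsort(residuals)[-10:]      # the 10 most atypical samples
```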
no code implementations • 24 Nov 2016 • Cristian Rusu, John Thompson
We also propose a method to construct fast square but non-orthogonal dictionaries that are factorized as a product of a few transforms, which can be viewed as a further generalization of Givens rotations to the non-orthogonal setting.
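A Givens rotation acts on only two coordinates, so applying it costs O(1) rather than O(n); a product of a few such transforms is therefore fast to apply. A minimal sketch of the orthogonal case (the paper's non-orthogonal generalization replaces the cosine/sine pair with more general values):

```python
import numpy as np

def apply_givens(x, i, j, theta):
    """Apply a Givens rotation acting only on coordinates (i, j): O(1) work,
    versus O(n) for a general matrix-vector row update."""
    c, s = np.cos(theta), np.sin(theta)
    xi, xj = x[i], x[j]
    x[i], x[j] = c * xi - s * xj, s * xi + c * xj
    return x

x = np.random.randn(8)
nrm = np.linalg.norm(x)
apply_givens(x, 1, 5, 0.3)
assert np.isclose(np.linalg.norm(x), nrm)  # orthogonal: norm preserved
```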
no code implementations • 24 Nov 2016 • Cristian Rusu, Nuria Gonzalez-Prelcic, Robert Heath
Dictionary learning is the task of determining a data-dependent transform that yields a sparse representation of some observed data.
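The task can be illustrated with a generic alternation (hard-thresholding sparse coding plus a MOD-style least-squares dictionary update, chosen here only for brevity, not the paper's method):

```python
import numpy as np

Y = np.random.randn(16, 200)                       # observed data
D = np.random.randn(16, 32)
D /= np.linalg.norm(D, axis=0)                     # unit-norm atoms
s = 4                                              # sparsity per sample
for _ in range(10):
    G = D.T @ Y                                    # atom/sample correlations
    X = np.zeros_like(G)
    idx = np.argsort(-np.abs(G), axis=0)[:s]       # s largest per column
    np.put_along_axis(X, idx, np.take_along_axis(G, idx, axis=0), axis=0)
    D = Y @ np.linalg.pinv(X)                      # MOD-style update
    D /= np.linalg.norm(D, axis=0) + 1e-12
```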
no code implementations • 18 Jul 2019 • Cristian Rusu, Lorenzo Rosasco
We study the problem of approximating orthogonal matrices so that their application is numerically fast and yet accurate.
no code implementations • 22 Feb 2020 • Cristian Rusu, Lorenzo Rosasco
We investigate numerically efficient approximations of eigenspaces associated to symmetric and general matrices.
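The quality of an eigenspace approximation is naturally measured by principal angles; the sketch below uses a crude randomized range-finder as a stand-in for the paper's structured approximations.

```python
import numpy as np
from scipy.linalg import subspace_angles

S = np.random.randn(12, 12)
S = S + S.T                                      # symmetric test matrix
_, V = np.linalg.eigh(S)
U = V[:, -3:]                                    # exact dominant eigenspace
Q, _ = np.linalg.qr(S @ np.random.randn(12, 3))  # crude randomized sketch
print(subspace_angles(U, Q))                     # principal angles: error
```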
no code implementations • 28 Jun 2019 • Cristian Rusu
In this note, we discuss the shift retrieval problems, both classical and compressed, and provide connections between them using circulant matrices.
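The key fact linking the two is that every circulant matrix is diagonalized by the DFT: C = F⁻¹ diag(fft(c)) F, where c is the first column of C. A quick numerical check:

```python
import numpy as np
from scipy.linalg import circulant

c = np.random.randn(8)
F = np.fft.fft(np.eye(8))   # the (unnormalized) DFT matrix
# every circulant matrix is diagonalized by the DFT:
# C = F^{-1} diag(fft(c)) F, where c is the first column of C
C_rec = np.linalg.inv(F) @ np.diag(np.fft.fft(c)) @ F
assert np.allclose(circulant(c), C_rec)
```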
no code implementations • 30 Jul 2021 • Cristian Rusu
In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix.
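A common baseline for extracting one sparse principal component (a generic sketch, not the paper's algorithm) is power iteration with hard thresholding:

```python
import numpy as np

def sparse_pc(S, k, iters=100):
    """A k-sparse leading principal component of covariance S, via
    truncated power iteration (a generic baseline, not the paper's method)."""
    v = np.random.randn(S.shape[0])
    for _ in range(iters):
        v = S @ v
        v[np.argsort(np.abs(v))[:-k]] = 0.0  # keep only the k largest entries
        v /= np.linalg.norm(v)
    return v

X = np.random.randn(200, 20)        # 200 samples of 20 variables
v = sparse_pc(np.cov(X, rowvar=False), k=5)
```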
no code implementations • 7 Apr 2022 • Joan Palacios, Nuria González-Prelcic, Cristian Rusu
Compressive approaches provide a means of obtaining effective high resolution channel estimates in millimeter wave MIMO systems, despite the use of analog and hybrid architectures.
no code implementations • 24 Aug 2022 • Joan Palacios, Nuria González-Prelcic, Cristian Rusu
Greedy approaches in general, and orthogonal matching pursuit in particular, are the most commonly used sparse recovery techniques in a wide range of applications.
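For reference, a textbook orthogonal matching pursuit iteration (a generic sketch): greedily pick the atom most correlated with the residual, then refit by least squares on the selected support.

```python
import numpy as np

def omp(A, y, s):
    """Recover an s-sparse x with y ≈ A x by orthogonal matching pursuit:
    greedily select the atom most correlated with the residual, then refit."""
    residual, support = y.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

A = np.random.randn(30, 60)
A /= np.linalg.norm(A, axis=0)              # unit-norm atoms
x_true = np.zeros(60)
x_true[[5, 12, 40]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, s=3)             # recovers the support w.h.p.
```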
no code implementations • 13 Jul 2023 • Denis C. Ilie-Ablachim, Bogdan Dumitrescu, Cristian Rusu
This paper presents a kernelized version of the t-SNE algorithm, capable of mapping high-dimensional data to a low-dimensional space while preserving the pairwise distances between the data points in a non-Euclidean metric.
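Such non-Euclidean distances can be induced by a kernel k via d(x, y)² = k(x, x) + k(y, y) − 2 k(x, y); the sketch below illustrates this with an RBF kernel (the kernel choice is an assumption for illustration).

```python
import numpy as np

def kernel_distances(X, k):
    """Pairwise distances in the feature space induced by kernel k:
    d(x, y)^2 = k(x, x) + k(y, y) - 2 k(x, y)."""
    K = k(X, X)
    d = np.diag(K)
    return np.sqrt(np.maximum(d[:, None] + d[None, :] - 2.0 * K, 0.0))

rbf = lambda X, Y: np.exp(-np.linalg.norm(X[:, None] - Y[None, :], axis=2) ** 2)
D = kernel_distances(np.random.randn(50, 4), rbf)  # distances to feed t-SNE
```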
1 code implementation • 5 Mar 2024 • Andrei Pătraşcu, Cristian Rusu, Paul Irofti
Sparsifying transforms have, over the last decades, become widely known tools for finding structured sparse representations of signals in certain transform domains.
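In the transform model, sparsity lives in the transform domain: one seeks W such that W y is approximately sparse, so coding is just a threshold, with no pursuit or least-squares inversion. A minimal sketch, with an orthogonal stand-in transform:

```python
import numpy as np

# transform model: W y should already be (nearly) sparse, so sparse coding
# is a cheap hard threshold, in contrast to synthesis models where coding
# requires a pursuit algorithm
W, _ = np.linalg.qr(np.random.randn(16, 16))   # a stand-in transform
y = np.random.randn(16)
z = W @ y
z[np.abs(z) < np.sort(np.abs(z))[-4]] = 0.0    # keep the 4 largest entries
```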