Search Results for author: Cristian Rusu

Found 10 papers, 3 papers with code

An iterative coordinate descent algorithm to compute sparse low-rank approximations

no code implementations • 30 Jul 2021 • Cristian Rusu

In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix.

Dimensionality Reduction
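For orientation, a generic truncated power-iteration sketch of the sparse-PCA problem this paper addresses (a standard heuristic shown for illustration only; it is not the paper's iterative coordinate descent algorithm):

```python
import numpy as np

def sparse_pc(S, k, iters=50, seed=0):
    # One sparse principal component of covariance S: after each
    # power-iteration multiply, keep only the k largest-magnitude
    # entries and renormalize (generic heuristic, not the paper's
    # coordinate descent method).
    v = np.random.default_rng(seed).standard_normal(S.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = S @ v
        small = np.argsort(np.abs(v))[:-k]   # all but the k largest entries
        v[small] = 0.0
        v /= np.linalg.norm(v)
    return v

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 10))            # toy data matrix
v = sparse_pc(X.T @ X / 50, k=3)             # one 3-sparse, unit-norm component
```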

Efficient and Parallel Separable Dictionary Learning

no code implementations • 7 Jul 2020 • Cristian Rusu, Paul Irofti

Separable, or Kronecker product, dictionaries provide natural decompositions for 2D signals, such as images.

Dictionary Learning • Image Denoising
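As background for why Kronecker dictionaries are natural for 2D signals, a minimal NumPy sketch of the standard identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which lets the transform act on the 2D signal directly without ever forming the large Kronecker matrix (illustrative background, not the paper's learning algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
A = rng.standard_normal((m, m))          # factor acting along one axis
B = rng.standard_normal((n, n))          # factor acting along the other
X = rng.standard_normal((n, m))          # a 2D signal (e.g., an image patch)

# Kronecker identity with column-major vec: (A ⊗ B) vec(X) = vec(B @ X @ A.T),
# so the mn x mn matrix never has to be formed or stored.
lhs = np.kron(A, B) @ X.ravel(order="F")
rhs = (B @ X @ A.T).ravel(order="F")
assert np.allclose(lhs, rhs)
```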

Constructing fast approximate eigenspaces with application to the fast graph Fourier transforms

no code implementations • 22 Feb 2020 • Cristian Rusu, Lorenzo Rosasco

We investigate numerically efficient approximations of eigenspaces associated to symmetric and general matrices.

Fast approximation of orthogonal matrices and application to PCA

no code implementations • 18 Jul 2019 • Cristian Rusu, Lorenzo Rosasco

We study the problem of approximating orthogonal matrices so that their application is numerically fast and yet accurate.

A Note on Shift Retrieval Problems

no code implementations • 28 Jun 2019 • Cristian Rusu

In this note, we discuss shift retrieval problems, both classical and compressed, and provide connections between them using circulant matrices.
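A minimal sketch of the FFT route to classical shift retrieval: circulant matrices are diagonalized by the DFT, so the circular cross-correlation, whose peak gives the shift, costs O(n log n). This is illustrative only and does not cover the compressed variant the note studies:

```python
import numpy as np

def estimate_shift(x, y):
    # Peak of the circular cross-correlation of x and y: the lag s
    # with y ≈ np.roll(x, s). Computed via FFT because circulant
    # matrices are diagonalized by the DFT.
    c = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(y))
    return int(np.argmax(np.real(c)))

x = np.random.default_rng(1).standard_normal(64)
y = np.roll(x, 5)                 # circularly shift x by 5 samples
s = estimate_shift(x, y)          # recovers the shift, 5
```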

Learning Multiplication-free Linear Transformations

1 code implementation • 9 Dec 2018 • Cristian Rusu

In this paper, we propose several dictionary learning algorithms for sparse representations that also impose specific structures on the learned dictionaries so that they are numerically efficient to use: they require a reduced number of additions and multiplications, or even avoid multiplications altogether.

Dictionary Learning
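To illustrate the multiplication-free idea in general terms (a toy sketch, not the paper's learned structures): if every entry of a transform is zero or a signed power of two, the matrix-vector product reduces to bit shifts and additions.

```python
def matvec_shift_add(rows, x):
    # rows[i] lists (j, shift, sign) triples encoding entry
    # T[i][j] = sign * 2**shift; zero entries are simply omitted.
    # T @ x then needs only bit shifts and integer additions.
    return [sum(sign * (x[j] << shift) for j, shift, sign in row)
            for row in rows]

# Encodes T = [[1, 0, -2],
#              [0, 4,  1]]
rows = [[(0, 0, 1), (2, 1, -1)],
        [(1, 2, 1), (2, 0, 1)]]
out = matvec_shift_add(rows, [3, 1, 2])   # [3 - 4, 4 + 2] == [-1, 6]
```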

On learning with shift-invariant structures

1 code implementation • 3 Dec 2018 • Cristian Rusu

We describe new results and algorithms for two different, but related, problems which deal with circulant matrices: learning shift-invariant components from training data and calculating the shift (or alignment) between two given signals.

Dictionary Learning

Approximate Eigenvalue Decompositions of Linear Transformations with a Few Householder Reflectors

1 code implementation • 19 Nov 2018 • Cristian Rusu

The ability to decompose a signal in an orthonormal basis (a set of orthogonal components, each normalized to have unit length) using a fast numerical procedure rests at the heart of many signal processing methods and applications.
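A minimal NumPy sketch of a single Householder reflector, the orthogonal building block the title refers to (illustrative background only; the paper's contribution concerns approximating eigendecompositions with few such reflectors):

```python
import numpy as np

def householder(u):
    # H = I - 2 u u^T / (u^T u) is symmetric and orthogonal (H @ H = I);
    # applying it to a vector x costs O(n) via x - 2 * u * (u @ x),
    # so no dense n x n matrix-vector product is needed.
    u = u / np.linalg.norm(u)
    return np.eye(len(u)) - 2.0 * np.outer(u, u)

H = householder(np.array([1.0, 2.0, 3.0]))
assert np.allclose(H @ H, np.eye(3))      # orthogonal and involutory
```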

Fast Orthonormal Sparsifying Transforms Based on Householder Reflectors

no code implementations • 24 Nov 2016 • Cristian Rusu, Nuria Gonzalez-Prelcic, Robert Heath

Dictionary learning is the task of determining a data-dependent transform that yields a sparse representation of some observed data.

Dictionary Learning

Learning Fast Sparsifying Transforms

no code implementations • 24 Nov 2016 • Cristian Rusu, John Thompson

We also propose a method to construct fast square but non-orthogonal dictionaries that are factorized as a product of a few transforms that can be viewed as a further generalization of Givens rotations to the non-orthogonal setting.

Dictionary Learning
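For context, a Givens rotation touches only two coordinates, which is what makes products of a few such transforms fast to apply (an illustrative sketch of the standard orthogonal case, not the paper's non-orthogonal generalization):

```python
import numpy as np

def apply_givens(x, i, j, theta):
    # Rotate coordinates i and j of x by angle theta; all other entries
    # are untouched, so k rotations cost O(k) regardless of dimension.
    c, s = np.cos(theta), np.sin(theta)
    y = x.copy()
    y[i] = c * x[i] - s * x[j]
    y[j] = s * x[i] + c * x[j]
    return y

x = np.array([1.0, 2.0, 3.0, 4.0])
y = apply_givens(x, 0, 2, np.pi / 4)
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))  # rotation preserves norm
```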
