# An iterative coordinate descent algorithm to compute sparse low-rank approximations

no code implementations · 30 Jul 2021

In this paper, we describe a new algorithm to build a few sparse principal components from a given data matrix.
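The paper's own algorithm is not reproduced here; as a hedged illustration of the problem it solves, the sketch below uses a generic sparse-PCA heuristic (truncated power iteration) to find a principal direction with few nonzero entries. All dimensions and the sparsity level `k` are illustrative.

```python
import numpy as np

# NOT the paper's coordinate descent method: a generic truncated power
# iteration, shown only to illustrate sparse principal components,
# i.e. leading directions constrained to have few nonzero entries.
rng = np.random.default_rng(3)
A = rng.standard_normal((50, 10))
C = A.T @ A                      # covariance-like matrix of the data
k = 3                            # desired number of nonzeros (illustrative)

v = rng.standard_normal(10)
v /= np.linalg.norm(v)
for _ in range(100):
    v = C @ v
    idx = np.argsort(np.abs(v))[:-k]  # zero all but the k largest entries
    v[idx] = 0.0
    v /= np.linalg.norm(v)
```

The result is a unit vector supported on at most `k` coordinates, the kind of sparse component such algorithms aim to build.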

# Efficient and Parallel Separable Dictionary Learning

no code implementations · 7 Jul 2020

Separable, or Kronecker product, dictionaries provide natural decompositions for 2D signals, such as images.
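The decomposition referred to can be stated concretely: a separable dictionary represents a 2D signal `X` as `D1 @ S @ D2.T`, which is equivalent to applying the Kronecker product dictionary to the vectorized coefficients. A minimal sketch with made-up dimensions and random (non-learned) dictionaries:

```python
import numpy as np

# Illustrative only: random dictionaries standing in for learned ones.
rng = np.random.default_rng(0)
D1 = rng.standard_normal((8, 12))  # left dictionary (acts on image rows)
D2 = rng.standard_normal((8, 12))  # right dictionary (acts on image columns)
S = rng.standard_normal((12, 12))  # coefficient matrix (sparse in practice)

X = D1 @ S @ D2.T                                # 2D form of the signal
x_vec = np.kron(D2, D1) @ S.flatten(order="F")   # vec(X) = (D2 kron D1) vec(S)

assert np.allclose(X.flatten(order="F"), x_vec)
```

The 2D form avoids ever materializing the large Kronecker matrix, which is what makes separable dictionaries efficient for images.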

# Constructing fast approximate eigenspaces with application to the fast graph Fourier transforms

no code implementations · 22 Feb 2020

We investigate numerically efficient approximations of eigenspaces associated with symmetric and general matrices.

# Fast approximation of orthogonal matrices and application to PCA

no code implementations · 18 Jul 2019

We study the problem of approximating orthogonal matrices so that their application is numerically fast and yet accurate.
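One standard route to numerically fast orthogonal approximations (not necessarily the paper's construction) is a product of a few Givens rotations: each factor touches only two coordinates, so applying the product costs time proportional to the number of factors rather than `n^2`. The index pairs and angles below are arbitrary illustrations.

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n Givens rotation acting on coordinates (i, j)."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i], G[i, j], G[j, i], G[j, j] = c, -s, s, c
    return G

n = 8
# A product of a few rotations is still orthogonal but cheap to apply.
factors = [givens(n, 0, 3, 0.4), givens(n, 2, 5, -0.7), givens(n, 1, 6, 1.1)]
Q_approx = np.linalg.multi_dot(factors)

assert np.allclose(Q_approx @ Q_approx.T, np.eye(n))
```

In a fast-PCA setting, such a structured orthogonal factor replaces the dense matrix of principal directions at application time.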

# A Note on Shift Retrieval Problems

no code implementations · 28 Jun 2019

In this note, we discuss the classical and compressed shift retrieval problems and provide connections between them using circulant matrices.
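The circulant connection can be made concrete for the classical problem: circulant matrices are diagonalized by the DFT, so the shift between a signal and its circular shift can be recovered from the circular cross-correlation computed with the FFT. A minimal sketch with illustrative values:

```python
import numpy as np

# Classical shift retrieval: y is a circular shift of x; recover the shift.
rng = np.random.default_rng(1)
x = rng.standard_normal(64)
true_shift = 17
y = np.roll(x, true_shift)

# Circular cross-correlation via the FFT (circulants are DFT-diagonalized).
corr = np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x))).real
recovered = int(np.argmax(corr))

assert recovered == true_shift
```

The compressed variant, where only compressive measurements of the signals are available, is not shown here.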

# Learning Multiplication-free Linear Transformations

1 code implementation · 9 Dec 2018

In this paper, we propose several dictionary learning algorithms for sparse representations that also impose specific structure on the learned dictionaries so that they are numerically efficient to apply: the number of additions and multiplications is reduced, and multiplications can even be avoided altogether.
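A multiplication-free transform can be illustrated simply (the matrix below is made up, not learned by the paper's algorithms): if every entry is zero or a signed power of two, applying the transform needs only bit shifts and additions.

```python
import numpy as np

# Illustrative matrix with entries in {0, ±2^k}: applying it requires
# only shifts and adds, no general multiplications.
D = np.array([[1,  0, -2],
              [4,  1,  0],
              [0, -1,  1]])

x = np.array([3, 5, 7])
y = D @ x
# e.g. row 0: 1*3 + 0*5 + (-2)*7 = 3 - 14, computed as 3 - (7 << 1)
assert y[0] == 3 - (7 << 1)
```

This is the hardware-friendly regime the learned structured dictionaries target.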


# On learning with shift-invariant structures

1 code implementation · 3 Dec 2018

We describe new results and algorithms for two different, but related, problems which deal with circulant matrices: learning shift-invariant components from training data and calculating the shift (or alignment) between two given signals.


# Approximate Eigenvalue Decompositions of Linear Transformations with a Few Householder Reflectors

1 code implementation · 19 Nov 2018

The ability to decompose a signal in an orthonormal basis (a set of orthogonal components, each normalized to have unit length) using a fast numerical procedure rests at the heart of many signal processing methods and applications.
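The Householder reflector in the title is a simple object to sketch: `H = I - 2 v v^T` for a unit vector `v` is orthogonal and symmetric, and applying it costs only a rank-one update, which is what makes products of a few reflectors fast. Values below are illustrative.

```python
import numpy as np

# A Householder reflector: orthogonal, symmetric, O(n) to apply.
rng = np.random.default_rng(2)
n = 16
v = rng.standard_normal(n)
v /= np.linalg.norm(v)                 # reflectors need a unit vector
H = np.eye(n) - 2.0 * np.outer(v, v)

assert np.allclose(H @ H.T, np.eye(n))  # orthogonality

x = rng.standard_normal(n)
# Rank-one form: O(n) instead of the O(n^2) dense matrix-vector product.
fast = x - 2.0 * v * (v @ x)
assert np.allclose(H @ x, fast)
```

An approximate eigenvalue decomposition built from a few such reflectors inherits this cheap application cost.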


# Fast Orthonormal Sparsifying Transforms Based on Householder Reflectors

Dictionary learning is the task of determining a data-dependent transform that yields a sparse representation of some observed data.
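The definition can be illustrated with the easiest special case (not the paper's method): when the transform `D` is orthonormal, exact coefficients are `D.T @ y`, and thresholding them exposes the sparse representation the learned transform is meant to provide. Dictionary, signal, and threshold below are all made up.

```python
import numpy as np

# Illustrative sparse coding step against a fixed orthonormal dictionary.
rng = np.random.default_rng(4)
D, _ = np.linalg.qr(rng.standard_normal((16, 16)))   # orthonormal dictionary
y = D @ np.array([5.0, -3.0] + [0.0] * 14)           # signal with 2 active atoms

s = D.T @ y                 # exact coefficients, since D is orthonormal
s[np.abs(s) < 1.0] = 0.0    # keep only the significant coefficients

assert np.count_nonzero(s) == 2
assert np.allclose(D @ s, y)
```

Dictionary learning alternates a sparse coding step like this with an update of `D` itself to fit the observed data.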

# Learning Fast Sparsifying Transforms

no code implementations · 24 Nov 2016

We also propose a method to construct fast square but non-orthogonal dictionaries, factorized as a product of a few transforms that can be viewed as a further generalization of Givens rotations to the non-orthogonal setting.
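One plausible reading of such a generalization (a sketch, not necessarily the paper's exact factor): a Givens rotation modifies only two coordinates, and replacing its 2x2 rotation block with an arbitrary invertible 2x2 block keeps the factor just as cheap to apply while dropping orthogonality. Sizes, indices, and blocks below are illustrative.

```python
import numpy as np

def embed_2x2(n, i, j, B):
    """Embed the 2x2 block B into an n x n identity at rows/cols (i, j)."""
    G = np.eye(n)
    G[np.ix_([i, j], [i, j])] = B
    return G

theta = 0.3
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
G_orth = embed_2x2(6, 1, 4, rotation)        # classical Givens rotation
assert np.allclose(G_orth @ G_orth.T, np.eye(6))

B = np.array([[2.0, 0.5], [0.1, 1.0]])       # arbitrary invertible 2x2 block
G_gen = embed_2x2(6, 1, 4, B)                # non-orthogonal, still O(1) entries changed
assert not np.isclose(np.linalg.det(G_gen), 0.0)
```

Either factor touches only coordinates 1 and 4, so products of a few such factors remain fast to apply.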
