1 code implementation • ICLR 2020 • Roy Mor, Erez Peterfreund, Matan Gavish, Amir Globerson
Are there optimal strategies for the attacker or the authenticator?
no code implementations • 15 Apr 2020 • Erez Peterfreund, Ofir Lindenbaum, Felix Dietrich, Tom Bertalan, Matan Gavish, Ioannis G. Kevrekidis, Ronald R. Coifman
We propose a deep-learning based method for obtaining standardized data coordinates from scientific measurements. Data observations are modeled as samples from an unknown, non-linear deformation of an underlying Riemannian manifold, which is parametrized by a few normalized latent variables.
no code implementations • NeurIPS 2017 • Danny Barash, Matan Gavish
A low rank matrix X has been contaminated by uniformly distributed noise, missing values, outliers and corrupt entries.
no code implementations • 22 May 2017 • Matan Gavish, Regev Schweiger, Elior Rahmani, Eran Halperin
Various problems in data analysis and statistical genetics call for recovery of a column-sparse, low-rank matrix from noisy observations.
1 code implementation • 29 May 2014 • Matan Gavish, David L. Donoho
For a variety of loss functions, including mean squared error (MSE, i.e. the squared Frobenius norm), the nuclear norm loss, and the operator norm loss, we show that in this framework there is a well-defined asymptotic loss, which we evaluate precisely in each case.
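As a concrete illustration of the Frobenius-loss case, the sketch below applies the asymptotically optimal singular-value shrinker for a square matrix in white noise, which in noise units takes the form eta(y) = sqrt(y^2 - 4) for y >= 2 and 0 otherwise. This is a hedged sketch, not the paper's reference implementation: it assumes a square n x n matrix and a known noise level `sigma` (the paper also treats rectangular matrices and other losses), and the function name is illustrative.

```python
import numpy as np

def frobenius_optimal_shrinkage(Y, sigma):
    """Denoise a square n x n matrix Y = X + noise by optimally shrinking
    its singular values for Frobenius (MSE) loss.

    Sketch of the square-matrix, known-noise-level case: singular values
    are rescaled into noise units y = s / (sqrt(n) * sigma), shrunk via
    eta(y) = sqrt(y^2 - 4) for y >= 2 (else 0), then rescaled back.
    """
    n = Y.shape[0]
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    y = s / (np.sqrt(n) * sigma)  # singular values in noise units
    eta = np.where(y >= 2.0, np.sqrt(np.maximum(y**2 - 4.0, 0.0)), 0.0)
    return U @ np.diag(eta * np.sqrt(n) * sigma) @ Vt
```

Singular values at or below the bulk edge (y < 2) carry no recoverable signal asymptotically and are set to zero; large singular values are shrunk only slightly.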
Statistics Theory
no code implementations • 7 Nov 2013 • Moshe Dubiner, Matan Gavish, Yoram Singer
We show existence and a geometric description of the relaxation path.
3 code implementations • 24 May 2013 • Matan Gavish, David L. Donoho
In our asymptotic framework, this thresholding rule adapts to the unknown rank and unknown noise level in an optimal manner: it is always better than hard thresholding at any other value, no matter which matrix we are trying to recover. It is also always better than the ideal Truncated SVD (TSVD), which truncates at the true rank of the low-rank matrix being recovered.
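The optimal hard threshold described above can be sketched as follows. This is an illustrative sketch, not the paper's reference code: it assumes a square n x n matrix and a known noise level `sigma`, in which case the optimal threshold is (4/sqrt(3)) * sqrt(n) * sigma; the paper also covers rectangular matrices and estimation of the threshold from the median singular value when sigma is unknown.

```python
import numpy as np

def hard_threshold_denoise(Y, sigma):
    """Denoise a square n x n matrix Y = X + noise by hard-thresholding
    its singular values at the optimal level (4/sqrt(3)) * sqrt(n) * sigma.

    Singular values below the threshold are discarded entirely; those
    above it are kept unchanged (hard, not soft, thresholding).
    """
    n = Y.shape[0]
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    tau = (4.0 / np.sqrt(3.0)) * np.sqrt(n) * sigma  # optimal hard threshold
    s_kept = np.where(s >= tau, s, 0.0)
    return U @ np.diag(s_kept) @ Vt
```

Note that the procedure needs no estimate of the rank: the threshold alone determines how many singular values survive, which is what makes the rule adaptive to unknown rank.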
Methodology