no code implementations • 26 Nov 2022 • Valentin Debarnot, Sidharth Gupta, Konik Kothari, Ivan Dokmanić
We show that our approach enables the recovery of high-frequency details that are lost when deformations are not accounted for.
1 code implementation • 18 Nov 2022 • Sidharth Gupta, Konik Kothari, Valentin Debarnot, Ivan Dokmanić
We propose a differentiable imaging framework to address uncertainty in measurement coordinates such as sensor locations and projection angles.
1 code implementation • 22 Nov 2021 • Morteza Rezanejad, Sidharth Gupta, Chandra Gummaluru, Ryan Marten, John Wilder, Michael Gruninger, Dirk B. Walther
Humans are excellent at perceiving illusory outlines.
1 code implementation • 1 Feb 2021 • Sidharth Gupta, Ivan Dokmanić
We address the phase retrieval problem with errors in the sensing vectors.
no code implementations • 11 Feb 2020 • Sidharth Gupta, Parijat Dube, Ashish Verma
Projected Gradient Descent (PGD) based adversarial training has become one of the most prominent methods for building robust deep neural network models.
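PGD-based adversarial training perturbs each input inside an $\ell_\infty$ ball to maximize the loss before the model is trained on it. A minimal sketch of the attack step, assuming a user-supplied `grad_fn` giving the loss gradient with respect to the input (names, step sizes, and budgets here are illustrative, not the paper's settings):

```python
import numpy as np

def pgd_attack(x, y, grad_fn, eps=0.1, alpha=0.02, steps=10):
    """Sketch of a PGD attack in the L-infinity ball around x.

    x       : clean input (numpy array)
    y       : label, forwarded to grad_fn
    grad_fn : callable (x_adv, y) -> gradient of the loss w.r.t. x_adv
    eps     : perturbation budget
    alpha   : per-step size
    """
    # Random start inside the eps-ball, as in standard PGD training
    x_adv = x + np.random.uniform(-eps, eps, size=x.shape)
    for _ in range(steps):
        g = x_adv, y
        g = grad_fn(x_adv, y)
        x_adv = x_adv + alpha * np.sign(g)        # gradient ascent on the loss
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project back onto the ball
    return x_adv
```

Adversarial training then replaces each clean batch with `pgd_attack`-perturbed inputs before the usual gradient update.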
1 code implementation • 4 Nov 2019 • Sidharth Gupta, Rémi Gribonval, Laurent Daudet, Ivan Dokmanić
Our method simplifies the calibration of optical transmission matrices from a quadratic to a linear inverse problem by first recovering the phase of the measurements.
1 code implementation • NeurIPS 2019 • Sidharth Gupta, Rémi Gribonval, Laurent Daudet, Ivan Dokmanić
A signal of interest $\boldsymbol{\xi} \in \mathbb{R}^N$ is mixed by a random scattering medium to compute the projection $\boldsymbol{y} = \boldsymbol{A} \boldsymbol{\xi}$, with $\boldsymbol{A} \in \mathbb{C}^{M \times N}$ being a realization of a standard complex Gaussian iid random matrix.
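The forward model above can be simulated in a few lines; the dimensions below are illustrative, and the final intensity line reflects that optical sensors record only magnitudes, so the phase of $\mathbf{y}$ must be recovered separately:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 256                       # signal and measurement dimensions (illustrative)
xi = rng.standard_normal(N)          # real signal of interest

# Standard complex Gaussian iid matrix: real and imaginary parts N(0, 1/2),
# so each entry has unit variance overall.
A = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)

y = A @ xi                           # projection through the scattering medium
intensity = np.abs(y) ** 2           # what an intensity-only camera would see
```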
1 code implementation • 14 Feb 2019 • Shuai Huang, Sidharth Gupta, Ivan Dokmanić
We tackle the problem of recovering a complex signal $\boldsymbol x\in\mathbb{C}^n$ from quadratic measurements of the form $y_i=\boldsymbol x^*\boldsymbol A_i\boldsymbol x$, where $\boldsymbol A_i$ is a full-rank, complex random measurement matrix whose entries are generated from a rotation-invariant sub-Gaussian distribution.
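Generating such quadratic measurements is straightforward; the sketch below uses complex Gaussian matrices, which are one rotation-invariant sub-Gaussian choice (dimensions are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 8, 32                                                # illustrative sizes
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # unknown signal

# Full-rank complex random measurement matrices A_i; Gaussian entries are a
# rotation-invariant sub-Gaussian instance.
A = rng.standard_normal((m, n, n)) + 1j * rng.standard_normal((m, n, n))

# y_i = x^* A_i x, with x^* the conjugate transpose of x
y = np.array([np.conj(x) @ A_i @ x for A_i in A])
```

Note that unlike classical phase retrieval, the $\boldsymbol{A}_i$ here are full-rank rather than rank-one, so each $y_i$ is a general quadratic form in $\boldsymbol{x}$.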
Information Theory
1 code implementation • ICLR 2019 • Sidharth Gupta, Konik Kothari, Maarten V. de Hoop, Ivan Dokmanić
We show that, in this case, the common approach of directly learning the mapping from the measured data to the reconstruction becomes unstable.