no code implementations • 19 Jul 2022 • Aaron Berk, Simone Brugiapaglia, Babhru Joshi, Yaniv Plan, Matthew Scott, Özgür Yılmaz
In Bora et al. (2017), a mathematical framework was developed for compressed sensing guarantees in the setting where the measurement matrix is Gaussian and the signal structure is the range of a generative neural network (GNN).
no code implementations • NeurIPS 2021 • Babhru Joshi, Xiaowei Li, Yaniv Plan, Ozgur Yilmaz
We prove that, when weights are Gaussian and layer widths $n_i \gtrsim 5^i n_0$ (up to log factors), the algorithm converges geometrically to a neighbourhood of $x$ with high probability.
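The setting can be illustrated with a minimal one-layer sketch (this is an illustration of the noiseless $d=1$ case with a plain subgradient step, not the paper's algorithm; the variable names, the step size, and the near-$x$ initialization are all assumptions): with Gaussian weights and an expansive layer, iterating on $\tfrac12\|\mathrm{relu}(Wz)-\mathrm{relu}(Wx)\|^2$ from a point near $x$ contracts the error geometrically.

```python
import numpy as np

rng = np.random.default_rng(0)
n0, m = 10, 500                        # latent dimension and (expansive) layer width
W = rng.standard_normal((m, n0)) / np.sqrt(m)   # Gaussian weights, E[W^T W] = I
relu = lambda t: np.maximum(t, 0.0)

x = rng.standard_normal(n0)            # ground-truth latent code
y = relu(W @ x)                        # observed one-layer network output

z = x + 0.5 * rng.standard_normal(n0)  # start inside a neighbourhood of x
errors = []
for _ in range(200):
    r = relu(W @ z) - y
    g = W.T @ ((W @ z > 0) * r)        # subgradient of 0.5 * ||relu(W z) - y||^2
    z = z - g                          # fixed unit step (valid due to the 1/sqrt(m) scaling)
    errors.append(np.linalg.norm(z - x))
```

On a fixed sign pattern the update is linear with contraction factor bounded away from 1, so `errors` decays geometrically toward zero in this noiseless sketch; the multi-layer analysis in the paper is substantially more delicate.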
no code implementations • NeurIPS 2021 Workshop on Deep Learning and Inverse Problems • Babhru Joshi, Xiaowei Li, Yaniv Plan, Ozgur Yilmaz

After a sufficient number of iterations, the estimation errors for both $x$ and $\mathcal{G}(x)$ are at most of the order of $\sqrt{4^d n_0 / m}\,\|\epsilon\|$.
1 code implementation • 14 Oct 2020 • Zhenan Fan, Halyun Jeong, Babhru Joshi, Michael P. Friedlander
The signal demixing problem seeks to separate a superposition of multiple signals into its constituent components.
1 code implementation • NeurIPS 2019 • Paul Hand, Babhru Joshi
That is, the objective function has a descent direction at every point outside of a small neighborhood around four hyperbolic curves.
no code implementations • 19 Dec 2016 • Paul Hand, Babhru Joshi
We introduce a convex approach for mixed linear regression over $d$ features.
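The data model behind mixed linear regression can be sketched as follows (the fitting loop shown is the standard alternating least-squares baseline, not the convex program introduced in this paper, and it is warm-started near the truth purely to illustrate the refinement step; all variable names are assumptions): each response is generated by one of two unknown regressors, with the assignment hidden.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 400, 5
b1, b2 = rng.standard_normal(d), rng.standard_normal(d)   # two unknown regressors
X = rng.standard_normal((n, d))
labels = rng.integers(0, 2, n)                            # hidden mixture labels
y = np.where(labels == 0, X @ b1, X @ b2)                 # noiseless mixed responses

# Alternating least squares (a common baseline, not this paper's convex method),
# initialized near the truth so the local refinement is visible.
g1 = b1 + 0.1 * rng.standard_normal(d)
g2 = b2 + 0.1 * rng.standard_normal(d)
for _ in range(25):
    to1 = np.abs(y - X @ g1) <= np.abs(y - X @ g2)        # reassign each sample
    g1 = np.linalg.lstsq(X[to1], y[to1], rcond=None)[0]
    g2 = np.linalg.lstsq(X[~to1], y[~to1], rcond=None)[0]

err = np.linalg.norm(g1 - b1) + np.linalg.norm(g2 - b2)
```

From a cold start this alternation can stall in label-swapped or mixed local minima, which is exactly the failure mode that motivates convex formulations of the problem.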