Search Results for author: Yash Deshpande

Found 9 papers, 2 papers with code

Near-optimal inference in adaptive linear regression

no code implementations · 5 Jul 2021 · Koulik Khamaru, Yash Deshpande, Tor Lattimore, Lester Mackey, Martin J. Wainwright

We propose a family of online debiasing estimators to correct these distributional anomalies in least squares estimation.

Active Learning · regression · +2 more

Contextual Stochastic Block Models

no code implementations · NeurIPS 2018 · Yash Deshpande, Andrea Montanari, Elchanan Mossel, Subhabrata Sen

We provide the first information-theoretically tight analysis for inference of latent community structure given a sparse graph along with high-dimensional node covariates correlated with the same latent communities.

Inference in Graphical Models via Semidefinite Programming Hierarchies

no code implementations · NeurIPS 2017 · Murat A. Erdogdu, Yash Deshpande, Andrea Montanari

We demonstrate that the resulting algorithm can solve problems with tens of thousands of variables within minutes, and outperforms BP and GBP on practical problems such as image denoising and Ising spin glasses.

Combinatorial Optimization · Computational Efficiency · +1 more

Improved Sum-of-Squares Lower Bounds for Hidden Clique and Hidden Submatrix Problems

no code implementations · 23 Feb 2015 · Yash Deshpande, Andrea Montanari

Here we consider the degree-$4$ SOS relaxation, and study the construction of \cite{meka2013association} to prove that SOS fails unless $k\ge C\, n^{1/3}/\log n$.

Two-sample testing

Sparse PCA via Covariance Thresholding

no code implementations · NeurIPS 2014 · Yash Deshpande, Andrea Montanari

In an influential paper, \cite{johnstone2004sparse} introduced a simple algorithm that estimates the support of the principal vectors $\mathbf{v}_1,\dots,\mathbf{v}_r$ by the largest entries in the diagonal of the empirical covariance.
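The diagonal-thresholding idea summarized above can be sketched in a few lines: under a spiked-covariance model, coordinates in the support of a sparse principal vector have inflated diagonal entries in the empirical covariance, so ranking diagonal entries recovers the support. This is a minimal illustrative sketch, not the paper's covariance-thresholding method; the model parameters, signal strength, and threshold choice (keeping the top $k$ entries) are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, k = 500, 200, 10      # samples, ambient dimension, sparsity (illustrative values)
v = np.zeros(p)
v[:k] = 1.0 / np.sqrt(k)    # sparse unit principal vector; support = first k coordinates
beta = 3.0                  # assumed signal strength of the spike

# Spiked-covariance samples: x_i = sqrt(beta) * u_i * v + standard Gaussian noise
u = rng.standard_normal(n)
X = np.sqrt(beta) * np.outer(u, v) + rng.standard_normal((n, p))

Sigma_hat = X.T @ X / n     # empirical covariance
diag = np.diag(Sigma_hat)   # on-support entries concentrate around 1 + beta/k, off-support around 1

# Diagonal thresholding: estimate the support as the k largest diagonal entries
support_est = np.argsort(diag)[-k:]

recovered = len(set(range(k)) & set(support_est))
print(f"recovered {recovered} of {k} support coordinates")
```

With this signal strength the diagonal gap (roughly `beta / k`) dominates the sampling noise, so the support is recovered; the paper's point is that full covariance thresholding succeeds in regimes where this diagonal-only scheme fails.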
