no code implementations • 29 Feb 2024 • Ilmun Kim, Larry Wasserman, Sivaraman Balakrishnan, Matey Neykov
Semi-supervised datasets are ubiquitous across diverse domains where obtaining fully labeled data is costly or time-consuming.
no code implementations • 20 Oct 2022 • Shamindra Shrotriya, Matey Neykov
We study the classical problem of deriving minimax rates for density estimation over convex density classes.
1 code implementation • 14 Jul 2022 • Shamindra Shrotriya, Matey Neykov
We develop \texttt{ASCIFIT}, a three-step estimation procedure under the \texttt{ASCI} setting.
1 code implementation • 15 May 2020 • Yufei Yi, Matey Neykov
Our numerical experiments on simulated and real data show that, when the principal eigenvector lies in a convex cone, our method achieves shorter run time and smaller error than ordinary power iteration and several sparse principal component analysis algorithms.
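As a rough illustration of the general idea (not the paper's exact algorithm or guarantees), the sketch below runs power iteration with a Euclidean projection onto a convex cone after every matrix-vector product, using the nonnegative orthant as an example cone; all names and parameters are illustrative.

```python
import numpy as np

def projected_power_iteration(A, project, n_iter=200, tol=1e-8, seed=0):
    """Power iteration with a projection onto a convex cone after each step.

    `project` is a user-supplied Euclidean projection onto the cone
    (e.g. the nonnegative orthant below). This is a generic sketch of
    cone-constrained power iteration, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    v = project(rng.standard_normal(A.shape[0]))
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        w = project(A @ v)          # multiply, then project onto the cone
        w_norm = np.linalg.norm(w)
        if w_norm == 0:
            break
        w /= w_norm
        if np.linalg.norm(w - v) < tol:
            v = w
            break
        v = w
    return v

# Example: leading eigenvector constrained to the nonnegative orthant.
rng = np.random.default_rng(1)
B = rng.standard_normal((50, 50))
A = B @ B.T                          # symmetric PSD matrix
nonneg = lambda x: np.maximum(x, 0.0)
v_hat = projected_power_iteration(A, nonneg)
```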
no code implementations • 21 Sep 2018 • Yuan Cao, Matey Neykov, Han Liu
The goal is to distinguish whether the underlying graph is empty, i.e., the model consists of independent Rademacher variables, versus the alternative that the underlying graph contains a subgraph of a certain structure.
no code implementations • 18 Dec 2017 • Zhuoran Yang, Lin F. Yang, Ethan X. Fang, Tuo Zhao, Zhaoran Wang, Matey Neykov
Existing nonconvex statistical optimization theory and methods crucially rely on the correct specification of the underlying "true" statistical models.
no code implementations • 20 Sep 2017 • Matey Neykov, Han Liu
In terms of methodological development, we propose two types of correlation-based tests: computationally efficient screening for ferromagnets, and score-type tests for general models, including a fast cycle presence test.
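A toy sketch of the flavor of a correlation-based screening test: under the null of independent Rademacher variables the largest absolute pairwise sample correlation is small, so a rejection threshold can be approximated by simulating the null. The statistic and calibration below are illustrative assumptions, not the paper's actual tests.

```python
import numpy as np

def max_correlation_statistic(X):
    """Toy screening statistic: the largest absolute off-diagonal
    sample correlation among the binary (+/-1) columns of X (n x d)."""
    R = np.corrcoef(X, rowvar=False)
    np.fill_diagonal(R, 0.0)
    return np.max(np.abs(R))

# Simulate the null (independent Rademacher variables) to get an
# approximate 5% rejection threshold for the statistic.
rng = np.random.default_rng(0)
n, d = 500, 30
null_stats = [max_correlation_statistic(rng.choice([-1, 1], size=(n, d)))
              for _ in range(200)]
threshold = np.quantile(null_stats, 0.95)
```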
no code implementations • 28 Jul 2017 • Junwei Lu, Matey Neykov, Han Liu
In this paper, we propose a new inferential framework for testing nested multiple hypotheses and constructing confidence intervals of the unknown graph invariants under undirected graphical models.
no code implementations • 18 Jan 2017 • Abhishek Chakrabortty, Matey Neykov, Raymond Carroll, Tianxi Cai
We consider the recovery of regression coefficients, denoted by $\boldsymbol{\beta}_0$, for a single index model (SIM) relating a binary outcome $Y$ to a set of possibly high dimensional covariates $\boldsymbol{X}$, based on a large but 'unlabeled' dataset $\mathcal{U}$, with $Y$ never observed.
no code implementations • NeurIPS 2016 • Matey Neykov, Zhaoran Wang, Han Liu
The goal of noisy high-dimensional phase retrieval is to estimate an $s$-sparse parameter $\boldsymbol{\beta}^*\in \mathbb{R}^d$ from $n$ realizations of the model $Y = (\boldsymbol{X}^{\top} \boldsymbol{\beta}^*)^2 + \varepsilon$.
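A minimal simulation of this observation model, followed by a commonly used spectral initializer (the top eigenvector of the $Y$-weighted sample covariance); this is not the paper's estimator, and the dimensions, sparsity, and noise level are arbitrary choices for illustration.

```python
import numpy as np

# Simulate n observations from Y = (X^T beta*)^2 + eps with an s-sparse beta*.
rng = np.random.default_rng(0)
n, d, s = 200, 500, 5
beta_star = np.zeros(d)
beta_star[:s] = 1.0 / np.sqrt(s)          # unit-norm s-sparse signal
X = rng.standard_normal((n, d))
eps = 0.1 * rng.standard_normal(n)
Y = (X @ beta_star) ** 2 + eps

# One common first step (not necessarily the paper's method): a spectral
# initializer from the top eigenvector of (1/n) * sum_i Y_i x_i x_i^T.
M = (X.T * Y) @ X / n
eigvals, eigvecs = np.linalg.eigh(M)
beta_init = eigvecs[:, -1]
```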
no code implementations • 10 Aug 2016 • Matey Neykov, Junwei Lu, Han Liu
We propose a new family of combinatorial inference problems for graphical models.
no code implementations • 25 Nov 2015 • Matey Neykov, Jun S. Liu, Tianxi Cai
In the present paper we analyze algorithms based on covariance screening and $L_1$-penalized least squares (i.e., the LASSO), and demonstrate that they too can enjoy an optimal (up to a scalar) rescaled sample size for support recovery, albeit under slightly different assumptions on $f$ and $\varepsilon$ than the SIR-based algorithms.
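A small sketch of the support-recovery setting, assuming a toy link $f = \tanh$ and using scikit-learn's LASSO; the regularization level and problem sizes are illustrative, and the final line simply compares the estimated and true supports rather than asserting recovery.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy data from a single index model Y = f(X beta) + eps, fitted with the LASSO.
rng = np.random.default_rng(0)
n, p, s = 400, 1000, 5
beta = np.zeros(p)
beta[:s] = 1.0
X = rng.standard_normal((n, p))
Y = np.tanh(X @ beta) + 0.1 * rng.standard_normal(n)   # example link f = tanh

lasso = Lasso(alpha=0.05).fit(X, Y)
support_hat = np.flatnonzero(lasso.coef_)
recovered = set(support_hat) == set(range(s))           # did we find the support?
```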
no code implementations • 7 Nov 2015 • Matey Neykov, Qian Lin, Jun S. Liu
When $s=O(p^{1-\delta})$ for some $\delta>0$, we demonstrate that both procedures can succeed in recovering the support of $\boldsymbol{\beta}$ as long as the rescaled sample size $\kappa=\frac{n}{s\log(p-s)}$ is larger than a certain critical threshold.
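For concreteness, a short computation of the rescaled sample size $\kappa = \frac{n}{s\log(p-s)}$ for hypothetical problem sizes (the values below are arbitrary):

```python
import numpy as np

# Rescaled sample size kappa = n / (s * log(p - s)) for example problem sizes.
n, p, s = 2000, 10_000, 10
kappa = n / (s * np.log(p - s))   # roughly 21.7 for these values
```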
no code implementations • 30 Oct 2015 • Matey Neykov, Yang Ning, Jun S. Liu, Han Liu
Our main theoretical contribution is to establish a unified Z-estimation theory of confidence regions for high dimensional problems.