Search Results for author: Cynthia Rush

Found 10 papers, 2 papers with code

Is it easier to count communities than find them?

no code implementations 21 Dec 2022 Cynthia Rush, Fiona Skerman, Alexander S. Wein, Dana Yang

In particular, we consider certain hypothesis testing problems between models with different community structures, and we show (in the low-degree polynomial framework) that testing between two options is as hard as finding the communities.

Entropic CLT for Order Statistics

no code implementations 10 May 2022 Martina Cardone, Alex Dytso, Cynthia Rush

It is well known that central order statistics exhibit a central limit behavior and converge to a Gaussian distribution as the sample size grows.
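This central limit behavior can be seen in a quick simulation. The sketch below (my own illustration, not from the paper) uses the sample median of Uniform(0,1) draws, for which $\sqrt{n}\,(\mathrm{median} - 1/2)$ converges to a $\mathcal{N}(0, 1/4)$ distribution; the sample size and trial count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 1001, 5000

# Sample median (a central order statistic) of n iid Uniform(0,1) draws,
# repeated over many independent trials.
medians = np.median(rng.uniform(size=(trials, n)), axis=1)

# Classical asymptotics for the uniform: sqrt(n)*(median - 1/2) -> N(0, 1/4),
# so the rescaled statistic should have mean near 0 and std near 0.5.
z = np.sqrt(n) * (medians - 0.5)
print(z.mean(), z.std())  # roughly 0 and 0.5
```

The paper's contribution is an *entropic* version of this convergence; the simulation only illustrates the familiar distributional statement.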

Characterizing the SLOPE Trade-off: A Variational Perspective and the Donoho-Tanner Limit

1 code implementation 27 May 2021 Zhiqi Bu, Jason Klusowski, Cynthia Rush, Weijie J. Su

Sorted l1 regularization has been incorporated into many methods for solving high-dimensional statistical estimation problems, including the SLOPE estimator in linear regression.

Variable Selection

On the Robustness to Misspecification of $α$-Posteriors and Their Variational Approximations

no code implementations 16 Apr 2021 Marco Avella Medina, José Luis Montiel Olea, Cynthia Rush, Amilcar Velez

To make this point, we derive a Bernstein-von Mises theorem showing convergence in total variation distance of $\alpha$-posteriors and their variational approximations to limiting Gaussian distributions.

All-or-nothing statistical and computational phase transitions in sparse spiked matrix estimation

no code implementations NeurIPS 2020 Jean Barbier, Nicolas Macris, Cynthia Rush

We determine statistical and computational limits for estimating a rank-one matrix (the spike) corrupted by an additive Gaussian noise matrix, in a sparse limit where the underlying hidden vector (which constructs the rank-one matrix) has a number of non-zero components that scales sub-linearly with the total dimension of the vector, and the signal-to-noise ratio tends to infinity at an appropriate speed.
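A minimal sketch of generating data from this observation model (my own illustration; the dimension, sparsity, and SNR scaling here are arbitrary choices, not the paper's regime):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, snr = 500, 10, 5.0  # k non-zeros, chosen sub-linear in n for illustration

# Sparse hidden vector with k non-zero +/-1 entries.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.choice([-1.0, 1.0], size=k)

# Rank-one spike plus additive Gaussian noise matrix.
spike = np.outer(x, x)
Y = np.sqrt(snr / n) * spike + rng.normal(size=(n, n))
print(Y.shape, int((x != 0).sum()))  # (500, 500) 10
```

The estimation task is to recover the support and signs of x from the noisy observation Y; the paper characterizes when this is statistically or computationally feasible.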

Rigorous State Evolution Analysis for Approximate Message Passing with Side Information

no code implementations 25 Mar 2020 Hangjin Liu, Cynthia Rush, Dror Baron

For this reason, a novel algorithmic framework that incorporates SI into AMP, referred to as approximate message passing with side information (AMP-SI), has been recently introduced.

Algorithmic Analysis and Statistical Estimation of SLOPE via Approximate Message Passing

1 code implementation NeurIPS 2019 Zhiqi Bu, Jason Klusowski, Cynthia Rush, Weijie Su

SLOPE is a relatively new convex optimization procedure for high-dimensional linear regression via the sorted l1 penalty: the larger the rank of the fitted coefficient, the larger the penalty.
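The sorted l1 penalty pairs the largest fitted coefficient (in magnitude) with the largest regularization weight, the second largest with the second largest, and so on. A minimal sketch of computing this penalty (my own illustration; the coefficient and weight values are arbitrary):

```python
import numpy as np

def slope_penalty(beta, lam):
    """Sorted l1 penalty: sum of lambda_i * |beta|_(i), where |beta|_(i)
    is the i-th largest coefficient magnitude and lambda_1 >= lambda_2 >= ..."""
    abs_sorted = np.sort(np.abs(beta))[::-1]  # magnitudes, decreasing
    lam = np.sort(lam)[::-1]                  # weights, decreasing
    return float(np.dot(lam, abs_sorted))

# The higher a coefficient's rank by magnitude, the larger its penalty weight:
# pairs are 2.0*3.0 + 1.0*2.0 + 0.5*1.0 = 8.5.
beta = np.array([0.5, -2.0, 1.0])
lam = np.array([3.0, 2.0, 1.0])
print(slope_penalty(beta, lam))  # 8.5
```

With all lambda_i equal, this reduces to the ordinary l1 (Lasso) penalty; the decreasing weight sequence is what distinguishes SLOPE.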

An Analysis of State Evolution for Approximate Message Passing with Side Information

no code implementations 1 Feb 2019 Hangjin Liu, Cynthia Rush, Dror Baron

For this reason, a novel algorithmic framework that incorporates SI into AMP, referred to as approximate message passing with side information (AMP-SI), has been recently introduced.

Information Theory

Finite Sample Analysis of Approximate Message Passing Algorithms

no code implementations 6 Jun 2016 Cynthia Rush, Ramji Venkataramanan

The concentration inequality also indicates that the number of AMP iterations $t$ can grow no faster than order $\frac{\log n}{\log \log n}$ for the performance to be close to the state evolution predictions with high probability.

Capacity-achieving Sparse Superposition Codes via Approximate Message Passing Decoding

no code implementations 23 Jan 2015 Cynthia Rush, Adam Greig, Ramji Venkataramanan

Sparse superposition codes were recently introduced by Barron and Joseph for reliable communication over the AWGN channel at rates approaching the channel capacity.
