Search Results for author: John Lafferty

Found 17 papers, 1 paper with code

Convergence and Alignment of Gradient Descent with Random Backpropagation Weights

no code implementations NeurIPS 2021 Ganlin Song, Ruitu Xu, John Lafferty

In this paper we study the mathematical properties of the feedback alignment procedure by analyzing convergence and alignment for two-layer networks under squared error loss.

Convergence and Alignment of Gradient Descent with Random Back Propagation Weights

no code implementations 10 Jun 2021 Ganlin Song, Ruitu Xu, John Lafferty

In this paper we study the mathematical properties of the feedback alignment procedure by analyzing convergence and alignment for two-layer networks under squared error loss.
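The feedback alignment procedure the abstract refers to can be illustrated with a minimal NumPy sketch: the backward pass for the first layer uses a fixed random matrix `B` in place of the transpose of the output weights. The two-layer linear network, teacher targets, and all sizes below are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer linear network trained on squared error.
n, d, h = 200, 10, 32
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d)            # targets from a linear teacher

W1 = 0.1 * rng.normal(size=(h, d))    # input-to-hidden weights
W2 = 0.1 * rng.normal(size=(1, h))    # hidden-to-output weights
B = 0.1 * rng.normal(size=(h, 1))     # fixed random feedback weights

def loss(W1, W2):
    return float(np.mean(((X @ W1.T @ W2.T).ravel() - y) ** 2))

init_loss = loss(W1, W2)
lr = 0.05
for _ in range(2000):
    H = X @ W1.T                      # hidden activations (linear)
    err = ((H @ W2.T).ravel() - y)[:, None]
    W2 -= lr * err.T @ H / n          # exact gradient step for W2
    W1 -= lr * (err @ B.T).T @ X / n  # feedback-alignment step: B, not W2.T
final_loss = loss(W1, W2)
```

Despite the first layer never seeing the true gradient, the training loss drops, which is the convergence phenomenon the paper analyzes.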

The huge Package for High-dimensional Undirected Graph Estimation in R

no code implementations 26 Jun 2020 Tuo Zhao, Han Liu, Kathryn Roeder, John Lafferty, Larry Wasserman

We describe an R package named huge which provides easy-to-use functions for estimating high dimensional undirected graphs from data.

Model Selection
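huge is an R package; for readers working in Python, scikit-learn's `GraphicalLasso` offers a comparable sparse precision-matrix estimator. A minimal sketch on a hypothetical chain-structured Gaussian (the dimensions, sample size, and penalty level are illustrative assumptions):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Ground truth: a chain graph encoded in a sparse precision matrix.
d = 5
prec = np.eye(d) + 0.3 * (np.eye(d, k=1) + np.eye(d, k=-1))
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(d), cov, size=2000)

# L1-penalized maximum likelihood estimate of the precision matrix.
model = GraphicalLasso(alpha=0.05).fit(X)
est = model.precision_

# Graph edges are the nonzero off-diagonal entries of the estimate.
edges = np.abs(est) > 1e-4
```

The chain's adjacent-pair entries survive the penalty, recovering the undirected graph structure.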

Model Repair: Robust Recovery of Over-Parameterized Statistical Models

no code implementations 20 May 2020 Chao Gao, John Lafferty

A new type of robust estimation problem is introduced where the goal is to recover a statistical model that has been corrupted after it has been estimated from data.
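The flavor of the problem can be sketched in Python: the min-norm fit of an over-parameterized linear model lies in the row space of its design, so gross corruption of the fitted coefficients can be undone by robustly regressing them back onto that space. This toy uses scikit-learn's Huber loss as a stand-in for the paper's robust estimator; the sizes and the corruption model are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.default_rng(0)

# Over-parameterized linear model: p >> n, fit by min-norm least
# squares, so theta_hat = X^T u lies in the row space of X.
n, p = 50, 400
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
theta_hat = X.T @ np.linalg.solve(X @ X.T, y)

# Corrupt 5% of the estimated coordinates with gross errors.
theta_tilde = theta_hat.copy()
bad = rng.choice(p, size=p // 20, replace=False)
theta_tilde[bad] += rng.choice([-5.0, 5.0], size=bad.size)

# Repair: robustly regress the corrupted vector on X^T to recover u,
# then project back into the row space.
huber = HuberRegressor(fit_intercept=False, alpha=0.0, max_iter=200)
huber.fit(X.T, theta_tilde)
theta_repaired = X.T @ huber.coef_

err_before = np.linalg.norm(theta_tilde - theta_hat)
err_after = np.linalg.norm(theta_repaired - theta_hat)
```

Because the uncorrupted coordinates fit the regression exactly, the Huber loss downweights the corrupted ones and most of the error is removed.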

Fair quantile regression

no code implementations 19 Jul 2019 Dana Yang, John Lafferty, David Pollard

Quantile regression is a tool for learning conditional distributions.
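Concretely, a conditional quantile at level tau can be fit by minimizing the pinball (check) loss. A minimal NumPy sketch with subgradient descent on a hypothetical linear model (tau = 0.5 recovers median regression; the paper's fairness constraints are not included):

```python
import numpy as np

rng = np.random.default_rng(0)

# Data: y = 2x + noise, so the conditional median is 2x.
n = 1000
x = rng.uniform(0, 1, size=n)
y = 2 * x + rng.normal(0, 0.5, size=n)
X = np.column_stack([np.ones(n), x])   # intercept + slope design

tau = 0.5
beta = np.zeros(2)
lr = 0.05
for _ in range(2000):
    r = y - X @ beta
    # Subgradient of the pinball loss: -tau on positive residuals,
    # (1 - tau) on negative residuals.
    g = X.T @ np.where(r > 0, -tau, 1 - tau) / n
    beta -= lr * g
```

Re-running with tau = 0.1 or 0.9 traces out other conditional quantiles, which is how quantile regression describes a whole conditional distribution.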


Surfing: Iterative optimization over incrementally trained deep networks

1 code implementation NeurIPS 2019 Ganlin Song, Zhou Fan, John Lafferty

When initialized with random parameters $\theta_0$, we show that the objective $f_{\theta_0}(x)$ is "nice" and easy to optimize with gradient descent.
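The surfing strategy (minimize a sequence of progressively harder objectives, warm-starting each stage at the previous solution) can be illustrated on a hypothetical one-dimensional family, not the paper's trained networks:

```python
import numpy as np

# f_t(x) = (x - 3)^2 + 4 t sin(5x): at t = 0 a nice quadratic,
# at t = 1 a rugged objective with several local minima.
def f(x, t):
    return (x - 3.0) ** 2 + 4.0 * t * np.sin(5.0 * x)

def grad(x, t):
    return 2.0 * (x - 3.0) + 20.0 * t * np.cos(5.0 * x)

def descend(x, t, steps=500, lr=0.01):
    for _ in range(steps):
        x -= lr * grad(x, t)
    return x

# Cold start: gradient descent on the final objective from x = 0
# gets trapped in a poor local minimum.
x_cold = descend(0.0, 1.0, steps=5000)

# Surfing: sweep t from 0 to 1, warm-starting each stage at the
# previous stage's solution, tracking the good basin throughout.
x_surf = 0.0
for t in np.linspace(0.0, 1.0, 11):
    x_surf = descend(x_surf, t)
```

The warm-started sweep follows the minimizer of the easy objective as the landscape deforms, ending far below the cold-start value.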

TopicEq: A Joint Topic and Mathematical Equation Model for Scientific Texts

no code implementations 16 Feb 2019 Michihiro Yasunaga, John Lafferty

Scientific documents rely on both mathematics and text to communicate ideas.

Ranked #1 on Topic Models on arXiv (Topic Coherence@50 metric)

Language Modelling · Topic Models

Distributed Nonparametric Regression under Communication Constraints

no code implementations ICML 2018 Yuancheng Zhu, John Lafferty

In an intermediate regime, the statistical risk depends on both the sample size and the communication budget.

Testing for Global Network Structure Using Small Subgraph Statistics

no code implementations 2 Oct 2017 Chao Gao, John Lafferty

We study the problem of testing for community structure in networks using relations between the observed frequencies of small subgraphs.

Methodology · Social and Information Networks · Statistics Theory · Applications
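To give a flavor of the approach, one can compare an observed small-subgraph count against its expectation under an Erdős–Rényi null with matching edge density. A sketch using networkx on the karate-club graph (the statistic and null calibration here are illustrative, not the paper's test):

```python
import networkx as nx
from math import comb

G = nx.karate_club_graph()
n = G.number_of_nodes()
m = G.number_of_edges()
# nx.triangles counts triangles per node; each triangle is counted
# three times, once at each vertex.
t = sum(nx.triangles(G).values()) // 3

# Under an Erdos-Renyi null with matching edge density p, the
# expected number of triangles is C(n, 3) * p^3.
p = m / comb(n, 2)
expected = comb(n, 3) * p ** 3
excess = t / expected
```

A triangle count far above the null expectation is evidence of community structure, since communities inflate local clustering.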

Local Minimax Complexity of Stochastic Convex Optimization

no code implementations NeurIPS 2016 Yuancheng Zhu, Sabyasachi Chatterjee, John Duchi, John Lafferty

The bounds are expressed in terms of a localized and computational analogue of the modulus of continuity that is central to statistical minimax analysis.

Convergence Analysis for Rectangular Matrix Completion Using Burer-Monteiro Factorization and Gradient Descent

no code implementations 23 May 2016 Qinqing Zheng, John Lafferty

We address the rectangular matrix completion problem by lifting the unknown matrix to a positive semidefinite matrix in higher dimension, and optimizing a nonconvex objective over the semidefinite factor using a simple gradient descent scheme.

Matrix Completion
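A minimal NumPy sketch of the factored approach: gradient descent on the nonconvex objective $\tfrac{1}{2}\|P_\Omega(UV^\top - M)\|_F^2$. This is the plain rectangular factorization rather than the paper's semidefinite lifting, and the sizes, step size, and sampling rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-2 ground truth, roughly half the entries observed.
n1, n2, r = 30, 40, 2
U_true = rng.normal(size=(n1, r))
V_true = rng.normal(size=(n2, r))
M = U_true @ V_true.T
mask = rng.random((n1, n2)) < 0.5

# Gradient descent on the factored objective from small random init.
U = 0.1 * rng.normal(size=(n1, r))
V = 0.1 * rng.normal(size=(n2, r))
lr = 0.01
for _ in range(5000):
    R = mask * (U @ V.T - M)          # residual on observed entries only
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

Even though the objective is nonconvex, simple gradient descent from small random initialization fills in the unobserved entries, which is the behavior such convergence analyses explain.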

A Convergent Gradient Descent Algorithm for Rank Minimization and Semidefinite Programming from Random Linear Measurements

no code implementations NeurIPS 2015 Qinqing Zheng, John Lafferty

We propose a simple, scalable, and fast gradient descent algorithm to optimize a nonconvex objective for the rank minimization problem and a closely related family of semidefinite programs.

Quantized Nonparametric Estimation over Sobolev Ellipsoids

no code implementations 25 Mar 2015 Yuancheng Zhu, John Lafferty

We formulate the notion of minimax estimation under storage or communication constraints, and prove an extension to Pinsker's theorem for nonparametric estimation over Sobolev ellipsoids.
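For reference, the classical statement being extended, in its standard Gaussian-sequence-model form (the exact value of the Pinsker constant $P_\alpha$, which depends only on $\alpha$, is omitted here):

```latex
% Gaussian sequence model: y_i = \theta_i + \epsilon z_i, \; z_i \sim N(0,1),
% over the Sobolev ellipsoid
% \Theta(\alpha, C) = \{\theta : \sum_i i^{2\alpha} \theta_i^2 \le C^2\}.
\inf_{\hat{\theta}} \; \sup_{\theta \in \Theta(\alpha, C)}
  \mathbb{E}\,\bigl\| \hat{\theta} - \theta \bigr\|_2^2
  \;=\; \bigl(1 + o(1)\bigr)\, P_\alpha \,
  C^{2/(2\alpha+1)}\, \epsilon^{4\alpha/(2\alpha+1)},
  \qquad \epsilon \to 0.
```

The asymptotic minimax risk is attained by a linear shrinkage estimator; the paper asks how this picture changes when the estimate must be stored or transmitted with a limited number of bits.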


Blossom Tree Graphical Models

no code implementations NeurIPS 2014 Zhe Liu, John Lafferty

We combine the ideas behind trees and Gaussian graphical models to form a new nonparametric family of graphical models.

Quantized Estimation of Gaussian Sequence Models in Euclidean Balls

no code implementations NeurIPS 2014 Yuancheng Zhu, John Lafferty

A central result in statistical theory is Pinsker's theorem, which characterizes the minimax rate in the normal means model of nonparametric estimation.
