Search Results for author: Miles E. Lopes

Found 11 papers, 1 paper with code

Error Estimation for Random Fourier Features

1 code implementation • 22 Feb 2023 • Junwen Yao, N. Benjamin Erichson, Miles E. Lopes

Three key advantages of this approach are: (1) The error estimates are specific to the problem at hand, avoiding the pessimism of worst-case bounds.
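
The snippet above shows only the first advantage; the underlying idea is that bootstrap resampling of the random features yields a data-driven error estimate. Below is a minimal sketch of that idea, assuming a Gaussian-kernel feature map; the sampling scheme and all names are illustrative assumptions, not the paper's released implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 200, 5, 300
X = rng.standard_normal((n, d))

# Random Fourier features for a Gaussian kernel k(x, y) = exp(-||x - y||^2 / 2):
# z(x) = sqrt(2/m) * cos(W x + b), so that Z Z^T approximates the kernel matrix.
W = rng.standard_normal((m, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=m)
Z = np.sqrt(2.0 / m) * np.cos(X @ W.T + b)
K_hat = Z @ Z.T

# Bootstrap: resample the m random features with replacement, recompute the
# approximate kernel, and use a quantile of the fluctuations as an error
# estimate for K_hat -- no access to the exact kernel matrix is needed.
B = 100
fluct = np.empty(B)
for i in range(B):
    Zb = Z[:, rng.integers(0, m, size=m)]
    fluct[i] = np.max(np.abs(Zb @ Zb.T - K_hat))
print("95% bootstrap error estimate:", np.quantile(fluct, 0.95))
```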

Error Estimation for Sketched SVD via the Bootstrap

no code implementations • 10 Mar 2020 • Miles E. Lopes, N. Benjamin Erichson, Michael W. Mahoney

In order to compute fast approximations to the singular value decomposition (SVD) of very large matrices, randomized sketching algorithms have become a leading approach.
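
As an illustration of how a bootstrap can attach error bars to a sketched SVD, here is a minimal sketch assuming a uniform row-sampling sketch; the paper's actual sketching and calibration details may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 5000, 30, 500
A = rng.standard_normal((n, d)) @ np.diag(np.linspace(1.0, 10.0, d))

# Row-sampling sketch: t uniformly sampled, rescaled rows stand in for A.
idx = rng.integers(0, n, size=t)
A_sk = A[idx] * np.sqrt(n / t)
sv_sk = np.linalg.svd(A_sk, compute_uv=False)  # sketched singular values

# Bootstrap: resample the sketch's rows, recompute the singular values, and
# report a quantile of the fluctuations as an a posteriori error estimate.
B = 100
fluct = np.empty(B)
for i in range(B):
    boot = A_sk[rng.integers(0, t, size=t)]
    fluct[i] = np.max(np.abs(np.linalg.svd(boot, compute_uv=False) - sv_sk))

sv_true = np.linalg.svd(A, compute_uv=False)
print("95% bootstrap error estimate:", np.quantile(fluct, 0.95))
print("actual max error:", np.max(np.abs(sv_true - sv_sk)))
```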

Measuring the Algorithmic Convergence of Randomized Ensembles: The Regression Setting

no code implementations • 4 Aug 2019 • Miles E. Lopes, Suofei Wu, Thomas C. M. Lee

When randomized ensemble methods such as bagging and random forests are implemented, a basic question arises: Is the ensemble large enough?

General Classification, Regression, +1

Estimating the Algorithmic Variance of Randomized Ensembles via the Bootstrap

no code implementations • 20 Jul 2019 • Miles E. Lopes

Because bagging and random forests are randomized algorithms, the choice of ensemble size is closely related to the notion of "algorithmic variance" (i.e., the variance of prediction error due only to the training algorithm).
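
A toy sketch of the idea, under strong simplifying assumptions: the "base learner" below is just the mean of a bootstrap sample, standing in for a bagged tree, and the algorithmic variance of the ensemble is estimated by resampling the already-trained learners rather than retraining anything.

```python
import numpy as np

rng = np.random.default_rng(0)
y_train = rng.standard_normal(500) + 2.0

def base_prediction(rng):
    """Toy randomized base learner: the mean of one bootstrap sample,
    standing in for a bagged tree's prediction at a fixed test point."""
    return rng.choice(y_train, size=y_train.size, replace=True).mean()

t = 50
preds = np.array([base_prediction(rng) for _ in range(t)])  # "ensemble"
ensemble_pred = preds.mean()

# Bootstrap over the t already-trained learners: resample their predictions
# with replacement and track the spread of the resampled ensemble averages.
# This estimates the algorithmic variance at size t without any retraining.
B = 200
boot = np.array([rng.choice(preds, size=t, replace=True).mean()
                 for _ in range(B)])
print("ensemble prediction:", ensemble_pred)
print("estimated algorithmic variance:", boot.var(ddof=1))
```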

Error Estimation for Randomized Least-Squares Algorithms via the Bootstrap

no code implementations • ICML 2018 • Miles E. Lopes, Shusen Wang, Michael W. Mahoney

As a more practical alternative, we propose a bootstrap method to compute a posteriori error estimates for randomized LS algorithms.
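
A minimal sketch of a posteriori error estimation for sketch-and-solve least squares, assuming a uniform row-sampling sketch; this is an illustrative reconstruction, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, t = 20000, 20, 1000
A = rng.standard_normal((n, d))
y = A @ rng.standard_normal(d) + rng.standard_normal(n)

# Sketch-and-solve: solve least squares on t uniformly sampled rows.
idx = rng.integers(0, n, size=t)
A_s, y_s = A[idx], y[idx]
x_tilde = np.linalg.lstsq(A_s, y_s, rcond=None)[0]

# Bootstrap: resample the sketched rows, re-solve the small problem, and
# take a quantile of ||x* - x_tilde|| as an a posteriori error estimate.
B = 100
fluct = np.empty(B)
for i in range(B):
    b = rng.integers(0, t, size=t)
    x_star = np.linalg.lstsq(A_s[b], y_s[b], rcond=None)[0]
    fluct[i] = np.linalg.norm(x_star - x_tilde, ord=np.inf)

x_opt = np.linalg.lstsq(A, y, rcond=None)[0]  # exact solution, for reference
print("95% bootstrap error estimate:", np.quantile(fluct, 0.95))
print("actual error:", np.linalg.norm(x_tilde - x_opt, ord=np.inf))
```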

A Bootstrap Method for Error Estimation in Randomized Matrix Multiplication

no code implementations • 6 Aug 2017 • Miles E. Lopes, Shusen Wang, Michael W. Mahoney

In recent years, randomized methods for numerical linear algebra have received growing interest as a general approach to large-scale problems.

Dimensionality Reduction
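
To make the setting concrete: sampling-based randomized matrix multiplication approximates A @ B by averaging a few rescaled rank-one terms, and the bootstrap resamples those same terms to estimate the error. The sketch below is an illustrative reconstruction under uniform sampling, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k, t = 30, 1000, 40, 100
A = rng.standard_normal((n, d))
B_mat = rng.standard_normal((d, k))

# A @ B is a sum of d rank-one terms; average t uniformly sampled terms
# (rescaled by d/t) to get an unbiased randomized approximation.
idx = rng.integers(0, d, size=t)
terms = np.einsum('in,nk->nik', A[:, idx], B_mat[idx])  # the t outer products
P_hat = (d / t) * terms.sum(axis=0)

# Bootstrap: resample those same t terms to estimate the error of P_hat,
# without ever forming the exact product A @ B.
n_boot = 100
fluct = np.empty(n_boot)
for i in range(n_boot):
    P_star = (d / t) * terms[rng.integers(0, t, size=t)].sum(axis=0)
    fluct[i] = np.max(np.abs(P_star - P_hat))
print("95% bootstrap error estimate:", np.quantile(fluct, 0.95))
print("actual max error:", np.max(np.abs(A @ B_mat - P_hat)))
```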

A Residual Bootstrap for High-Dimensional Regression with Near Low-Rank Designs

no code implementations • NeurIPS 2014 • Miles E. Lopes

We study the residual bootstrap (RB) method in the context of high-dimensional linear regression.

Regression
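
A minimal sketch of the residual bootstrap with a ridge estimator on a near low-rank design; the regularization choice and data model here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 80  # p comparable to n; X is built to be near low-rank
X = (rng.standard_normal((n, 5)) @ rng.standard_normal((5, p))
     + 0.1 * rng.standard_normal((n, p)))
beta = rng.standard_normal(p)
y = X @ beta + rng.standard_normal(n)

def ridge(X, y, lam=1.0):
    """Ridge estimator (X^T X + lam * I)^{-1} X^T y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Residual bootstrap: fit once, resample the centered residuals, rebuild
# pseudo-responses y* = X beta_hat + e*, and refit to approximate the
# sampling distribution of the estimator.
beta_hat = ridge(X, y)
resid = y - X @ beta_hat
resid -= resid.mean()

B = 200
boot = np.empty((B, p))
for i in range(B):
    e_star = rng.choice(resid, size=n, replace=True)
    boot[i] = ridge(X, X @ beta_hat + e_star)
print("bootstrap std. error of beta_hat[0]:", boot[:, 0].std(ddof=1))
```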

Unknown sparsity in compressed sensing: Denoising and inference

no code implementations • 25 Jul 2015 • Miles E. Lopes

The paper quantifies sparsity through a family of measures $s_q(x)$; this family interpolates between $\|x\|_0=s_0(x)$ and $\|x\|_1^2/\|x\|_2^2=s_2(x)$ as $q$ ranges over $[0, 2]$.

Denoising
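
A short numerical check of the interpolation property, using the form $s_q(x) = (\|x\|_1/\|x\|_q)^{q/(q-1)}$, which is consistent with the two endpoints quoted above; this exponent form is a reconstruction, not copied from the paper. The computation is done in log space to avoid overflow for small $q$.

```python
import numpy as np

def s_q(x, q):
    """Sparsity measure s_q(x) = (||x||_1 / ||x||_q)^(q/(q-1)), q in (0,1) or (1,2].

    Computed in log space; this exponent form matches the endpoints
    s_0(x) = ||x||_0 (as q -> 0) and s_2(x) = ||x||_1^2 / ||x||_2^2,
    but is a reconstruction rather than a quote from the paper.
    """
    x = np.abs(np.asarray(x, dtype=float))
    x = x[x > 0]  # s_q depends only on the nonzero entries
    log_norm_q = np.log((x ** q).sum()) / q  # log ||x||_q
    return np.exp(q / (q - 1.0) * (np.log(x.sum()) - log_norm_q))

x = np.array([3.0, 1.0, 0.5, 0.0, 0.0])
print(s_q(x, 2.0))                      # == ||x||_1^2 / ||x||_2^2 = 1.9756...
print((x.sum() ** 2) / (x ** 2).sum())  # direct check of the q = 2 endpoint
print(s_q(x, 1e-4))                     # -> approx ||x||_0 = 3
```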

Estimating a sharp convergence bound for randomized ensembles

no code implementations • 4 Mar 2013 • Miles E. Lopes

In the standard case when classifiers are aggregated by majority vote, the present work offers a way to quantify this convergence in terms of "algorithmic variance," i.e., the variance of prediction error due only to the randomized training algorithm.

Binary Classification, Density Estimation
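
A toy sketch of estimating algorithmic variance under majority voting: synthetic votes stand in for trained base classifiers, and the classifiers are resampled with replacement to gauge how much the ensemble's error rate fluctuates at a fixed ensemble size. All quantities here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_test = 50, 200

# Votes of t randomized classifiers on n_test points (1 = correct class),
# standing in for trained base classifiers; each is right ~70% of the time.
votes = (rng.random((t, n_test)) < 0.7).astype(int)
y_true = np.ones(n_test, dtype=int)

def vote_error(v):
    """Error rate of the majority vote over the base classifiers in v."""
    majority = (v.mean(axis=0) > 0.5).astype(int)
    return np.mean(majority != y_true)

# Bootstrap over the t classifiers: the spread of the resampled ensembles'
# error rates estimates the algorithmic variance at ensemble size t.
B = 200
errs = np.array([vote_error(votes[rng.integers(0, t, size=t)])
                 for _ in range(B)])
print("majority-vote error:", vote_error(votes))
print("estimated algorithmic variance:", errs.var(ddof=1))
```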

A More Powerful Two-Sample Test in High Dimensions using Random Projection

no code implementations • NeurIPS 2011 • Miles E. Lopes, Laurent J. Jacob, Martin J. Wainwright

We consider the hypothesis testing problem of detecting a shift between the means of two multivariate normal distributions in the high-dimensional setting, allowing the data dimension p to exceed the sample size n. Specifically, we propose a new test statistic for the two-sample test of means that integrates a random projection with the classical Hotelling T^2 statistic.

Two-sample testing
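
A minimal sketch of the projected test: both samples are mapped to k dimensions with a shared random matrix, after which the classical two-sample Hotelling T^2 test applies even though p > n. The projection and calibration here are illustrative; the paper's precise construction may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, k = 40, 200, 10  # p >> n: the classical T^2 statistic is undefined

X = rng.standard_normal((n, p))         # sample 1
Y = rng.standard_normal((n, p)) + 0.15  # sample 2, with a mean shift

# Project both samples through one shared random matrix, then run the
# ordinary two-sample Hotelling T^2 test in the k-dimensional space.
P = rng.standard_normal((p, k)) / np.sqrt(k)
Xp, Yp = X @ P, Y @ P

diff = Xp.mean(axis=0) - Yp.mean(axis=0)
S_pool = (np.cov(Xp, rowvar=False) + np.cov(Yp, rowvar=False)) / 2.0
T2 = (n / 2.0) * diff @ np.linalg.solve(S_pool, diff)

# Under H0, a scaled T^2 follows an F(k, 2n - k - 1) distribution.
F = (2 * n - k - 1) / ((2 * n - 2) * k) * T2
print("p-value:", stats.f.sf(F, k, 2 * n - k - 1))
```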
