Search Results for author: Rocco Servedio

Found 7 papers, 0 papers with code

Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals

no code implementations • 10 Feb 2022 • Daniel Hsu, Clayton Sanford, Rocco Servedio, Emmanouil-Vasileios Vlatakis-Gkaragkounis

This lower bound is essentially best possible since an SQ algorithm of Klivans et al. (2008) agnostically learns this class to any constant excess error using $n^{O(\log k)}$ queries of tolerance $n^{-O(\log k)}$.
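
To make the query model concrete, here is a minimal sketch of a statistical query (SQ) oracle: the learner submits a bounded query function and a tolerance, and the oracle may answer with any value within the tolerance of the query's true expectation. Names here are our own; the adversarial slack is modeled with uniform noise.

```python
import numpy as np

def sq_oracle(phi, samples, tau, rng=np.random.default_rng(0)):
    """Simulated SQ oracle: given a query phi(x, y) -> [-1, 1] and a
    tolerance tau, return some value within tau of E[phi(x, y)].
    A real oracle may answer adversarially anywhere in that interval;
    here we model the slack with uniform noise."""
    true_mean = np.mean([phi(x, y) for x, y in samples])
    return true_mean + rng.uniform(-tau, tau)

# Example query: correlation of the first coordinate with the label.
samples = [(np.array([0.5, -1.0]), 1), (np.array([-0.2, 0.3]), -1)]
print(sq_oracle(lambda x, y: x[0] * y, samples, tau=0.01))
```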

Learning sparse mixtures of rankings from noisy information

no code implementations • 3 Nov 2018 • Anindya De, Ryan O'Donnell, Rocco Servedio

We study the problem of learning an unknown mixture of $k$ rankings over $n$ elements, given access to noisy samples drawn from the unknown mixture.
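
For intuition about the sampling model, here is a minimal sketch of drawing one noisy sample from a mixture of $k$ rankings over $n$ elements. The noise process (random adjacent transpositions) and all names are illustrative assumptions, not the paper's specific noise model.

```python
import random

def sample_noisy_ranking(mixture, n_swaps):
    """Draw one noisy sample from a mixture of rankings.

    mixture: list of (weight, ranking) pairs, where each ranking is a
             permutation of range(n) and the weights sum to 1.
    n_swaps: number of random adjacent transpositions applied as noise
             (an illustrative noise model, not the paper's).
    """
    weights, rankings = zip(*mixture)
    base = list(random.choices(rankings, weights=weights, k=1)[0])
    for _ in range(n_swaps):
        i = random.randrange(len(base) - 1)
        base[i], base[i + 1] = base[i + 1], base[i]
    return base

# Example: a mixture of two rankings over n = 4 elements.
mixture = [(0.7, (0, 1, 2, 3)), (0.3, (3, 2, 1, 0))]
print(sample_noisy_ranking(mixture, n_swaps=2))
```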

Sharp bounds for population recovery

no code implementations • 4 Mar 2017 • Anindya De, Ryan O'Donnell, Rocco Servedio

The population recovery problem is a basic problem in noisy unsupervised learning that has attracted significant research attention in recent years [WY12, DRWY12, MS13, BIMP13, LZ15, DST16].

Optimal mean-based algorithms for trace reconstruction

no code implementations • 9 Dec 2016 • Anindya De, Ryan O'Donnell, Rocco Servedio

For any constant deletion rate $0 < \delta < 1$, we give a mean-based algorithm that uses $\exp(O(n^{1/3}))$ time and traces; we also prove that any mean-based algorithm must use at least $\exp(\Omega(n^{1/3}))$ traces.
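
A minimal sketch of the "mean-based" statistic the result refers to: push the unknown string through the deletion channel many times, zero-pad each trace to length $n$, and average coordinate-wise. Recovering $x$ from these means (the complex-analytic part of the algorithm) is omitted, and the function names are our own.

```python
import numpy as np

def deletion_channel(x, delta, rng):
    """Each bit of x is deleted independently with probability delta."""
    keep = rng.random(len(x)) >= delta
    return x[keep]

def mean_trace(x, delta, num_traces, rng=np.random.default_rng(0)):
    """Estimate the coordinate-wise means of zero-padded traces.
    A mean-based algorithm uses only these statistics; the recovery
    of x from them is where the paper's analysis lives."""
    acc = np.zeros(len(x))
    for _ in range(num_traces):
        t = deletion_channel(x, delta, rng)
        acc[:len(t)] += t  # pad each trace to length n with zeros
    return acc / num_traces

x = np.array([1, 0, 1, 1, 0, 1, 0, 0])
print(mean_trace(x, delta=0.3, num_traces=10_000))
```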

Learning large-margin halfspaces with more malicious noise

no code implementations • NeurIPS 2011 • Phil Long, Rocco Servedio

We describe a simple algorithm that runs in time $\mathrm{poly}(n, 1/\gamma, 1/\epsilon)$ and learns an unknown $n$-dimensional $\gamma$-margin halfspace to accuracy $1-\epsilon$ in the presence of malicious noise, when the noise rate is allowed to be as high as $\Theta(\epsilon \gamma \sqrt{\log(1/\gamma)})$.
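
For context, here is a minimal sketch of the classical averaging learner for $\gamma$-margin halfspaces, a standard baseline in this line of work. This is not the paper's algorithm; robust variants add steps such as outlier removal, which are omitted here.

```python
import numpy as np

def average_halfspace(X, y):
    """Classical averaging learner for large-margin halfspaces:
    return the normalized, label-weighted mean of the examples.
    With a gamma margin and clean data this vector correlates well
    with the target halfspace; noise-tolerant variants build on it."""
    w = (y[:, None] * X).mean(axis=0)
    return w / np.linalg.norm(w)

def predict(w, X):
    """Classify by the sign of the inner product with w."""
    return np.sign(X @ w)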

Algorithms and hardness results for parallel large margin learning

no code implementations • NeurIPS 2011 • Phil Long, Rocco Servedio

Our main negative result deals with boosting, which is a standard approach to learning large-margin halfspaces.

Adaptive Martingale Boosting

no code implementations • NeurIPS 2008 • Phil Long, Rocco Servedio

In recent work, Long and Servedio [LS05] presented a "martingale boosting" algorithm that works by constructing a branching program over weak classifiers and has a simple analysis based on elementary properties of random walks.
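
A minimal sketch of that branching-program structure, under our own simplifying assumptions: weak_learner(X, y) is assumed to return a classifier h with h(X) giving predictions in {-1, +1}; node (t, i) trains its own weak classifier on the examples whose walk reaches it, and the final label depends on whether the walk ends in the upper half. The paper's actual training and filtering details are simplified away.

```python
import numpy as np

def _advance(pos, level, X):
    """Move each example one level down the branching program:
    step right (i -> i+1) iff the node's weak classifier says +1."""
    new_pos = pos.copy()
    for i, h in enumerate(level):
        mask = pos == i
        if mask.any():
            new_pos[mask] += (h(X[mask]) > 0).astype(int)
    return new_pos

def train_branching_program(X, y, weak_learner, T):
    """Simplified martingale-boosting-style branching program.
    Level t has nodes i = 0..t; each node holds a weak classifier
    trained only on the examples whose random walk reaches it."""
    pos = np.zeros(len(X), dtype=int)
    program = []
    for t in range(T):
        level = []
        for i in range(t + 1):
            mask = pos == i
            h = (weak_learner(X[mask], y[mask]) if mask.any()
                 else (lambda Z: np.ones(len(Z))))  # trivial fallback
            level.append(h)
        pos = _advance(pos, level, X)
        program.append(level)
    return program

def predict(program, X):
    """Classify by where the walk ends: upper half => +1, else -1."""
    pos = np.zeros(len(X), dtype=int)
    for level in program:
        pos = _advance(pos, level, X)
    return np.where(pos >= len(program) / 2.0, 1, -1)
```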

