Search Results for author: Jessica Sorrell

Found 5 papers, 1 paper with code

Stability is Stable: Connections between Replicability, Privacy, and Adaptive Generalization

no code implementations • 22 Mar 2023 • Mark Bun, Marco Gaboardi, Max Hopkins, Russell Impagliazzo, Rex Lei, Toniann Pitassi, Satchit Sivakumar, Jessica Sorrell

In particular, we give sample-efficient algorithmic reductions between perfect generalization, approximate differential privacy, and replicability for a broad class of statistical problems.

PAC learning

Multicalibration as Boosting for Regression

1 code implementation • 31 Jan 2023 • Ira Globus-Harris, Declan Harrison, Michael Kearns, Aaron Roth, Jessica Sorrell

Using this characterization, we give an exceedingly simple algorithm that can be analyzed both as a boosting algorithm for regression and as a multicalibration algorithm for a class H, and that makes use only of a standard squared-error regression oracle for H. We give a weak learning assumption on H that ensures convergence to Bayes optimality without the need for any realizability assumptions, yielding an agnostic boosting algorithm for regression.
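The residual-fitting idea behind boosting for regression with a squared-error oracle can be sketched as follows. This is a minimal illustration only, not the paper's exact algorithm: a depth-1 decision tree stands in for the squared-error regression oracle for H, and all names and parameters are illustrative.

```python
# Illustrative sketch of boosting for regression via a squared-error
# regression oracle (here a depth-1 tree stands in for the oracle for H).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost_regression(X, y, rounds=50, step=0.5):
    """Additively combine weak regressors, each fit to the current residuals."""
    prediction = np.zeros(len(y))
    learners = []
    for _ in range(rounds):
        residual = y - prediction                 # squared-error gradient direction
        h = DecisionTreeRegressor(max_depth=1).fit(X, residual)
        learners.append(h)
        prediction += step * h.predict(X)         # damped additive update

    def predict(X_new):
        out = np.zeros(len(X_new))
        for h in learners:
            out += step * h.predict(X_new)
        return out

    return predict
```

Each round asks the oracle for the best squared-error fit to what remains unexplained, so the combined predictor's training error shrinks as rounds accumulate.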

regression

Reproducibility in Learning

no code implementations • 20 Jan 2022 • Russell Impagliazzo, Rex Lei, Toniann Pitassi, Jessica Sorrell

We introduce the notion of a reproducible algorithm in the context of learning.
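The reproducibility notion can be illustrated with a toy estimator. This is an assumption-laden sketch, not the paper's construction: round the empirical mean onto a grid whose random offset serves as the algorithm's shared internal randomness, so two runs on independent samples usually return the identical value.

```python
# Toy illustration of a reproducible algorithm: with the same internal
# randomness (the grid offset), two runs on independent samples return
# the identical output with high probability.
import random

def reproducible_mean(sample, grid_width, rng):
    """Round the empirical mean onto a grid with a random shared offset.

    Two runs that share the offset land on the same grid point whenever
    their empirical means fall in the same cell, which happens with high
    probability when grid_width dominates the sampling error.
    """
    offset = rng.uniform(0, grid_width)       # shared internal randomness
    mean = sum(sample) / len(sample)
    k = round((mean - offset) / grid_width)   # nearest grid cell index
    return offset + k * grid_width
```

Reusing the same seed for the offset across runs is what makes the outputs coincide; a fresh offset per run would break reproducibility even though each output alone remains an accurate estimate.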

Boosting in the Presence of Massart Noise

no code implementations • 14 Jun 2021 • Ilias Diakonikolas, Russell Impagliazzo, Daniel Kane, Rex Lei, Jessica Sorrell, Christos Tzamos

Our upper and lower bounds characterize the complexity of boosting in the distribution-independent PAC model with Massart noise.

Efficient, Noise-Tolerant, and Private Learning via Boosting

no code implementations • 4 Feb 2020 • Mark Bun, Marco Leandro Carmosino, Jessica Sorrell

To demonstrate our framework, we use it to construct noise-tolerant and private PAC learners for large-margin halfspaces whose sample complexity does not depend on the dimension.
