Search Results for author: Max Hopkins

Found 11 papers, 0 papers with code

Stability is Stable: Connections between Replicability, Privacy, and Adaptive Generalization

no code implementations · 22 Mar 2023 · Mark Bun, Marco Gaboardi, Max Hopkins, Russell Impagliazzo, Rex Lei, Toniann Pitassi, Satchit Sivakumar, Jessica Sorrell

In particular, we give sample-efficient algorithmic reductions between perfect generalization, approximate differential privacy, and replicability for a broad class of statistical problems.

PAC learning
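
As a concrete, toy illustration of the replicability notion the paper connects to privacy and generalization: a replicable algorithm, run twice on independent samples but with shared internal randomness, returns the identical output with high probability. The sketch below uses a standard randomized-rounding trick for a mean estimate; the grid width and seeds are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def replicable_mean(samples, rng, grid_width=0.1):
    """Round the empirical mean to a grid whose offset is drawn from the
    shared randomness `rng`, so two runs on fresh samples that share this
    randomness return the exact same value unless their empirical means
    straddle a (randomly placed) cell boundary."""
    offset = rng.uniform(0, grid_width)          # shared across both runs
    empirical = float(np.mean(samples))
    return grid_width * round((empirical - offset) / grid_width) + offset

# Two runs on independent samples, same shared randomness: they usually
# agree exactly, because both means land in the same grid cell.
master_seed = 42
run1 = replicable_mean(np.random.default_rng(1).normal(0.5, 1, 10_000),
                       np.random.default_rng(master_seed))
run2 = replicable_mean(np.random.default_rng(2).normal(0.5, 1, 10_000),
                       np.random.default_rng(master_seed))
print(run1, run2, run1 == run2)
```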

Do PAC-Learners Learn the Marginal Distribution?

no code implementations · 13 Feb 2023 · Max Hopkins, Daniel M. Kane, Shachar Lovett, Gaurav Mahajan

We study a foundational variant of the Probably Approximately Correct (PAC) learning model of Valiant and of Vapnik and Chervonenkis, in which the adversary is restricted to a known family of marginal distributions $\mathscr{P}$.

PAC learning
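
For readers unfamiliar with the setting, the following is a hedged reconstruction of the standard distribution-family PAC definition the abstract alludes to (realizable case); the notation is ours, not quoted from the paper.

```latex
% H is PAC learnable over the family \mathscr{P} if there is a learner A
% and a sample bound m(\epsilon, \delta) such that, for every marginal
% \mathcal{D}_X \in \mathscr{P} and every target h^* \in H,
\[
  \Pr_{S \sim (\mathcal{D}_X, h^*)^{m(\epsilon,\delta)}}
    \bigl[\, \operatorname{err}_{\mathcal{D}_X, h^*}\!\bigl(A(S)\bigr) \le \epsilon \,\bigr]
  \;\ge\; 1 - \delta .
\]
```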

Robust Empirical Risk Minimization with Tolerance

no code implementations · 2 Oct 2022 · Robi Bhattacharjee, Max Hopkins, Akash Kumar, Hantao Yu, Kamalika Chaudhuri

Developing simple, sample-efficient learning algorithms for robust classification is a pressing issue in today's tech-dominated world, and current theoretical techniques, which require exponential sample complexity and complicated improper learning rules, fall far short of answering the need.

Robust classification

Active Learning Polynomial Threshold Functions

no code implementations · 24 Jan 2022 · Omri Ben-Eliezer, Max Hopkins, Chutong Yang, Hantao Yu

We initiate the study of active learning polynomial threshold functions (PTFs).

Active Learning
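
For context, a polynomial threshold function (PTF) labels a point by the sign of a fixed polynomial. The degree-2 example below is purely definitional and does not reproduce the paper's query algorithms.

```python
import numpy as np

def ptf_degree2(x, A, b, c):
    """Degree-2 polynomial threshold function: label a point by the sign
    of a quadratic polynomial x^T A x + b.x + c (illustrative only)."""
    return int(x @ A @ x + b @ x + c > 0)

# The unit circle as a degree-2 PTF: positive inside, negative outside.
A = -np.eye(2); b = np.zeros(2); c = 1.0
print(ptf_degree2(np.array([0.2, 0.3]), A, b, c))  # 1 (inside)
print(ptf_degree2(np.array([1.5, 0.0]), A, b, c))  # 0 (outside)
```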

Realizable Learning is All You Need

no code implementations · 8 Nov 2021 · Max Hopkins, Daniel M. Kane, Shachar Lovett, Gaurav Mahajan

The equivalence of realizable and agnostic learnability is a fundamental phenomenon in learning theory.

Learning Theory PAC learning

Bounded Memory Active Learning through Enriched Queries

no code implementations · 9 Feb 2021 · Max Hopkins, Daniel Kane, Shachar Lovett, Michal Moshkovitz

The explosive growth of easily accessible unlabeled data has led to growing interest in active learning, a paradigm in which data-hungry learning algorithms adaptively select informative examples in order to lower prohibitively expensive labeling costs.

Active Learning
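
To make the paradigm concrete, here is a minimal pool-based active learning loop using plain uncertainty sampling. This is a generic sketch of the paradigm the abstract describes, not the bounded-memory, enriched-query algorithm of the paper; the oracle interface and seeding are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def uncertainty_sampling(pool_X, query_label, budget, seed=0):
    """Repeatedly label the pool point the current model is least sure
    about. `query_label(i)` is the (expensive) labeling oracle for pool
    index i; `budget` caps the total number of label queries."""
    rng = np.random.default_rng(seed)
    labeled, labels = [], {}
    for i in rng.choice(len(pool_X), size=2, replace=False):  # seed queries
        labeled.append(int(i)); labels[int(i)] = query_label(int(i))
    while len(labeled) < budget:
        y = np.array([labels[i] for i in labeled])
        if len(set(y)) < 2:                       # need both classes to fit
            j = int(rng.integers(len(pool_X)))
            while j in labels:
                j = int(rng.integers(len(pool_X)))
        else:
            model = LogisticRegression().fit(pool_X[labeled], y)
            p = model.predict_proba(pool_X)[:, 1]
            score = -np.abs(p - 0.5)              # closest to 0.5 = least sure
            score[labeled] = -np.inf              # never re-query a point
            j = int(np.argmax(score))
        labeled.append(j); labels[j] = query_label(j)
    return labels
```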

Point Location and Active Learning: Learning Halfspaces Almost Optimally

no code implementations · 23 Apr 2020 · Max Hopkins, Daniel M. Kane, Shachar Lovett, Gaurav Mahajan

Given a finite set $X \subset \mathbb{R}^d$ and a binary linear classifier $c: \mathbb{R}^d \to \{0, 1\}$, how many queries of the form $c(x)$ are required to learn the label of every point in $X$?

Active Learning Position
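
The question above has a "forced label" flavor: once some labels are known, a point need not be queried if every halfspace consistent with the answers so far agrees on it. The LP-feasibility sketch below illustrates that test for homogeneous halfspaces through the origin with ±1 signs (simplifying assumptions of ours); it is not the paper's near-optimal algorithm.

```python
import numpy as np
from scipy.optimize import linprog

def label_is_forced(x, queried, eps=1e-6):
    """Return +1/-1 if the label of x is already determined by the
    queried (point, label) pairs, or None if both signs remain possible
    and x must actually be queried."""
    def consistent_with(extra_pt, extra_sign):
        # Feasibility: is there a w satisfying sign(w.p) = y on all queried
        # points and giving `extra_sign` on extra_pt?  Encode w.p >= eps for
        # positives (and <= -eps for negatives) as A_ub w <= b_ub.
        A, b = [], []
        for pt, y in queried + [(extra_pt, extra_sign)]:
            s = 1 if y > 0 else -1
            A.append(-s * np.asarray(pt)); b.append(-eps)
        d = len(extra_pt)
        res = linprog(np.zeros(d), A_ub=np.array(A), b_ub=np.array(b),
                      bounds=[(-1, 1)] * d)      # box keeps the LP bounded
        return res.status == 0                   # 0 = feasible solution found
    can_be_pos = consistent_with(x, +1)
    can_be_neg = consistent_with(x, -1)
    if can_be_pos and not can_be_neg: return +1  # forced positive
    if can_be_neg and not can_be_pos: return -1  # forced negative
    return None                                  # still ambiguous

# Example: two positive answers can force a third label for free.
queried = [(np.array([1.0, 0.0]), +1), (np.array([0.0, 1.0]), +1)]
print(label_is_forced(np.array([1.0, 1.0]), queried))  # +1
```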

Noise-tolerant, Reliable Active Classification with Comparison Queries

no code implementations · 15 Jan 2020 · Max Hopkins, Daniel Kane, Shachar Lovett, Gaurav Mahajan

With the explosion of massive, widely available unlabeled data in recent years, finding label- and time-efficient, robust learning algorithms has become ever more important in theory and in practice.

Active Learning Classification

A Novel CMB Component Separation Method: Hierarchical Generalized Morphological Component Analysis

no code implementations · 17 Oct 2019 · Sebastian Wagner-Carena, Max Hopkins, Ana Diaz Rivero, Cora Dvorkin

We present a novel technique for Cosmic Microwave Background (CMB) foreground subtraction based on the framework of blind source separation.

blind source separation
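
To show what the blind source separation framework looks like in code, here is an off-the-shelf ICA baseline on synthetic 1-D mixtures. This illustrates BSS only; it does not reproduce the paper's hierarchical GMCA method, and the stand-in signals and mixing matrix are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
sources = np.c_[np.sin(40 * t),                  # stand-in "CMB" signal
                np.sign(np.sin(7 * t)),          # stand-in foreground 1
                rng.laplace(size=t.size)]        # stand-in foreground 2
mixing = rng.normal(size=(5, 3))                 # 5 observed "channels"
observed = sources @ mixing.T                    # (4000, 5) mixed observations

# Recover the sources from the mixtures alone (up to order and scale).
ica = FastICA(n_components=3, random_state=0)
recovered = ica.fit_transform(observed)
print(recovered.shape)                           # (4000, 3)
```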

The Power of Comparisons for Actively Learning Linear Classifiers

no code implementations · NeurIPS 2020 · Max Hopkins, Daniel M. Kane, Shachar Lovett

While previous results show that active learning performs no better than its supervised alternative for important concept classes such as linear separators, we show that by adding weak distributional assumptions and allowing comparison queries, active learning requires exponentially fewer samples.

Active Learning PAC learning
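
To see why comparison queries help for linear separators, here is a sketch (our illustration, not the paper's algorithm or bounds): comparisons sort the points by signed margin, after which labels are monotone along the sorted order, so one binary search labels everything with O(log n) label queries. The hidden separator and oracle interfaces are hypothetical.

```python
from functools import cmp_to_key
import numpy as np

def learn_labels(points, label_query, compare_query):
    """Label every point using comparison queries plus O(log n) label
    queries. `compare_query(x, y)` reports which point has the larger
    signed margin w.x - b; `label_query(x)` returns the 0/1 label."""
    order = sorted(range(len(points)),
                   key=cmp_to_key(lambda i, j:
                                  compare_query(points[i], points[j])))
    lo, hi = 0, len(points)          # find the first positive in sorted order
    while lo < hi:                   # binary search: O(log n) label queries
        mid = (lo + hi) // 2
        if label_query(points[order[mid]]): hi = mid
        else: lo = mid + 1
    labels = [0] * len(points)
    for rank, idx in enumerate(order):
        labels[idx] = int(rank >= lo)
    return labels

# Hidden separator (illustrative): w = (1, -2), b = 0.3.
w, b = np.array([1.0, -2.0]), 0.3
pts = np.random.default_rng(0).normal(size=(50, 2))
margin = lambda x: float(w @ x - b)
labels = learn_labels(list(pts),
                      lambda x: int(margin(x) > 0),
                      lambda x, y: (margin(x) > margin(y)) -
                                   (margin(x) < margin(y)))
assert labels == [int(margin(x) > 0) for x in pts]
```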

Simulated Annealing for JPEG Quantization

no code implementations · 3 Sep 2017 · Max Hopkins, Michael Mitzenmacher, Sebastian Wagner-Carena

JPEG is one of the most widely used image formats, but in some ways remains surprisingly unoptimized, perhaps because some natural optimizations would go outside the standard that defines JPEG.

Quantization
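
A generic sketch of the search the title describes: simulated annealing over an 8x8 JPEG quantization table with a caller-supplied cost. The toy cost, cooling schedule, and move set below are assumptions for illustration; the paper's actual rate-distortion objective is not reproduced.

```python
import numpy as np

def anneal_quant_table(cost, steps=5000, t0=1.0, seed=0):
    """Simulated annealing over an 8x8 quantization table. `cost(table)`
    stands in for a real objective combining compressed size and image
    distortion; entries stay in JPEG's legal range [1, 255]."""
    rng = np.random.default_rng(seed)
    table = np.full((8, 8), 16, dtype=int)       # arbitrary starting table
    best = cur = cost(table)
    best_table = table.copy()
    for step in range(steps):
        temp = t0 * (1 - step / steps)           # linear cooling schedule
        cand = table.copy()
        i, j = rng.integers(8, size=2)           # perturb one random entry
        cand[i, j] = np.clip(cand[i, j] + rng.choice([-2, -1, 1, 2]), 1, 255)
        c = cost(cand)
        # Accept improvements always; accept uphill moves with Boltzmann prob.
        if c < cur or rng.random() < np.exp(-(c - cur) / max(temp, 1e-9)):
            table, cur = cand, c
            if c < best:
                best, best_table = c, cand.copy()
    return best_table

# Toy stand-in cost: drive entries toward 12 (a real objective would
# measure file size and visual distortion of recompressed images).
toy_cost = lambda t: float(np.abs(t - 12).mean())
print(anneal_quant_table(toy_cost)[0])
```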
