no code implementations • 22 Feb 2024 • Adam Block, Alexander Rakhlin, Abhishek Shetty
In order to circumvent statistical and computational hardness results in sequential decision-making, recent work has considered smoothed online learning, where the distribution of data at each time is assumed to have bounded likelihood ratio with respect to a base measure when conditioned on the history.
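The smoothness condition in the abstract can be illustrated concretely. A minimal sketch, for discrete distributions only: a distribution $p$ is $\sigma$-smooth with respect to a base measure $\mu$ when its likelihood ratio $dp/d\mu$ is bounded by $1/\sigma$ everywhere (the function name below is hypothetical, not from the paper).

```python
import numpy as np

def is_sigma_smooth(p, mu, sigma):
    """Check sigma-smoothness of discrete distribution p w.r.t. base mu:
    the likelihood ratio p/mu must be at most 1/sigma everywhere."""
    p, mu = np.asarray(p, dtype=float), np.asarray(mu, dtype=float)
    # The ratio is infinite wherever mu assigns zero mass but p does not.
    if np.any((mu == 0) & (p > 0)):
        return False
    ratio = np.where(mu > 0, p / np.maximum(mu, 1e-300), 0.0)
    return bool(np.max(ratio) <= 1.0 / sigma)

uniform = [0.25, 0.25, 0.25, 0.25]   # base measure mu
skewed  = [0.5, 0.3, 0.1, 0.1]       # adversary's choice p
# max ratio = 0.5 / 0.25 = 2, so skewed is sigma-smooth exactly when sigma <= 1/2
```

The larger $\sigma$ is, the closer Nature is forced to stay to the base measure; $\sigma = 1$ recovers the i.i.d.-from-$\mu$ setting, while $\sigma \to 0$ recovers the fully adversarial one.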
no code implementations • 13 Feb 2024 • Adam Block, Mark Bun, Rathin Desai, Abhishek Shetty, Steven Wu
Due to statistical lower bounds on the learnability of many function classes under privacy constraints, there has been recent interest in leveraging public data to improve the performance of private learning algorithms.
no code implementations • 17 Oct 2023 • Adam Block, Dylan J. Foster, Akshay Krishnamurthy, Max Simchowitz, Cyril Zhang
This work studies training instabilities of behavior cloning with deep neural networks.
no code implementations • NeurIPS 2023 • Zakaria Mhammedi, Adam Block, Dylan J. Foster, Alexander Rakhlin
A major challenge in reinforcement learning is to develop practical, sample-efficient algorithms for exploration in high-dimensional domains where generalization and function approximation are required.
no code implementations • 10 Feb 2023 • Adam Block, Alexander Rakhlin, Max Simchowitz
Smoothed online learning has emerged as a popular framework to mitigate the substantial loss in statistical and computational complexity that arises when one moves from classical to adversarial learning.
no code implementations • 9 Feb 2023 • Adam Block, Yury Polyanskiy
Suppose we are given access to $n$ independent samples from distribution $\mu$ and we wish to output one of them with the goal of making the output distributed as close as possible to a target distribution $\nu$.
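One natural baseline for this selection problem (a sketch of a standard technique, not necessarily the paper's algorithm) is sampling-importance-resampling: given $n$ draws from $\mu$, output one chosen with probability proportional to the likelihood ratio $\nu/\mu$, so the output is approximately distributed as $\nu$. The helper below is hypothetical and assumes the ratio is available as a callable.

```python
import numpy as np

def select_sample(samples, ratio, rng):
    """Pick one of `samples` (drawn from mu) with probability proportional
    to ratio(x) = nu(x)/mu(x), approximating a draw from nu."""
    w = np.array([ratio(x) for x in samples], dtype=float)
    w /= w.sum()
    return samples[rng.choice(len(samples), p=w)]

rng = np.random.default_rng(0)
# mu = N(0, 1), target nu = N(1, 1); likelihood ratio nu/mu = exp(x - 1/2)
draws = rng.standard_normal(5000)
picks = [select_sample(draws, lambda x: np.exp(x - 0.5), rng)
         for _ in range(2000)]
# the empirical mean of picks should be near 1, the mean of nu
```

The quality of the approximation degrades as $\nu$ moves away from $\mu$; quantifying the best achievable distance given only $n$ samples is the kind of question the abstract poses.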
no code implementations • NeurIPS 2023 • Adam Block, Max Simchowitz, Russ Tedrake
The problem of piecewise affine (PWA) regression and planning is of foundational importance to the study of online learning, control, and robotics, where it provides a theoretically and empirically tractable setting to study systems undergoing sharp changes in the dynamics.
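A toy example of the kind of system the abstract refers to: in a piecewise affine (PWA) system the state space is partitioned into regions, each with its own affine dynamics $x' = A_i x + b_i$, so the dynamics change sharply at region boundaries. The 1-D illustration below is hypothetical.

```python
# A 1-D piecewise affine map with two regions split at x = 0;
# crossing the boundary switches to a different affine law.
def pwa_step(x):
    if x < 0:
        return 0.5 * x + 1.0    # region 1: x' = 0.5 x + 1
    return -0.5 * x + 0.2       # region 2: x' = -0.5 x + 0.2

# Roll out a short trajectory that crosses the boundary.
traj = [-2.0]
for _ in range(5):
    traj.append(pwa_step(traj[-1]))
```

Even this simple map is non-smooth at the boundary, which is what makes regression and planning for PWA systems harder than for globally linear ones.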
no code implementations • 25 May 2022 • Adam Block, Max Simchowitz
Due to the drastic gap in complexity between sequential and batch statistical learning, recent work has studied a smoothed sequential learning setting, where Nature is constrained to select contexts with density bounded by $1/\sigma$ with respect to a known measure $\mu$.
no code implementations • 22 Apr 2022 • Adam Block, Rahul Kidambi, Daniel N. Hill, Thorsten Joachims, Inderjit S. Dhillon
A shortcoming of this approach is that users often do not know which query will provide the best retrieval performance on the current information retrieval system, meaning that any query autocompletion methods trained to mimic user behavior can lead to suboptimal query suggestions.
no code implementations • 9 Feb 2022 • Adam Block, Yuval Dagan, Noah Golowich, Alexander Rakhlin
We then prove a lower bound on the oracle complexity of any proper learning algorithm, which matches the oracle-efficient upper bounds up to a polynomial factor, thus demonstrating the existence of a statistical-computational gap in smooth online learning.
no code implementations • 8 Jun 2021 • Adam Block, Zeyu Jia, Yury Polyanskiy, Alexander Rakhlin
It has long been thought that high-dimensional data encountered in many practical machine learning tasks have low-dimensional structure, i.e., the manifold hypothesis holds.
no code implementations • 2 Feb 2021 • Adam Block, Yuval Dagan, Sasha Rakhlin
We introduce the technique of generic chaining and majorizing measures for controlling sequential Rademacher complexity.
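For context, the quantity being controlled is the sequential Rademacher complexity, which (in the standard formulation due to Rakhlin, Sridharan, and Tewari) replaces the fixed sample of the classical definition with a tree of possible context sequences:

$$\mathfrak{R}_n(\mathcal{F}) \;=\; \sup_{\mathbf{x}} \, \mathbb{E}_{\epsilon}\left[\, \sup_{f \in \mathcal{F}} \frac{1}{n}\sum_{t=1}^{n} \epsilon_t \, f\big(\mathbf{x}_t(\epsilon_{1:t-1})\big) \right],$$

where the supremum ranges over $\mathcal{X}$-valued trees $\mathbf{x}$ of depth $n$ and $\epsilon_1, \dots, \epsilon_n$ are i.i.d. Rademacher signs; each $\mathbf{x}_t$ may depend on the earlier signs, capturing the adaptivity of Nature.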
no code implementations • 19 Jun 2020 • Adam Block, Youssef Mroueh, Alexander Rakhlin, Jerret Ross
Recently, the task of image generation has attracted much attention.
no code implementations • 31 Jan 2020 • Adam Block, Youssef Mroueh, Alexander Rakhlin
We show that both DAE and DSM provide estimates of the score of the Gaussian smoothed population density, allowing us to apply the machinery of Empirical Processes.
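The identity behind this claim can be checked numerically in one dimension. A minimal sketch, under the assumption of Gaussian data: the denoising score matching (DSM) regression target $(x - \tilde{x})/\sigma^2$ has, as its least-squares fit in $\tilde{x}$, the score of the Gaussian-smoothed density. For $x \sim N(0,1)$ and noise level $\sigma$, the smoothed density is $N(0, 1+\sigma^2)$, whose score $-\tilde{x}/(1+\sigma^2)$ is linear, so a linear fit should recover slope $-1/(1+\sigma^2)$.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.5
x = rng.standard_normal(200_000)                 # clean samples from N(0, 1)
x_noisy = x + sigma * rng.standard_normal(x.shape)
target = (x - x_noisy) / sigma**2                # DSM regression target

# Least-squares linear fit: target ~ slope * x_noisy + intercept
slope, intercept = np.polyfit(x_noisy, target, 1)

# Analytic score of the smoothed density N(0, 1 + sigma^2) has
# slope -1/(1 + sigma^2) = -0.8 and intercept 0.
```

With enough samples the fitted slope matches the analytic value closely, which is the empirical content of "DSM estimates the score of the Gaussian-smoothed population density."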