Search Results for author: Guy Kornowski

Found 7 papers, 0 papers with code

Efficient Agnostic Learning with Average Smoothness

no code implementations29 Sep 2023 Steve Hanneke, Aryeh Kontorovich, Guy Kornowski

While the recent work of Hanneke et al. (2023) established tight uniform convergence bounds for average-smooth functions in the realizable case and provided a computationally efficient realizable learning algorithm, both of these results currently lack analogs in the general agnostic (i.e., noisy) case.
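
For context, the average (Hölder) smoothness studied in this line of work is, roughly, the Hölder constant of the target measured locally at each point and then averaged over the data distribution, rather than taken as a worst case over the whole domain. The display below is an assumed paraphrase of that notion (local slope $\Lambda^{\beta}_{f}(x)$, marginal distribution $\mu$, metric $\rho$), not a quotation from the paper:

$$ \Lambda^{\beta}_{f}(x) = \sup_{y \neq x} \frac{|f(x)-f(y)|}{\rho(x,y)^{\beta}}, \qquad \overline{\Lambda}^{\beta}_{\mu}(f) = \mathbb{E}_{x \sim \mu}\left[\Lambda^{\beta}_{f}(x)\right] \le \sup_{x} \Lambda^{\beta}_{f}(x). $$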

An Algorithm with Optimal Dimension-Dependence for Zero-Order Nonsmooth Nonconvex Stochastic Optimization

no code implementations10 Jul 2023 Guy Kornowski, Ohad Shamir

Recent works have proposed several stochastic zero-order algorithms that solve this task, all of which suffer from a dimension dependence of $\Omega(d^{3/2})$, where $d$ is the dimension of the problem; this dependence was conjectured to be optimal.

Stochastic Optimization
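
As an illustration of the kind of oracle access involved (a hedged sketch, not the algorithm proposed in this paper), stochastic zero-order methods for nonsmooth objectives are typically built on a two-point gradient estimator that queries only function values; the helper two_point_gradient_estimate and the test objective below are hypothetical.

    import numpy as np

    def two_point_gradient_estimate(f, x, delta, rng):
        """Two-point zero-order estimate of a (smoothed) gradient of f at x.

        f     : callable returning a (possibly noisy) function value
        x     : query point of shape (d,)
        delta : smoothing radius > 0
        rng   : numpy random Generator
        """
        d = x.shape[0]
        u = rng.normal(size=d)
        u /= np.linalg.norm(u)  # uniform random direction on the unit sphere
        # Finite difference along u; the d/(2*delta) scaling makes this an
        # unbiased estimate of the gradient of the ball-smoothed surrogate of f.
        return (d / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * u

    # Usage on a hypothetical nonsmooth test function:
    rng = np.random.default_rng(0)
    f = lambda x: np.abs(x).sum()
    g = two_point_gradient_estimate(f, np.ones(5), delta=0.1, rng=rng)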

From Tempered to Benign Overfitting in ReLU Neural Networks

no code implementations NeurIPS 2023 Guy Kornowski, Gilad Yehudai, Ohad Shamir

Thus, we show that the input dimension plays a crucial role in the type of overfitting that arises in this setting, which we also validate empirically for intermediate dimensions.

On the Complexity of Finding Small Subgradients in Nonsmooth Optimization

no code implementations21 Sep 2022 Guy Kornowski, Ohad Shamir

We study the oracle complexity of producing $(\delta,\epsilon)$-stationary points of Lipschitz functions, in the sense proposed by Zhang et al. [2020].
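
For reference, the $(\delta,\epsilon)$-stationarity of Zhang et al. [2020] is defined through the Goldstein $\delta$-subdifferential: a point is $(\delta,\epsilon)$-stationary when some convex combination of (sub)gradients taken within a $\delta$-ball around it has norm at most $\epsilon$:

$$ \partial_{\delta} f(x) = \mathrm{conv}\Bigl(\,\bigcup_{y:\ \|y-x\|\le\delta} \partial f(y)\Bigr), \qquad x \text{ is } (\delta,\epsilon)\text{-stationary} \iff \min_{g \in \partial_{\delta} f(x)} \|g\| \le \epsilon. $$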

Oracle Complexity in Nonsmooth Nonconvex Optimization

no code implementations NeurIPS 2021 Guy Kornowski, Ohad Shamir

For this approach, we prove, under a mild assumption, an inherent trade-off between oracle complexity and smoothness: on the one hand, smoothing a nonsmooth nonconvex function can be done very efficiently (e.g., by randomized smoothing), but with dimension-dependent factors in the smoothness parameter, which can strongly affect iteration complexity when plugged into standard smooth optimization methods.
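
To make the smoothing side of this trade-off concrete: uniform-ball randomized smoothing replaces an $L$-Lipschitz $f$ by its average over a small ball of radius $\delta$, which stays uniformly close to $f$ but whose gradient Lipschitz constant picks up a $\sqrt{d}$ factor (stated below up to constants, as commonly derived, rather than quoted from the paper):

$$ f_{\delta}(x) = \mathbb{E}_{u \sim \mathrm{Unif}(B_1)}\bigl[f(x+\delta u)\bigr], \qquad \|f_{\delta}-f\|_{\infty} \le L\delta, \qquad \|\nabla f_{\delta}(x) - \nabla f_{\delta}(y)\| \lesssim \frac{\sqrt{d}\,L}{\delta}\,\|x-y\|. $$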

High-Order Oracle Complexity of Smooth and Strongly Convex Optimization

no code implementations13 Oct 2020 Guy Kornowski, Ohad Shamir

In this note, we consider the complexity of optimizing a highly smooth (Lipschitz $k$-th order derivative) and strongly convex function, via calls to a $k$-th order oracle which returns the value and first $k$ derivatives of the function at a given point, and where the dimension is unrestricted.
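
In symbols, the $k$-th order oracle described here returns, for a query point $x$, the function value together with its first $k$ derivative tensors:

$$ \mathcal{O}^{(k)}_{f}(x) = \bigl(f(x),\, \nabla f(x),\, \nabla^{2} f(x),\, \dots,\, \nabla^{k} f(x)\bigr). $$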

