Search Results for author: Nikos Zarifis

Found 18 papers, 0 papers with code

Super Non-singular Decompositions of Polynomials and their Application to Robustly Learning Low-degree PTFs

no code implementations • 31 Mar 2024 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Sihan Liu, Nikos Zarifis

We study the efficient learnability of low-degree polynomial threshold functions (PTFs) in the presence of a constant fraction of adversarial corruptions.

PAC learning
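A polynomial threshold function labels each point by the sign of a low-degree polynomial. A minimal sketch of a degree-2 PTF, with illustrative parameter names (`quad`, `lin`, `const` are not from the paper):

```python
import numpy as np

def ptf_predict(x, quad, lin, const):
    """Evaluate the degree-2 PTF sign(x^T Q x + w^T x + c); names are illustrative."""
    value = x @ quad @ x + lin @ x + const
    return 1 if value >= 0 else -1

# A degree-2 PTF can express concepts no halfspace can, e.g. membership in the unit disk:
Q = -np.eye(2)        # p(x) = 1 - ||x||^2
w = np.zeros(2)
c = 1.0
print(ptf_predict(np.array([0.1, 0.2]), Q, w, c))   # inside the disk: prints 1
print(ptf_predict(np.array([2.0, 0.0]), Q, w, c))   # outside the disk: prints -1
```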

Statistical Query Lower Bounds for Learning Truncated Gaussians

no code implementations • 4 Mar 2024 • Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis

We study the problem of estimating the mean of an identity covariance Gaussian in the truncated setting, in the regime when the truncation set comes from a low-complexity family $\mathcal{C}$ of sets.
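Concretely, truncation means the learner only observes samples that land inside the set; even truncating to a halfspace biases the naive empirical mean, which is what makes the estimation problem nontrivial. A minimal simulation (the truncation set, mean, and sample size are all illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, 0.0])          # unknown mean we would like to estimate (illustrative)

def truncated_samples(n, in_set):
    """Draw from N(mu, I) conditioned on landing in the truncation set (rejection sampling)."""
    out = []
    while len(out) < n:
        x = rng.normal(mu, 1.0, size=2)
        if in_set(x):
            out.append(x)
    return np.array(out)

# Truncate to the halfspace {x : x[0] >= 0}; the naive empirical mean is biased upward
# in the first coordinate, so a consistent estimator must correct for the truncation.
xs = truncated_samples(20000, lambda x: x[0] >= 0.0)
print(xs.mean(axis=0))
```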

Robustly Learning Single-Index Models via Alignment Sharpness

no code implementations • 27 Feb 2024 • Nikos Zarifis, Puqian Wang, Ilias Diakonikolas, Jelena Diakonikolas

We give an efficient learning algorithm, achieving a constant factor approximation to the optimal loss, that succeeds under a range of distributions (including log-concave distributions) and a broad class of monotone and Lipschitz link functions.
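A single-index model passes a one-dimensional projection of the input through an unknown link function, $y = \sigma(\langle \mathbf{w}, \mathbf{x}\rangle)$. A minimal sketch under the abstract's assumptions (a Gaussian, hence log-concave, marginal and a monotone Lipschitz link); the particular link, direction, and dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 5, 1000
w_star = np.ones(d) / np.sqrt(d)       # hidden direction (illustrative)

def link(t):
    # A monotone, 1-Lipschitz link; the paper allows a broad class of such links.
    return np.tanh(t)

X = rng.normal(size=(n, d))            # standard Gaussian marginal is log-concave
y = link(X @ w_star)                   # clean labels; the paper also handles label noise

def sq_loss(w):
    return float(np.mean((link(X @ w) - y) ** 2))

print(sq_loss(w_star))                 # exactly 0.0 on clean data
print(sq_loss(np.zeros(d)))            # a wrong direction incurs substantially higher loss
```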

Agnostically Learning Multi-index Models with Queries

no code implementations • 27 Dec 2023 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

In contrast, algorithms that rely only on random examples inherently require $d^{\mathrm{poly}(1/\epsilon)}$ samples and runtime, even for the basic problem of agnostically learning a single ReLU or a halfspace.

Dimensionality Reduction

Self-Directed Linear Classification

no code implementations • 6 Aug 2023 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

In contrast, under a worst-case or random ordering, the number of mistakes must be at least $\Omega(d \log n)$, even when the points are drawn uniformly from the unit sphere and the learner only needs to predict the labels for $1\%$ of them.

Classification

Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise

no code implementations • 28 Jun 2023 • Ilias Diakonikolas, Jelena Diakonikolas, Daniel M. Kane, Puqian Wang, Nikos Zarifis

Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on $1/\epsilon$ in the sample complexity is inherent for computationally efficient algorithms.

PAC learning

SQ Lower Bounds for Learning Bounded Covariance GMMs

no code implementations • 22 Jun 2023 • Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis

In the special case where the separation is on the order of $k^{1/2}$, we additionally obtain fine-grained SQ lower bounds with the correct exponent.

Robustly Learning a Single Neuron via Sharpness

no code implementations • 13 Jun 2023 • Puqian Wang, Nikos Zarifis, Ilias Diakonikolas, Jelena Diakonikolas

We study the problem of learning a single neuron with respect to the $L_2^2$-loss in the presence of adversarial label noise.

Learning a Single Neuron with Adversarial Label Noise via Gradient Descent

no code implementations • 17 Jun 2022 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

For the ReLU activation, we give an efficient algorithm with sample complexity $\tilde{O}(d\, \mathrm{polylog}(1/\epsilon))$.
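As a rough sketch of the setting (not the paper's algorithm or its rates): fit a single ReLU neuron by plain gradient descent on the square loss, with a small fraction of labels corrupted arbitrarily. All constants below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 3, 2000
w_star = np.array([1.0, -0.5, 0.25])            # target neuron (illustrative)

relu = lambda t: np.maximum(t, 0.0)
X = rng.normal(size=(n, d))
y = relu(X @ w_star)
y[: n // 100] = rng.normal(size=n // 100)       # adversarial label noise on a 1% fraction

w = 0.1 * rng.normal(size=d)                    # nonzero init so the ReLU has active examples
lr = 0.1
for _ in range(500):
    err = relu(X @ w) - y
    # Gradient of the square loss; the ReLU derivative zeroes out inactive examples.
    grad = ((err * (X @ w > 0.0))[:, None] * X).mean(axis=0)
    w -= lr * grad
print(np.linalg.norm(w - w_star))               # lands near the target despite the corruption
```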

Learning General Halfspaces with General Massart Noise under the Gaussian Distribution

no code implementations • 19 Aug 2021 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

We study the general problem and establish the following: for $\eta < 1/2$, we give a learning algorithm for general halfspaces with sample and computational complexity $d^{O_{\eta}(\log(1/\gamma))}\mathrm{poly}(1/\epsilon)$, where $\gamma = \max\{\epsilon, \min\{\mathbf{Pr}[f(\mathbf{x}) = 1], \mathbf{Pr}[f(\mathbf{x}) = -1]\}\}$ is the bias of the target halfspace $f$.

PAC learning

Algorithms and SQ Lower Bounds for PAC Learning One-Hidden-Layer ReLU Networks

no code implementations • 22 Jun 2020 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Nikos Zarifis

For the case of positive coefficients, we give the first polynomial-time algorithm for this learning problem for $k$ up to $\tilde{O}(\sqrt{\log d})$.

Open-Ended Question Answering, PAC learning

Learning Halfspaces with Tsybakov Noise

no code implementations • 11 Jun 2020 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

In the Tsybakov noise model, each label is independently flipped with some probability that is controlled by an adversary.

PAC learning
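A minimal simulation of the noise model described above, with one arbitrary (illustrative) adversarial choice of the flip-probability function $\eta(\mathbf{x})$, which approaches $1/2$ near the decision boundary:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 5000, 2
w = np.array([1.0, 0.0])                    # target halfspace normal (illustrative)

X = rng.normal(size=(n, d))
clean = np.sign(X @ w)

# Tsybakov-style noise: the adversary flips each label with probability eta(x) < 1/2,
# and may push eta(x) toward 1/2 for points close to the decision boundary.
margin = np.abs(X @ w)
eta = 0.5 - np.minimum(margin, 1.0) / 2.0   # one adversarial choice, not the paper's
flips = rng.random(n) < eta
y = np.where(flips, -clean, clean)

print(np.mean(y != clean))                  # overall fraction of flipped labels
```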

Non-Convex SGD Learns Halfspaces with Adversarial Label Noise

no code implementations • NeurIPS 2020 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis

We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model.
