no code implementations • 31 Mar 2024 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Sihan Liu, Nikos Zarifis
We study the efficient learnability of low-degree polynomial threshold functions (PTFs) in the presence of a constant fraction of adversarial corruptions.
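As a concrete illustration of the setting (a minimal sketch, not the paper's algorithm; the polynomial, dimension, and corruption rule below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, eps = 10, 10_000, 0.1   # dimension, sample size, corruption fraction (assumed)

# A degree-2 PTF: f(x) = sign(p(x)) for a low-degree polynomial p.
A = rng.normal(size=(d, d))
p = lambda X: np.einsum('ij,jk,ik->i', X, A, X) + X.sum(axis=1) - 1.0

X = rng.normal(size=(n, d))
y = np.sign(p(X))

# Adversarial corruption: the adversary may inspect all of (X, y) and
# alter an eps fraction of the pairs arbitrarily; flipping the labels
# with the smallest margin is one natural worst-case choice.
bad = np.argsort(np.abs(p(X)))[: int(eps * n)]
y[bad] *= -1
```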
no code implementations • 4 Mar 2024 • Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis
We study the problem of estimating the mean of an identity covariance Gaussian in the truncated setting, in the regime when the truncation set comes from a low-complexity family $\mathcal{C}$ of sets.
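A minimal sketch of why truncation makes this nontrivial: the naive empirical mean is biased toward the truncation set. The set $S$ below is an assumed halfspace, standing in for a member of a low-complexity family $\mathcal{C}$.

```python
import numpy as np

rng = np.random.default_rng(0)
d, mu = 3, np.array([1.0, 0.0, 0.0])

# Truncation set S from a simple family: an axis-aligned halfspace
# (an illustrative stand-in for a low-complexity family C).
in_S = lambda x: x[..., 0] > 0.5

# Rejection-sample from N(mu, I_d) conditioned on S.
samples = []
while len(samples) < 20_000:
    x = rng.normal(size=(4096, d)) + mu
    samples.extend(x[in_S(x)])
samples = np.array(samples[:20_000])

# The empirical mean is pulled toward S; recovering mu requires correcting
# for the truncation, which is exactly what the estimation problem asks.
print("empirical mean:", samples.mean(axis=0))  # first coordinate noticeably > 1
```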
no code implementations • 27 Feb 2024 • Nikos Zarifis, Puqian Wang, Ilias Diakonikolas, Jelena Diakonikolas
We give an efficient learning algorithm, achieving a constant factor approximation to the optimal loss, that succeeds under a range of distributions (including log-concave distributions) and a broad class of monotone and Lipschitz link functions.
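A minimal sketch of the generative model, with plain gradient descent on the square loss as a baseline; the logistic link, Gaussian marginal, step size, and iteration count are illustrative assumptions, and this is not the constant-factor-approximation algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 20, 50_000

def link(t):                     # logistic link: monotone and Lipschitz
    return 1.0 / (1.0 + np.exp(-t))

w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
X = rng.normal(size=(n, d))      # a log-concave marginal (standard Gaussian)
y = link(X @ w_star)             # clean GLM labels y = sigma(<w*, x>)

# Plain gradient descent on the empirical square loss.
w = np.zeros(d)
for _ in range(500):
    p = link(X @ w)
    grad = (X * ((p - y) * p * (1.0 - p))[:, None]).mean(axis=0)  # chain rule
    w -= 2.0 * grad

print("||w - w*|| =", np.linalg.norm(w - w_star))  # shrinks in this clean case
```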
no code implementations • 27 Dec 2023 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
In contrast, algorithms that rely only on random examples inherently require $d^{\mathrm{poly}(1/\epsilon)}$ samples and runtime, even for the basic problem of agnostically learning a single ReLU or a halfspace.
no code implementations • 6 Aug 2023 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
In contrast, under a worst-case or random ordering, the number of mistakes must be at least $\Omega(d \log n)$, even when the points are drawn uniformly from the unit sphere and the learner only needs to predict the labels for $1\%$ of them.
no code implementations • 28 Jun 2023 • Ilias Diakonikolas, Jelena Diakonikolas, Daniel M. Kane, Puqian Wang, Nikos Zarifis
Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on $1/\epsilon$ in the sample complexity is inherent for computationally efficient algorithms.
no code implementations • 22 Jun 2023 • Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis
In the special case where the separation is on the order of $k^{1/2}$, we additionally obtain fine-grained SQ lower bounds with the correct exponent.
no code implementations • 13 Jun 2023 • Puqian Wang, Nikos Zarifis, Ilias Diakonikolas, Jelena Diakonikolas
We study the problem of learning a single neuron with respect to the $L_2^2$-loss in the presence of adversarial label noise.
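A minimal sketch of the setting: realizable single-neuron labels, an adversarially corrupted fraction, and the $L_2^2$ loss whose minimum defines OPT. The corruption fraction and adversarial values are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, eta = 10, 20_000, 0.05       # eta: corrupted-label fraction (assumed)

relu = lambda t: np.maximum(t, 0.0)
w_star = rng.normal(size=d)
w_star /= np.linalg.norm(w_star)
X = rng.normal(size=(n, d))
y = relu(X @ w_star)

# Adversarial label noise: an eta fraction of labels is replaced arbitrarily.
bad = rng.choice(n, size=int(eta * n), replace=False)
y[bad] = 10.0                      # an arbitrary adversarial value

# L2^2 loss of a candidate w; OPT is its minimum over all w.
loss = lambda w: np.mean((relu(X @ w) - y) ** 2)
print("loss at w*:", loss(w_star))  # roughly eta times the squared adversarial offset
```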
no code implementations • 17 Jun 2022 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
For the ReLU activation, we give an efficient algorithm with sample complexity $\tilde{O}(d \, \mathrm{polylog}(1/\epsilon))$.
no code implementations • 19 Aug 2021 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
We study the general problem and establish the following: For $\eta <1/2$, we give a learning algorithm for general halfspaces with sample and computational complexity $d^{O_{\eta}(\log(1/\gamma))}\mathrm{poly}(1/\epsilon)$, where $\gamma =\max\{\epsilon, \min\{\mathbf{Pr}[f(\mathbf{x}) = 1], \mathbf{Pr}[f(\mathbf{x}) = -1]\} \}$ is the bias of the target halfspace $f$.
no code implementations • 10 Feb 2021 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
We study the problem of agnostically learning halfspaces under the Gaussian distribution.
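As a sketch of the agnostic model (the label construction below is one illustrative choice, not from the paper): the marginal is Gaussian, the labels are otherwise arbitrary, and the benchmark is the error OPT of the best halfspace.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 5, 100_000

X = rng.normal(size=(n, d))        # Gaussian marginal N(0, I_d)

# Agnostic labels: arbitrary. In this illustrative construction the best
# halfspace w_star disagrees with the labels on an OPT = 0.1 fraction.
w_star = np.eye(d)[0]
y = np.sign(X @ w_star)
flip = rng.random(n) < 0.1
y[flip] *= -1

err = lambda w: np.mean(np.sign(X @ w) != y)
print("error of best halfspace:", err(w_star))  # approximately OPT
# The agnostic goal: output a hypothesis with error at most OPT + eps.
```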
no code implementations • 8 Feb 2021 • Ilias Diakonikolas, Daniel M. Kane, Thanasis Pittas, Nikos Zarifis
We study the problem of agnostic learning under the Gaussian distribution.
no code implementations • 4 Oct 2020 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
We give the first polynomial-time algorithm for this fundamental learning problem.
no code implementations • NeurIPS 2020 • Ilias Diakonikolas, Daniel M. Kane, Nikos Zarifis
We study the fundamental problems of agnostically learning halfspaces and ReLUs under Gaussian marginals.
no code implementations • 22 Jun 2020 • Ilias Diakonikolas, Daniel M. Kane, Vasilis Kontonis, Nikos Zarifis
For the case of positive coefficients, we give the first polynomial-time algorithm for this learning problem for $k$ up to $\tilde{O}(\sqrt{\log d})$.
no code implementations • 11 Jun 2020 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
In the Tsybakov noise model, each label is independently flipped with some probability which is controlled by an adversary.
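A minimal sketch of such a label generator; the specific flip-probability profile $\eta(x)$ is an assumed example, the defining feature being that $\eta(x)$ may approach $1/2$ on a controlled fraction of points.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n = 5, 10_000

w = np.eye(d)[0]
X = rng.normal(size=(n, d))
clean = np.sign(X @ w)

# Tsybakov noise: the adversary picks a flip probability eta(x) <= 1/2 per
# point; eta(x) may approach 1/2, but only on a controlled fraction of
# points. Illustrative profile: noisiest near the decision boundary.
margin = np.abs(X @ w)
eta = 0.5 - np.minimum(margin, 1.0) / 2.0   # eta -> 1/2 as margin -> 0
y = np.where(rng.random(n) < eta, -clean, clean)
```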
no code implementations • NeurIPS 2020 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
We study the problem of agnostically learning homogeneous halfspaces in the distribution-specific PAC model.
no code implementations • 13 Feb 2020 • Ilias Diakonikolas, Vasilis Kontonis, Christos Tzamos, Nikos Zarifis
We study the problem of learning halfspaces with Massart noise in the distribution-specific PAC model.
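For contrast with the Tsybakov sketch above, a minimal Massart-noise label generator: here the adversary's per-point flip probability is capped by a constant $\eta < 1/2$ everywhere. The profile of $\eta(x)$ is again an assumed example.

```python
import numpy as np

rng = np.random.default_rng(4)
d, n, eta_max = 5, 10_000, 0.4     # Massart bound: eta(x) <= eta_max < 1/2

w = np.eye(d)[0]
X = rng.normal(size=(n, d))
clean = np.sign(X @ w)

# Massart (bounded) noise: the adversary chooses eta(x) in [0, eta_max]
# for each point; here it flips most aggressively near the boundary.
eta = eta_max * np.exp(-np.abs(X @ w))
y = np.where(rng.random(n) < eta, -clean, clean)
```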