no code implementations • 2 Apr 2024 • Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
Recent work of Klivans, Stavropoulos, and Vasilyan initiated the study of testable learning with distribution shift (TDS learning), in which a learner is given labeled samples from a training distribution $\mathcal{D}$ and unlabeled samples from a test distribution $\mathcal{D}'$, and the goal is to output a classifier with low error on $\mathcal{D}'$ whenever the training samples pass a corresponding test.
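The TDS protocol described above can be sketched in code. This is an illustrative toy, not the paper's algorithm: the tester here is a hypothetical low-order moment comparison between the training and test covariates, and the learner is a least-squares halfspace fit; all function names and the tolerance `tol` are assumptions for the sketch.

```python
import numpy as np

def moment_test(train_x, test_x, tol=0.2):
    """Hypothetical tester: accept only if the low-order empirical
    moments of the test covariates match those of the training
    covariates (a stand-in for the paper's actual test)."""
    for k in (1, 2):
        m_train = np.mean(train_x ** k, axis=0)
        m_test = np.mean(test_x ** k, axis=0)
        if np.max(np.abs(m_train - m_test)) > tol:
            return False  # reject: apparent distribution shift
    return True

def tds_learn(train_x, train_y, test_x):
    """Sketch of the TDS interface: run the test, and only when it
    accepts, return a classifier (here a least-squares halfspace)."""
    if not moment_test(train_x, test_x):
        return None  # test rejects; no error guarantee is required
    w, *_ = np.linalg.lstsq(train_x, train_y, rcond=None)
    return lambda x: np.sign(x @ w)
```

When $\mathcal{D} = \mathcal{D}'$ the test accepts with high probability and the returned halfspace has low test error; under a large mean shift the test rejects and the learner abstains.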
no code implementations • 25 Nov 2023 • Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
In this model, a learner outputs a classifier with low test error whenever samples from $D$ and $D'$ pass an associated test; moreover, the test must accept if the marginal of $D$ equals the marginal of $D'$.
no code implementations • 28 Feb 2023 • Aravind Gollakota, Adam R. Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
Prior work on testable learning ignores the labels in the training set and checks that the empirical moments of the covariates are close to the moments of the base distribution.
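The label-oblivious check described above can be illustrated concretely. This is a minimal sketch, assuming the base distribution is a standard Gaussian, whose moments satisfy $\mathbb{E}[g^k] = 0$ for odd $k$ and $(k-1)!!$ for even $k$; the function name and tolerance are hypothetical.

```python
import numpy as np

def gaussian_moment_check(x, tol=0.3):
    """Label-oblivious moment test (sketch): compare the empirical
    moments of each covariate, up to degree 4, against the moments
    of the standard Gaussian base distribution."""
    gaussian_moments = {1: 0.0, 2: 1.0, 3: 0.0, 4: 3.0}
    for k, target in gaussian_moments.items():
        emp = np.mean(x ** k, axis=0)
        if np.max(np.abs(emp - target)) > tol:
            return False  # covariates inconsistent with the base distribution
    return True
```

Note that the labels never enter the check; the point of the work above is that this can be insufficient, since the hard part of the learning problem may be hidden in the labels.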
no code implementations • 24 Oct 2022 • Alkis Kalavasis, Konstantinos Stavropoulos, Manolis Zampetakis
In this work, we address two questions: (i) Are there general families of SIIRVs with unbounded support that can be learned with sample complexity independent of both $n$ and the maximal element of the support?
no code implementations • 2 Nov 2020 • Dimitris Fotakis, Alkis Kalavasis, Konstantinos Stavropoulos
We consider the problem of learning the true ordering of a set of alternatives from largely incomplete and noisy rankings.
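The aggregation problem above can be illustrated with a simple Borda-style heuristic: score each alternative by its average normalized position across the (possibly partial) rankings it appears in, then sort. This is an assumption-laden sketch for intuition, not the estimator analyzed in the paper.

```python
import numpy as np

def aggregate_rankings(rankings, n_items):
    """Borda-style sketch: estimate the true ordering from incomplete
    noisy rankings by averaging each item's normalized position.
    Each ranking is a list of item ids, best first."""
    scores = np.zeros(n_items)
    counts = np.zeros(n_items)
    for r in rankings:
        m = len(r)
        for pos, item in enumerate(r):
            scores[item] += pos / max(m - 1, 1)  # normalize to [0, 1]
            counts[item] += 1
    # items never observed keep score 0; avoid division by zero
    return np.argsort(scores / np.maximum(counts, 1))
```

Under uniformly random subsets with mild noise, each item's expected normalized position is monotone in its true rank, so with enough rankings the sorted averages recover the true ordering.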