Search Results for author: Nikita Zhivotovskiy

Found 21 papers, 0 papers with code

High-Probability Risk Bounds via Sequential Predictors

no code implementations15 Aug 2023 Dirk van der Hoeven, Nikita Zhivotovskiy, Nicolò Cesa-Bianchi

Online learning methods yield sequential regret bounds under minimal assumptions and provide in-expectation risk bounds for statistical learning.

Density Estimation regression
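
For context, the in-expectation risk bounds mentioned above typically come from the classical online-to-batch conversion (stated here for convex losses as a standard fact, not a claim from this paper): if an online algorithm incurs regret $\mathrm{Reg}_T$ over $T$ rounds, then the averaged predictor $\bar f_T = \frac{1}{T}\sum_{t=1}^{T} f_t$ trained on an i.i.d. sample satisfies

$$ \mathbb{E}\big[R(\bar f_T)\big] \;-\; \inf_{f \in \mathcal{F}} R(f) \;\le\; \frac{\mathbb{E}\big[\mathrm{Reg}_T\big]}{T}. $$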

Local Risk Bounds for Statistical Aggregation

no code implementations29 Jun 2023 Jaouad Mourtada, Tomas Vaškevičius, Nikita Zhivotovskiy

In this paper, we revisit and tighten classical results in the theory of aggregation in the statistical setting by replacing the global complexity with a smaller, local one.

regression

Optimal PAC Bounds Without Uniform Convergence

no code implementations18 Apr 2023 Ishaq Aden-Ali, Yeshwanth Cherapanamjeri, Abhishek Shetty, Nikita Zhivotovskiy

In this paper, we address this issue by providing optimal high probability risk bounds through a framework that surpasses the limitations of uniform convergence arguments.

Binary Classification Classification +1

Exploring Local Norms in Exp-concave Statistical Learning

no code implementations21 Feb 2023 Nikita Puchkin, Nikita Zhivotovskiy

We consider the problem of stochastic convex optimization with exp-concave losses using Empirical Risk Minimization in a convex class.

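
As background (a standard definition, not specific to this paper), a loss $\ell$ is $\eta$-exp-concave on a convex domain if

$$ w \;\mapsto\; \exp\big(-\eta\,\ell(w)\big) \quad \text{is concave}. $$

Standard examples include the logarithmic loss and the squared loss on a bounded domain.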

Statistically Optimal Robust Mean and Covariance Estimation for Anisotropic Gaussians

no code implementations21 Jan 2023 Arshak Minasyan, Nikita Zhivotovskiy

Assume that $X_{1}, \ldots, X_{N}$ is an $\varepsilon$-contaminated sample of $N$ independent Gaussian vectors in $\mathbb{R}^d$ with mean $\mu$ and covariance $\Sigma$.
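
One common way to formalize an $\varepsilon$-contaminated sample (a standard adversarial contamination model, stated here for illustration rather than quoted from the paper) is that up to $\lfloor \varepsilon N \rfloor$ of the Gaussian vectors may be replaced by arbitrary points:

$$ Y_i \;=\; \begin{cases} X_i, & i \notin \mathcal{O},\\ \text{arbitrary}, & i \in \mathcal{O}, \end{cases} \qquad |\mathcal{O}| \le \varepsilon N, \quad X_i \sim \mathcal{N}(\mu, \Sigma) \ \text{i.i.d.}, $$

and the goal is to estimate $\mu$ and $\Sigma$ from the observed $Y_1, \ldots, Y_N$.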

Robustifying Markowitz

no code implementations28 Dec 2022 Wolfgang Karl Härdle, Yegor Klochkov, Alla Petukhina, Nikita Zhivotovskiy

Markowitz mean-variance portfolios with sample mean and covariance as input parameters feature numerous issues in practice.

Time Series Time Series Analysis
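
For reference, a minimal sketch of the classical plug-in Markowitz construction the abstract refers to, with the sample mean and sample covariance as inputs; this is the fragile baseline rather than the paper's robustified procedure, and the function name and toy data below are illustrative assumptions.

```python
import numpy as np

def plugin_markowitz(returns: np.ndarray, risk_aversion: float = 1.0) -> np.ndarray:
    """returns: (T, d) matrix of asset returns; outputs portfolio weights summing to 1."""
    mu_hat = returns.mean(axis=0)               # sample mean of returns
    sigma_hat = np.cov(returns, rowvar=False)   # sample covariance of returns
    # Unconstrained mean-variance solution w proportional to Sigma^{-1} mu, then normalized.
    w = np.linalg.solve(sigma_hat + 1e-8 * np.eye(len(mu_hat)), mu_hat) / risk_aversion
    return w / w.sum()

# Toy example: 250 days of returns for 5 assets drawn from a Gaussian model.
rng = np.random.default_rng(0)
weights = plugin_markowitz(rng.normal(0.001, 0.01, size=(250, 5)))
```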

The One-Inclusion Graph Algorithm is not Always Optimal

no code implementations19 Dec 2022 Ishaq Aden-Ali, Yeshwanth Cherapanamjeri, Abhishek Shetty, Nikita Zhivotovskiy

In one of the first COLT open problems, Warmuth conjectured that this prediction strategy always implies an optimal high probability bound on the risk, and hence is also an optimal PAC algorithm.

A Regret-Variance Trade-Off in Online Learning

no code implementations6 Jun 2022 Dirk van der Hoeven, Nikita Zhivotovskiy, Nicolò Cesa-Bianchi

We prove that a variant of EWA either achieves negative regret (i.e., the algorithm outperforms the best expert), or guarantees an $O(\log K)$ bound on both variance and regret.

Model Selection
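
For reference, a sketch of the standard exponentially weighted average (EWA) forecaster over $K$ experts that the abstract's variant builds on; this generic version does not implement the paper's modified algorithm, and the function name and example values are illustrative assumptions.

```python
import numpy as np

def ewa_weights(cumulative_losses: np.ndarray, eta: float) -> np.ndarray:
    """cumulative_losses: length-K vector of each expert's total loss so far."""
    shifted = cumulative_losses - cumulative_losses.min()  # shift for numerical stability
    w = np.exp(-eta * shifted)
    return w / w.sum()

# Example: 3 experts with cumulative losses 10, 12 and 11 after some rounds.
p = ewa_weights(np.array([10.0, 12.0, 11.0]), eta=0.5)
prediction = p @ np.array([0.2, 0.7, 0.4])  # weighted average of the experts' current forecasts
```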

Stability and Deviation Optimal Risk Bounds with Convergence Rate $O(1/n)$

no code implementations NeurIPS 2021 Yegor Klochkov, Nikita Zhivotovskiy

The sharpest known high probability generalization bounds for uniformly stable algorithms (Feldman, Vondrák, 2018, 2019), (Bousquet, Klochkov, Zhivotovskiy, 2020) contain a generally inevitable sampling error term of order $\Theta(1/\sqrt{n})$.

Generalization Bounds
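
As background, an algorithm $A$ is uniformly stable with parameter $\gamma$ if replacing any single training example changes its loss on every point by at most $\gamma$:

$$ \sup_{z}\, \big| \ell(A(S), z) - \ell(A(S^{(i)}), z) \big| \;\le\; \gamma \quad \text{for all samples } S \text{ of size } n \text{ and all } i \in \{1, \ldots, n\}, $$

where $S^{(i)}$ denotes $S$ with its $i$-th example replaced.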

Distribution-Free Robust Linear Regression

no code implementations25 Feb 2021 Jaouad Mourtada, Tomas Vaškevičius, Nikita Zhivotovskiy

In this distribution-free regression setting, we show that boundedness of the conditional second moment of the response given the covariates is a necessary and sufficient condition for achieving nontrivial guarantees.

regression
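
In symbols (notation ours, for illustration), the boundedness condition on the conditional second moment of the response reads

$$ \operatorname*{ess\,sup}_{x}\; \mathbb{E}\big[\, Y^2 \mid X = x \,\big] \;\le\; m^2 \;<\; \infty. $$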

Exponential Savings in Agnostic Active Learning through Abstention

no code implementations31 Jan 2021 Nikita Puchkin, Nikita Zhivotovskiy

We show that in pool-based active classification without assumptions on the underlying distribution, if the learner is given the power to abstain from some predictions by paying a price marginally smaller than the average loss $1/2$ of a random guess, then exponential savings in the number of label requests are possible whenever they are possible in the corresponding realizable problem.

Active Learning Classification +1

On Mean Estimation for Heteroscedastic Random Variables

no code implementations22 Oct 2020 Luc Devroye, Silvio Lattanzi, Gabor Lugosi, Nikita Zhivotovskiy

We study the problem of estimating the common mean $\mu$ of $n$ independent symmetric random variables with different and unknown standard deviations $\sigma_1 \le \sigma_2 \le \cdots \le \sigma_n$.
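
In symbols (notation ours), the observation model described above can be written as

$$ X_i \;=\; \mu + \sigma_i \varepsilon_i, \qquad i = 1, \ldots, n, $$

where the $\varepsilon_i$ are independent, symmetric about zero, and have unit variance, while the scales $\sigma_1 \le \cdots \le \sigma_n$ are unknown; the goal is to estimate the common mean $\mu$.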

Suboptimality of Constrained Least Squares and Improvements via Non-Linear Predictors

no code implementations19 Sep 2020 Tomas Vaškevičius, Nikita Zhivotovskiy

We study the problem of predicting as well as the best linear predictor in a bounded Euclidean ball with respect to the squared loss.
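
In symbols (notation ours), the goal is to output a predictor $\widehat f$ with small excess risk relative to the best linear predictor in a Euclidean ball of radius $R$:

$$ \mathbb{E}\big[(Y - \widehat f(X))^2\big] \;-\; \min_{\|w\| \le R}\; \mathbb{E}\big[(Y - \langle w, X \rangle)^2\big]. $$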

Proper Learning, Helly Number, and an Optimal SVM Bound

no code implementations24 May 2020 Olivier Bousquet, Steve Hanneke, Shay Moran, Nikita Zhivotovskiy

It has recently been shown by Hanneke (2016) that the optimal sample complexity of PAC learning for any VC class $C$ is achieved by a particular improper learning algorithm, which outputs a specific majority vote of hypotheses in $C$. This leaves open the question of when this bound can be achieved by proper learning algorithms, which are restricted to always output a hypothesis from $C$. In this paper we aim to characterize the classes for which the optimal sample complexity can be achieved by a proper learning algorithm.

PAC learning

Robust $k$-means Clustering for Distributions with Two Moments

no code implementations6 Feb 2020 Yegor Klochkov, Alexey Kroshnin, Nikita Zhivotovskiy

We consider robust algorithms for the $k$-means clustering problem, where a quantizer is constructed based on $N$ independent observations.

Clustering
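
For reference, a minimal sketch of the standard (non-robust) Lloyd-style $k$-means quantizer built from $N$ observations; the paper studies robust variants of this construction, which are not reproduced here, and the function name and toy data are illustrative assumptions.

```python
import numpy as np

def kmeans_quantizer(X: np.ndarray, k: int, iters: int = 50, seed: int = 0) -> np.ndarray:
    """X: (N, d) array of observations; returns the k centers forming the quantizer."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign every point to its nearest center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # recompute each center as the mean of its cluster (keep the old center if empty)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

# Example: 500 two-dimensional observations quantized with k = 3 centers.
rng = np.random.default_rng(1)
centers = kmeans_quantizer(rng.normal(size=(500, 2)), k=3)
```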

Fast Rates for Online Prediction with Abstention

no code implementations28 Jan 2020 Gergely Neu, Nikita Zhivotovskiy

In the setting of sequential prediction of individual $\{0, 1\}$-sequences with expert advice, we show that by allowing the learner to abstain from the prediction by paying a cost marginally smaller than $\frac{1}{2}$ (say, $0.49$), it is possible to achieve expected regret bounds that are independent of the time horizon $T$.
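
In this protocol the per-round loss can be written as follows (notation ours): the learner either predicts $\hat y_t \in \{0, 1\}$ and pays the usual mistake loss, or abstains and pays a fixed cost $c$ slightly below $\frac{1}{2}$,

$$ \ell_t \;=\; \begin{cases} \mathbf{1}\{\hat y_t \neq y_t\}, & \text{if the learner predicts},\\[2pt] c, & \text{if the learner abstains}, \end{cases} \qquad c < \tfrac{1}{2}. $$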

Fast classification rates without standard margin assumptions

no code implementations28 Oct 2019 Olivier Bousquet, Nikita Zhivotovskiy

First, we consider classification with a reject option, namely Chow's reject option model, and show that by slightly lowering the impact of hard instances, a learning rate of order $O\left(\frac{d}{n}\log \frac{n}{d}\right)$ is always achievable in the agnostic setting by a specific learning algorithm.

Classification General Classification +1
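
For context, Chow's reject option model charges a fixed cost $\lambda \in (0, \tfrac{1}{2})$ for abstaining and the usual zero-one loss otherwise (a standard formulation, not quoted verbatim from the paper):

$$ \ell\big(f(x), y\big) \;=\; \begin{cases} \lambda, & f(x) = \text{reject},\\[2pt] \mathbf{1}\{f(x) \neq y\}, & \text{otherwise}. \end{cases} $$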

Sharper bounds for uniformly stable algorithms

no code implementations17 Oct 2019 Olivier Bousquet, Yegor Klochkov, Nikita Zhivotovskiy

In a series of recent breakthrough papers by Feldman and Vondrák (2018, 2019), it was shown that the best known high probability upper bounds for uniformly stable learning algorithms, due to Bousquet and Elisseeff (2002), are suboptimal in some natural regimes.

Generalization Bounds Learning Theory

When are epsilon-nets small?

no code implementations28 Nov 2017 Andrey Kupavskii, Nikita Zhivotovskiy

In many interesting situations, the size of $\epsilon$-nets depends only on $\epsilon$ together with different complexity measures.
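
Recall the standard definition (stated here for reference): given a probability measure $\mu$ on a ground set $X$ and a family of ranges $\mathcal{R}$, a set $N \subseteq X$ is an $\epsilon$-net if it hits every heavy range,

$$ N \cap R \neq \emptyset \quad \text{for every } R \in \mathcal{R} \text{ with } \mu(R) \ge \epsilon. $$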

Permutational Rademacher Complexity: a New Complexity Measure for Transductive Learning

no code implementations12 May 2015 Ilya Tolstikhin, Nikita Zhivotovskiy, Gilles Blanchard

This paper introduces a new complexity measure for transductive learning called Permutational Rademacher Complexity (PRC) and studies its properties.

Learning Theory Transductive Learning
