1 code implementation • 7 Nov 2023 • Nikita Puchkin, Eduard Gorbunov, Nikolay Kutuzov, Alexander Gasnikov
We consider stochastic optimization problems with heavy-tailed noise with structured density.
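A standard remedy for heavy-tailed gradient noise in stochastic optimization is gradient clipping. The sketch below is a generic clipped-SGD illustration on a quadratic objective with Student-t noise; it is not the algorithm from the paper, and the step size and clipping level are arbitrary choices.

```python
import numpy as np

def clipped_sgd(grad_oracle, x0, step=0.05, clip=2.0, n_iters=2000, rng=None):
    """Generic clipped SGD: project each stochastic gradient onto a ball
    of radius `clip` before taking a step (illustrative, not the paper's method)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        g = grad_oracle(x, rng)
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)  # clipping tames heavy-tailed noise
        x -= step * g
    return x

# Quadratic f(x) = ||x||^2 / 2 whose stochastic gradient carries
# heavy-tailed Student-t noise (df=2, so the variance is infinite).
def noisy_grad(x, rng):
    return x + rng.standard_t(df=2, size=x.shape)

x_final = clipped_sgd(noisy_grad, x0=np.ones(3))
```

Despite the infinite-variance noise, the clipped iterates stay in a neighborhood of the minimizer at the origin.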
no code implementations • 21 Feb 2023 • Nikita Puchkin, Nikita Zhivotovskiy
We consider the problem of stochastic convex optimization with exp-concave losses using Empirical Risk Minimization in a convex class.
2 code implementations • 21 Jun 2022 • Artur Goldman, Nikita Puchkin, Valeriia Shcherbakova, Uliana Vinogradova
We suggest a novel procedure for online change point detection.
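For context, a textbook baseline for online change point detection is the CUSUM statistic for a shift in the mean. The following is a minimal generic sketch of that classical detector, not the procedure proposed in the paper; the drift and threshold values are illustrative.

```python
import numpy as np

def cusum_online(stream, drift=0.5, threshold=8.0):
    """Classical one-sided CUSUM: return the index at which an upward
    mean shift is declared, or None if no alarm is raised."""
    s = 0.0
    for t, x in enumerate(stream):
        s = max(0.0, s + x - drift)  # accumulate evidence of a positive shift
        if s > threshold:
            return t
    return None

rng = np.random.default_rng(1)
pre = rng.normal(0.0, 1.0, size=100)   # in-control segment: mean 0
post = rng.normal(2.0, 1.0, size=100)  # after the change at t = 100: mean 2
alarm = cusum_online(np.concatenate([pre, post]))
```

With these parameters the statistic drifts back to zero before the change and crosses the threshold within a few observations after it.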
no code implementations • 31 Jan 2021 • Nikita Puchkin, Nikita Zhivotovskiy
We show that in pool-based active classification, without assumptions on the underlying distribution, if the learner may abstain from some predictions at a price marginally smaller than the average loss $1/2$ of a random guess, then exponential savings in the number of label requests are possible whenever they are possible in the corresponding realizable problem.
no code implementations • 30 Jan 2021 • Nikita Puchkin, Sergey Samsonov, Denis Belomestny, Eric Moulines, Alexey Naumov
In this work, we undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs).
1 code implementation • 15 Dec 2020 • Nikita Puchkin, Aleksandr Timofeev, Vladimir Spokoiny
Prediction for high-dimensional time series is a challenging task due to the curse of dimensionality.
Denoising • Time Series Forecasting • Statistics Theory
1 code implementation • 12 Jun 2019 • Nikita Puchkin, Vladimir Spokoiny
We consider the problem of manifold estimation from noisy observations.
no code implementations • 8 Apr 2018 • Nikita Puchkin, Vladimir Spokoiny
We consider the problem of multiclass classification, where the training sample $S_n = \{(X_i, Y_i)\}_{i=1}^n$ is generated from the model $\mathbb P(Y = m | X = x) = \eta_m(x)$, $1 \leq m \leq M$, and $\eta_1(x), \dots, \eta_M(x)$ are unknown $\alpha$-Hölder continuous functions. Given a test point $X$, our goal is to predict its label.
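The model above can be simulated directly: draw each label from the conditional distribution $(\eta_1(x), \dots, \eta_M(x))$ and predict with the Bayes rule $\arg\max_m \eta_m(x)$. The sketch below does this for $M = 2$ with a smooth (hence Hölder) illustrative choice of $\eta_m$; the specific functions are assumptions for the example, not taken from the paper.

```python
import numpy as np

def eta(x):
    """Class-probability vector (eta_1(x), eta_2(x)) for scalar x in [0, 1].
    The smooth choice below is illustrative, not from the paper."""
    p1 = 0.5 + 0.4 * np.sin(2 * np.pi * x)  # values in [0.1, 0.9]
    return np.array([p1, 1.0 - p1])

rng = np.random.default_rng(2)
n = 1000
X = rng.uniform(0.0, 1.0, size=n)
# Draw each label Y_i in {1, 2} from the conditional distribution eta(X_i).
Y = np.array([rng.choice([1, 2], p=eta(x)) for x in X])

def bayes_predict(x):
    """Bayes rule: predict the class with the largest eta_m(x)."""
    return int(np.argmax(eta(x))) + 1  # labels are 1-based

# Empirical accuracy of the Bayes rule on this sample.
acc = float(np.mean([bayes_predict(x) == y for x, y in zip(X, Y)]))
```

For this choice of $\eta_m$ the Bayes accuracy is $\mathbb E[\max(\eta_1(X), \eta_2(X))] = 0.5 + 0.4 \cdot \mathbb E|\sin(2\pi X)| \approx 0.75$, which the empirical estimate tracks closely.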