22 Feb 2024 • Lu Yu, Arnak Dalalyan
We explore the sampling problem within the framework where parallel evaluations of the gradient of the log-density are feasible.
31 Jul 2023 • Elen Vardanyan, Arshak Minasyan, Sona Hunanyan, Tigran Galstyan, Arnak Dalalyan
Generative modeling is a widely used machine learning method with various applications in scientific and industrial fields.
14 Jun 2023 • Lu Yu, Avetik Karagulyan, Arnak Dalalyan
To better explain our method for establishing a computable upper bound, we analyze the midpoint discretization of the vanilla Langevin process.
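The midpoint discretization of the vanilla Langevin diffusion can be sketched as below. This is a minimal illustration, not the paper's scheme: the predictor-corrector form, the step size `h`, and the standard Gaussian target (so `grad_f(theta) = theta`) are all assumptions made for the example.

```python
import numpy as np

def grad_f(theta):
    # Hypothetical target: standard Gaussian, f(x) = ||x||^2 / 2, so grad f(x) = x
    return theta

def midpoint_langevin(theta0, h, n_steps, rng):
    """One common midpoint scheme for the Langevin diffusion
    d theta_t = -grad f(theta_t) dt + sqrt(2) dW_t.
    A half-step Euler predictor supplies the point at which the drift
    is evaluated for the full step; the Brownian increment xi1 is shared
    between predictor and corrector."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_steps):
        xi1 = rng.standard_normal(theta.shape)
        xi2 = rng.standard_normal(theta.shape)
        # predictor: half step (drift h/2, noise variance 2 * h/2 = h)
        mid = theta - 0.5 * h * grad_f(theta) + np.sqrt(h) * xi1
        # corrector: full step with the midpoint drift (total noise variance 2h)
        theta = theta - h * grad_f(mid) + np.sqrt(h) * xi1 + np.sqrt(h) * xi2
    return theta
```

For the Gaussian target this recursion is linear, which makes it easy to check that its stationary variance matches the target's up to a small discretization error.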
24 Oct 2022 • Arshak Minasyan, Tigran Galstyan, Sona Hunanyan, Arnak Dalalyan
If $n$ and $m$ are the sizes of these two sets, we assume that the matching map that should be recovered is defined on a subset of unknown cardinality $k^*\le \min(n, m)$.
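The observation model behind this setup can be sketched as follows. All sizes, the separation and noise scales, and the greedy nearest-neighbour estimate are illustrative assumptions; the pruning-by-confidence rule is a naive stand-in, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n, m, k_star = 5, 8, 10, 6        # hypothetical sizes; k* <= min(n, m)

# Latent features and a matching map pi* defined on a subset of size k*
theta = 5.0 * rng.normal(size=(m, d))
pi_star = rng.permutation(m)[:k_star]                      # matched feature indices
X = theta[pi_star] + 0.1 * rng.normal(size=(k_star, d))    # observations with a match
X = np.vstack([X, 5.0 * rng.normal(size=(n - k_star, d))]) # observations without one
Y = theta + 0.1 * rng.normal(size=(m, d))

# Naive estimate: match each X_i to its nearest Y_j,
# then keep only the k* most confident (smallest-distance) pairs
dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
nearest = dists.argmin(axis=1)
conf = dists.min(axis=1)
keep = np.argsort(conf)[:k_star]
est = {i: nearest[i] for i in keep}
```

With well-separated features and small noise, the matched rows have far smaller nearest-neighbour distances than the unmatched ones, so the confidence ranking isolates the true subset.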
NeurIPS 2021 • Tigran Galstyan, Arshak Minasyan, Arnak Dalalyan
The matching map is then an injection, which can be consistently estimated only if the vectors of the second set are well separated.
19 Oct 2020 • Nicolas Schreuder, Victor-Emmanuel Brunel, Arnak Dalalyan
In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective.
NeurIPS 2019 • Arnak Dalalyan, Philip Thompson
We study the problem of estimating a $p$-dimensional $s$-sparse vector in a linear model with Gaussian design.
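The model in this abstract can be sketched as below. The dimensions, the choice of tuning parameter `lam`, and the plain Lasso-via-ISTA solver are illustrative assumptions for the setup only, not the estimator studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, s = 200, 50, 5                      # hypothetical sample size, dimension, sparsity
X = rng.standard_normal((n, p))           # Gaussian design
beta_star = np.zeros(p)
beta_star[:s] = 3.0                       # s-sparse target vector
y = X @ beta_star + 0.5 * rng.standard_normal(n)

def soft(x, t):
    """Soft-thresholding, the proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

# Lasso via ISTA (proximal gradient) on (1/2n)||y - X b||^2 + lam * ||b||_1;
# lam follows the usual sigma * sqrt(2 log p / n) scaling, up to constants
lam = 0.5 * np.sqrt(2.0 * np.log(p) / n)
step = n / np.linalg.norm(X, 2) ** 2      # 1 / Lipschitz constant of the smooth part
beta = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ beta - y) / n
    beta = soft(beta - step * grad, step * lam)
```

In this well-conditioned regime the iterate recovers the support of `beta_star`, with the usual shrinkage bias of order `lam` on the active coordinates.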
NeurIPS 2012 • Arnak Dalalyan, Yin Chen
In this paper, we develop a novel approach to the problem of learning sparse representations in the context of fused sparsity and unknown noise level.
NeurIPS 2009 • Arnak Dalalyan, Renaud Keriven
We propose a new approach to the problem of robust estimation in multiview geometry.