Search Results for author: Aram-Alexandre Pooladian

Found 15 papers, 6 papers with code

Conditional simulation via entropic optimal transport: Toward non-parametric estimation of conditional Brenier maps

no code implementations • 11 Nov 2024 • Ricardo Baptista, Aram-Alexandre Pooladian, Michael Brennan, Youssef Marzouk, Jonathan Niles-Weed

Conditional simulation is a fundamental task in statistical modeling: Generate samples from the conditionals given finitely many data points from a joint distribution.

Bayesian Inference

Plug-in estimation of Schrödinger bridges

1 code implementation • 21 Aug 2024 • Aram-Alexandre Pooladian, Jonathan Niles-Weed

Instead, we show that the potentials obtained from solving the static entropic optimal transport problem between the source and target samples can be modified to yield a natural plug-in estimator of the time-dependent drift that defines the bridge between two measures.

Progressive Entropic Optimal Transport Solvers

no code implementations • 7 Jun 2024 • Parnian Kassraie, Aram-Alexandre Pooladian, Michal Klein, James Thornton, Jonathan Niles-Weed, Marco Cuturi

Optimal transport (OT) has profoundly impacted machine learning by providing theoretical and computational tools to realign datasets.

Neural Optimal Transport with Lagrangian Costs

1 code implementation • 1 Jun 2024 • Aram-Alexandre Pooladian, Carles Domingo-Enrich, Ricky T. Q. Chen, Brandon Amos

We investigate the optimal transport problem between probability measures when the underlying cost function is understood to satisfy a least action principle, also known as a Lagrangian cost.

Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space

1 code implementation • 5 Dec 2023 • Yiheng Jiang, Sinho Chewi, Aram-Alexandre Pooladian

We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods.

Variational Inference

Learning Elastic Costs to Shape Monge Displacements

no code implementations • 20 Jun 2023 • Michal Klein, Aram-Alexandre Pooladian, Pierre Ablin, Eugène Ndiaye, Jonathan Niles-Weed, Marco Cuturi

Given a source and a target probability measure supported on $\mathbb{R}^d$, the Monge problem asks to find the most efficient way to map one distribution to the other.
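In the discrete, uniformly weighted case the Monge problem reduces to a linear assignment, which gives a quick way to experiment. A toy numpy/scipy sketch with the plain squared-Euclidean cost (not the elastic costs studied in the paper; all sizes and data below are made up for illustration):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 2))        # source point cloud
y = rng.normal(size=(8, 2)) + 3.0  # target point cloud

# Pairwise squared-Euclidean costs C[i, j] = ||x_i - y_j||^2.
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)

# With equally many uniformly weighted points, the Monge problem is a
# linear assignment: find the permutation with least total cost.
rows, cols = linear_sum_assignment(C)
monge_cost = C[rows, cols].sum()
```

The optimal assignment can only improve on any fixed pairing, such as the identity pairing with cost `C.trace()`.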

Multisample Flow Matching: Straightening Flows with Minibatch Couplings

no code implementations • 28 Apr 2023 • Aram-Alexandre Pooladian, Heli Ben-Hamu, Carles Domingo-Enrich, Brandon Amos, Yaron Lipman, Ricky T. Q. Chen

Simulation-free methods for training continuous-time generative models construct probability paths that go between noise distributions and individual data samples.
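A minimal sketch of the minibatch-coupling idea (a toy numpy/scipy illustration, not the paper's training pipeline): instead of pairing noise and data samples independently, an optimal assignment within the batch yields shorter, straighter interpolation paths.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
noise = rng.normal(size=(16, 2))        # x_0 ~ N(0, I)
data = rng.normal(size=(16, 2)) + 4.0   # x_1: a stand-in "data" batch

# Independent coupling: pair noise[i] with data[i] in arrival order.
indep_cost = ((noise - data) ** 2).sum()

# Minibatch OT coupling: re-pair within the batch by solving an
# assignment on squared distances, shortening the training paths.
C = ((noise[:, None] - data[None, :]) ** 2).sum(-1)
_, perm = linear_sum_assignment(C)
ot_cost = C[np.arange(16), perm].sum()

# Straight-line interpolants x_t = (1 - t) x_0 + t x_1 are then drawn
# between the re-paired samples (noise[i], data[perm[i]]).
```

Since the identity pairing is itself a feasible assignment, the re-paired batch never has larger total squared displacement than independent pairing.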

An Explicit Expansion of the Kullback-Leibler Divergence along its Fisher-Rao Gradient Flow

no code implementations • 23 Feb 2023 • Carles Domingo-Enrich, Aram-Alexandre Pooladian

In this short note, we complement these existing results in the literature by providing an explicit expansion of $\text{KL}(\rho_t^{\text{FR}}\|\pi)$ in terms of $e^{-t}$, where $(\rho_t^{\text{FR}})_{t\geq 0}$ is the FR gradient flow of the KL divergence.

Minimax estimation of discontinuous optimal transport maps: The semi-discrete case

no code implementations • 26 Jan 2023 • Aram-Alexandre Pooladian, Vincent Divol, Jonathan Niles-Weed

We consider the problem of estimating the optimal transport map between two probability distributions, $P$ and $Q$ in $\mathbb R^d$, on the basis of i.i.d. samples.

Optimal transport map estimation in general function spaces

no code implementations • 7 Dec 2022 • Vincent Divol, Jonathan Niles-Weed, Aram-Alexandre Pooladian

To ensure identifiability, we assume that $T = \nabla \varphi_0$ is the gradient of a convex function, in which case $T$ is known as an \emph{optimal transport map}.

Entropic estimation of optimal transport maps

no code implementations • 24 Sep 2021 • Aram-Alexandre Pooladian, Jonathan Niles-Weed

We develop a computationally tractable method for estimating the optimal map between two distributions over $\mathbb{R}^d$ with rigorous finite-sample guarantees.
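The flavor of such an estimator is easy to sketch: run Sinkhorn between the two samples and take a barycentric projection of the entropic coupling. A plain-numpy toy (the paper works with the entropic potentials directly; `eps`, the sample sizes, and the translation target below are illustrative assumptions):

```python
import numpy as np

def entropic_map(x, y, eps=0.05, iters=500):
    """Estimate an OT map from samples: Sinkhorn coupling followed by a
    barycentric projection. A toy sketch, not the paper's exact estimator."""
    n, m = len(x), len(y)
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    C = ((x[:, None] - y[None, :]) ** 2).sum(-1)   # squared-Euclidean cost
    K = np.exp(-C / eps)                            # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(iters):                          # Sinkhorn iterations
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                 # entropic coupling
    return (P @ y) / P.sum(axis=1, keepdims=True)   # T_hat evaluated at x_i

rng = np.random.default_rng(2)
x = rng.uniform(size=(64, 1))
y = x + 0.5            # ground-truth map is the translation T(x) = x + 0.5
T_hat = entropic_map(x, y)
```

For this translation example, `T_hat` should land close to `x + 0.5` up to entropic blurring of scale roughly `sqrt(eps)`.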

Learning normalizing flows from Entropy-Kantorovich potentials

no code implementations • 10 Jun 2020 • Chris Finlay, Augusto Gerolin, Adam M. Oberman, Aram-Alexandre Pooladian

We approach the problem of learning continuous normalizing flows from a dual perspective motivated by entropy-regularized optimal transport, in which continuous normalizing flows are cast as gradients of scalar potential functions.

Farkas layers: don't shift the data, fix the geometry

1 code implementation • 4 Oct 2019 • Aram-Alexandre Pooladian, Chris Finlay, Adam M. Oberman

Successfully training deep neural networks often requires either batch normalization or appropriate weight initialization, both of which come with their own challenges.

A principled approach for generating adversarial images under non-smooth dissimilarity metrics

2 code implementations • 5 Aug 2019 • Aram-Alexandre Pooladian, Chris Finlay, Tim Hoheisel, Adam Oberman

This includes, but is not limited to, $\ell_1, \ell_2$, and $\ell_\infty$ perturbations; the $\ell_0$ counting "norm" (i.e., true sparseness); and the total variation seminorm, which is a (non-$\ell_p$) convolutional dissimilarity measuring local pixel changes.
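These dissimilarities are easy to compute for a concrete perturbation. A toy numpy illustration on a 4x4 "image" (the TV formula below is the standard anisotropic discrete variant, an assumption about which variant is meant):

```python
import numpy as np

# A sparse perturbation touching two pixels.
delta = np.zeros((4, 4))
delta[1, 1] = 0.5
delta[2, 1] = -0.5

l1 = np.abs(delta).sum()        # total absolute change
l2 = np.sqrt((delta ** 2).sum())
linf = np.abs(delta).max()      # largest per-pixel change
l0 = int((delta != 0).sum())    # counting "norm": number of changed pixels

# Anisotropic total variation: sum of absolute differences between
# vertically and horizontally adjacent pixels.
tv = (np.abs(np.diff(delta, axis=0)).sum()
      + np.abs(np.diff(delta, axis=1)).sum())
```

Note how the same perturbation ranks very differently under the different measures: it is tiny in $\ell_0$ (two pixels) but comparatively large in TV, since it creates sharp local edges.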

Adversarial Attack

The LogBarrier adversarial attack: making effective use of decision boundary information

1 code implementation • ICCV 2019 • Chris Finlay, Aram-Alexandre Pooladian, Adam M. Oberman

Adversarial attacks formally correspond to an optimization problem: find a minimum norm image perturbation, constrained to cause misclassification.
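For a linear score function this constrained problem has a closed form, which makes a handy sanity check when implementing any min-norm attack (a toy aside with made-up numbers, not the LogBarrier method itself):

```python
import numpy as np

w = np.array([3.0, 4.0])   # linear classifier: predicts +1 iff w @ x > 0
x = np.array([1.0, 0.0])   # correctly classified input (w @ x = 3 > 0)

# The minimum l2-norm perturbation reaching the decision boundary
# w @ z = 0 is the orthogonal projection step: delta = -(w @ x) / ||w||^2 * w.
delta = -(w @ x) / (w @ w) * w
perturbed_score = w @ (x + delta)     # lands exactly on the boundary
attack_norm = np.linalg.norm(delta)   # equals |w @ x| / ||w|| = 3 / 5
```

Any attack solving the same min-norm problem should recover a perturbation of norm no smaller than this analytic value on a linear model.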

Adversarial Attack • Image Classification
