Search Results for author: Meyer Scetbon

Found 17 papers, 5 papers with code

FiP: a Fixed-Point Approach for Causal Generative Modeling

no code implementations · 10 Apr 2024 · Meyer Scetbon, Joel Jennings, Agrin Hilmkil, Cheng Zhang, Chao Ma

Based on this, we design a two-stage causal generative model that first infers the causal order from observations in a zero-shot manner, thereby bypassing the search, and then learns the generative fixed-point SCM on the ordered variables.

The Essential Role of Causality in Foundation World Models for Embodied AI

no code implementations · 6 Feb 2024 · Tarun Gupta, Wenbo Gong, Chao Ma, Nick Pawlowski, Agrin Hilmkil, Meyer Scetbon, Ade Famoti, Ashley Juan Llorens, Jianfeng Gao, Stefan Bauer, Danica Kragic, Bernhard Schölkopf, Cheng Zhang

This paper focuses on the prospects of building foundation world models for the upcoming generation of embodied agents and presents a novel viewpoint on the significance of causality within these models.

Misconceptions

Robust Linear Regression: Phase-Transitions and Precise Tradeoffs for General Norms

no code implementations · 1 Aug 2023 · Elvis Dohmatob, Meyer Scetbon

In this paper, we investigate the impact of test-time adversarial attacks on linear regression models and determine the optimal level of robustness that any model can reach while maintaining a given level of standard predictive performance (accuracy).

Adversarial Robustness · regression

Polynomial-Time Solvers for the Discrete $\infty$-Optimal Transport Problems

no code implementations · 26 Apr 2023 · Meyer Scetbon

In this note, we propose polynomial-time algorithms solving the Monge and Kantorovich formulations of the $\infty$-optimal transport problem in the discrete and finite setting.
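The note's own algorithms are not reproduced here; as a minimal sketch of why the problem is tractable, the discrete Monge formulation with uniform weights reduces to the classical bottleneck assignment problem (minimize the largest cost moved by a permutation), which can be solved in polynomial time by binary-searching over the distinct cost values and testing feasibility with augmenting-path bipartite matching. All function names below are illustrative.

```python
def _has_perfect_matching(cost, thresh, n):
    """Can every row be matched to a distinct column using only edges <= thresh?"""
    match = [-1] * n  # match[j] = row currently assigned to column j

    def try_assign(i, seen):
        for j in range(n):
            if cost[i][j] <= thresh and not seen[j]:
                seen[j] = True
                # Column j is free, or its current row can be re-routed elsewhere.
                if match[j] == -1 or try_assign(match[j], seen):
                    match[j] = i
                    return True
        return False

    return all(try_assign(i, [False] * n) for i in range(n))


def bottleneck_ot(cost):
    """Smallest t such that a perfect matching using only edges <= t exists."""
    n = len(cost)
    values = sorted({c for row in cost for c in row})
    lo, hi = 0, len(values) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if _has_perfect_matching(cost, values[mid], n):
            hi = mid
        else:
            lo = mid + 1
    return values[lo]
```

For the cost matrix `[[0, 10], [10, 1]]` the identity matching attains the bottleneck value 1. The search performs O(log n) feasibility checks, each polynomial in n, so the whole procedure is polynomial-time.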

Robust Linear Regression: Gradient-descent, Early-stopping, and Beyond

no code implementations · 31 Jan 2023 · Meyer Scetbon, Elvis Dohmatob

However, we show that this strategy can be arbitrarily sub-optimal in the case of general Mahalanobis attacks.

regression

Low-rank Optimal Transport: Approximation, Statistics and Debiasing

no code implementations · 24 May 2022 · Meyer Scetbon, Marco Cuturi

The matching principles behind optimal transport (OT) play an increasingly important role in machine learning, a trend which can be observed when OT is used to disambiguate datasets in applications (e.g., single-cell genomics) or used to improve more complex methods (e.g., balanced attention in transformers or self-supervised learning).

Self-Supervised Learning

Triangular Flows for Generative Modeling: Statistical Consistency, Smoothness Classes, and Fast Rates

no code implementations · 31 Dec 2021 · Nicholas J. Irons, Meyer Scetbon, Soumik Pal, Zaid Harchaoui

Triangular flows, also known as Knöthe-Rosenblatt measure couplings, comprise an important building block of normalizing flow models for generative modeling and density estimation, including popular autoregressive flow models such as real-valued non-volume preserving transformation models (Real NVP).

Density Estimation

An Asymptotic Test for Conditional Independence using Analytic Kernel Embeddings

1 code implementation · 28 Oct 2021 · Meyer Scetbon, Laurent Meunier, Yaniv Romano

We propose a new conditional dependence measure and a statistical test for conditional independence.

Linear-Time Gromov Wasserstein Distances using Low Rank Couplings and Costs

1 code implementation · NeurIPS 2021 · Meyer Scetbon, Gabriel Peyré, Marco Cuturi

The ability to align points across two related yet incomparable point clouds (e.g., living in different spaces) plays an important role in machine learning.

Low-Rank Sinkhorn Factorization

1 code implementation · 8 Mar 2021 · Meyer Scetbon, Marco Cuturi, Gabriel Peyré

Because matrix-vector products are pervasive in the Sinkhorn algorithm, several works have proposed to approximate kernel matrices appearing in its iterations using low-rank factors.
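The paper's own factorization scheme (which constrains the coupling itself to be low-rank) is not reproduced here; as a minimal illustration of why low-rank structure helps, the sketch below runs standard Sinkhorn scaling iterations with a kernel supplied as factors `U @ V.T`, so each matrix-vector product costs O(nr) rather than O(n²). The function name is illustrative.

```python
import numpy as np

def sinkhorn_lowrank(U, V, a, b, n_iter=200):
    """Sinkhorn scalings u, v for the approximate kernel K = U @ V.T.

    U, V have shape (n, r); a, b are the target marginals. Each kernel
    matvec K @ v is computed as U @ (V.T @ v): two O(n r) products.
    """
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iter):
        u = a / (U @ (V.T @ v))  # enforce row marginals
        v = b / (V @ (U.T @ u))  # enforce column marginals
    return u, v
```

The resulting transport plan is `diag(u) @ (U @ V.T) @ diag(v)`; with factors of rank r, the whole run costs O(n r) per iteration instead of the usual quadratic price.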

Equitable and Optimal Transport with Multiple Agents

no code implementations · 12 Jun 2020 · Meyer Scetbon, Laurent Meunier, Jamal Atif, Marco Cuturi

When there is only one agent, we recover the Optimal Transport problem.

Linear Time Sinkhorn Divergences using Positive Features

1 code implementation · NeurIPS 2020 · Meyer Scetbon, Marco Cuturi

Although Sinkhorn divergences are now routinely used in data sciences to compare probability distributions, the computational effort required to compute them remains expensive, growing in general quadratically in the size $n$ of the support of these distributions.
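The paper's exact construction is not reproduced here; the sketch below shows one standard positive random-feature map for the Gaussian Gibbs kernel, the kind of ingredient that enables linear-time Sinkhorn iterations: because every feature is strictly positive, the approximate kernel `Phi_x @ Phi_y.T` has positive entries and the scaling iterations stay well defined. The function name and the Gaussian choice are assumptions for illustration.

```python
import numpy as np

def positive_features(X, R, sigma, rng):
    """Map each row x of X to R strictly positive random features phi(x)
    such that phi(x) @ phi(y) is an unbiased estimate of the Gibbs kernel
    exp(-|x - y|^2 / (2 sigma^2)).

    Replacing the n x n kernel matrix by Phi_x @ Phi_y.T turns each
    Sinkhorn matrix-vector product into an O(n R) operation.
    """
    omega = rng.standard_normal((X.shape[1], R))       # omega_j ~ N(0, I)
    sq = np.sum(X ** 2, axis=1, keepdims=True) / sigma ** 2
    return np.exp(X @ omega / sigma - sq) / np.sqrt(R)
```

A quick check of the identity: E[exp(omega @ (x + y) / sigma)] = exp(|x + y|^2 / (2 sigma^2)), and subtracting |x|^2/sigma^2 + |y|^2/sigma^2 in the exponent leaves exactly -|x - y|^2 / (2 sigma^2), so the estimate is unbiased while remaining entrywise positive.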

Harmonic Decompositions of Convolutional Networks

no code implementations · ICML 2020 · Meyer Scetbon, Zaid Harchaoui

We present a description of the function space and the smoothness class associated with a convolutional network using the machinery of reproducing kernel Hilbert spaces.

A Spectral Analysis of Dot-product Kernels

no code implementations · 28 Feb 2020 · Meyer Scetbon, Zaid Harchaoui

We present eigenvalue decay estimates of integral operators associated with compositional dot-product kernels.

Comparing distributions: $\ell_1$ geometry improves kernel two-sample testing

1 code implementation · NeurIPS 2019 · Meyer Scetbon, Gael Varoquaux

Here, we show that $L^p$ distances (with $p\geq 1$) between these distribution representatives give metrics on the space of distributions that are well-behaved to detect differences between distributions as they metrize the weak convergence.

Two-sample testing · Vocal Bursts Valence Prediction
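The paper's precise statistic and its null calibration are not reproduced here; as a rough sketch of the underlying idea, each sample can be summarized by mean kernel evaluations at a handful of test locations, and the two finite-dimensional representatives compared in $\ell_1$ distance. The function name, the Gaussian kernel, and the choice of locations are all assumptions for illustration.

```python
import numpy as np

def l1_me_statistic(X, Y, T, sigma):
    """L1 distance between smoothed representatives of two samples.

    Each sample (rows of X, Y) is summarized by the mean Gaussian-kernel
    evaluation at the test locations T (one row per location); the
    statistic is the sum of absolute differences between the two vectors.
    """
    def embed(S):
        d2 = ((S[:, None, :] - T[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2 * sigma ** 2)).mean(axis=0)

    return np.abs(embed(X) - embed(Y)).sum()
```

The statistic stays near zero when both samples come from the same distribution and grows when the distributions differ at any of the test locations; turning it into a calibrated test requires a null distribution, which this sketch omits.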

Deep K-SVD Denoising

no code implementations · 28 Sep 2019 · Meyer Scetbon, Michael Elad, Peyman Milanfar

The question we address in this paper is whether K-SVD was brought to its peak in its original conception, or whether it can be made competitive again.

Denoising
