Search Results for author: Jeremiah Birrell

Found 7 papers, 2 papers with code

Adversarially Robust Deep Learning with Optimal-Transport-Regularized Divergences

no code implementations • 7 Sep 2023 • Jeremiah Birrell, MohammadReza Ebrahimi

We introduce the $ARMOR_D$ methods as novel approaches to enhancing the adversarial robustness of deep learning models.

Adversarial Robustness • Malware Detection

Function-space regularized Rényi divergences

1 code implementation • 10 Oct 2022 • Jeremiah Birrell, Yannis Pantazis, Paul Dupuis, Markos A. Katsoulakis, Luc Rey-Bellet

We propose a new family of regularized Rényi divergences parametrized not only by the order $\alpha$ but also by a variational function space.

Structure-preserving GANs

no code implementations • 2 Feb 2022 • Jeremiah Birrell, Markos A. Katsoulakis, Luc Rey-Bellet, Wei Zhu

Generative adversarial networks (GANs), a class of distribution-learning methods based on a two-player game between a generator and a discriminator, can generally be formulated as a minmax problem based on the variational representation of a divergence between the unknown and the generated distributions.
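Schematically, the minmax structure described above can be written as follows; the display is the standard IPM (Wasserstein-GAN) special case, shown here for orientation rather than quoted from the paper:

```latex
\min_{\theta} \; \sup_{g \in \Gamma}
  \Big\{ \mathbb{E}_{X \sim P}\big[ g(X) \big]
       - \mathbb{E}_{Z \sim \mathcal{N}}\big[ g(G_\theta(Z)) \big] \Big\}
```

Here $G_\theta$ is the generator, the discriminator $g$ ranges over a function class $\Gamma$, and the choice of $\Gamma$ is one natural place where structural constraints on the problem can enter.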

A Variance Reduction Method for Neural-based Divergence Estimation

no code implementations • 29 Sep 2021 • Jeremiah Birrell, Markos A. Katsoulakis, Yannis Pantazis, Dipjyoti Paul, Anastasios Tsourtis

Unfortunately, approximating the expectations inherent in variational formulas by statistical averages can be problematic due to high statistical variance, e.g., exponential for the Kullback-Leibler divergence and certain estimators.
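The variance issue can be seen in a minimal numpy sketch of the Donsker-Varadhan variational formula for the KL divergence; the formula is standard, but the Gaussian setup below is my own toy illustration, not taken from the paper:

```python
import numpy as np

# Donsker-Varadhan: KL(P || Q) = sup_g { E_P[g(X)] - log E_Q[exp(g(Y))] }.
# For P = N(mu, 1) and Q = N(0, 1) the supremum is attained at the
# log-likelihood ratio g*(x) = mu*x - mu**2/2, so plugging g* into
# sample averages yields a statistical estimator of KL(P || Q).
rng = np.random.default_rng(0)
mu, n = 1.0, 200_000
x = rng.normal(mu, 1.0, n)   # samples from P
y = rng.normal(0.0, 1.0, n)  # samples from Q

g = lambda t: mu * t - mu**2 / 2
dv_estimate = g(x).mean() - np.log(np.mean(np.exp(g(y))))

kl_exact = mu**2 / 2  # closed form for these two Gaussians
```

The log-mean-exp term is the variance culprit: `np.exp(g(y))` is dominated by rare large values of `g(y)`, and its variance grows rapidly with the size of the divergence being estimated, which is the phenomenon the variance reduction method targets.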

Representation Learning

$(f,Γ)$-Divergences: Interpolating between $f$-Divergences and Integral Probability Metrics

no code implementations • 11 Nov 2020 • Jeremiah Birrell, Paul Dupuis, Markos A. Katsoulakis, Yannis Pantazis, Luc Rey-Bellet

We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both $f$-divergences and integral probability metrics (IPMs), such as the $1$-Wasserstein distance.
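An IPM restricts the variational supremum to a function class $\Gamma$: $D_\Gamma(P, Q) = \sup_{g \in \Gamma} \{ E_P[g] - E_Q[g] \}$. The numpy sketch below is my own toy illustration of the $1$-Wasserstein case mentioned above, not the paper's construction:

```python
import numpy as np

# In 1-d, for two distributions differing only by a location shift, the
# 1-Wasserstein distance is the IPM over 1-Lipschitz functions, attained
# by g(x) = x, so the IPM equals the absolute mean shift.
rng = np.random.default_rng(1)
mu, n = 0.7, 100_000
xp = rng.normal(mu, 1.0, n)   # samples from P = N(mu, 1)
xq = rng.normal(0.0, 1.0, n)  # samples from Q = N(0, 1)

# Restrict to linear 1-Lipschitz candidates g(x) = a*x with |a| <= 1.
slopes = np.linspace(-1.0, 1.0, 201)
ipm = max(a * (xp.mean() - xq.mean()) for a in slopes)
# ipm approximates W1(P, Q) = |mu| for this pure location shift
```

Enlarging or shrinking the class $\Gamma$ is exactly the dial that moves between IPM-like and $f$-divergence-like behavior in the interpolation the paper develops.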

Image Generation • Uncertainty Quantification

Variational Representations and Neural Network Estimation of Rényi Divergences

1 code implementation • 7 Jul 2020 • Jeremiah Birrell, Paul Dupuis, Markos A. Katsoulakis, Luc Rey-Bellet, Jie Wang

We further show that this Rényi variational formula holds over a range of function spaces; this leads to a formula for the optimizer under very weak assumptions and is also key in our development of a consistency theory for Rényi divergence estimators.

Optimizing Variational Representations of Divergences and Accelerating their Statistical Estimation

no code implementations • 15 Jun 2020 • Jeremiah Birrell, Markos A. Katsoulakis, Yannis Pantazis

Recently, variational representations of divergences have gained popularity in machine learning as a tractable and scalable approach for training probabilistic models and for statistically differentiating between data distributions.
