Search Results for author: Jens Behrmann

Found 13 papers, 5 papers with code

Simulation-based Inference for Cardiovascular Models

no code implementations · 26 Jul 2023 · Antoine Wehenkel, Jens Behrmann, Andrew C. Miller, Guillermo Sapiro, Ozan Sener, Marco Cuturi, Jörn-Henrik Jacobsen

Over the past decades, hemodynamics simulators have steadily evolved and have become tools of choice for studying cardiovascular systems in silico.

Robust Hybrid Learning With Expert Augmentation

1 code implementation · 8 Feb 2022 · Antoine Wehenkel, Jens Behrmann, Hsiang Hsu, Guillermo Sapiro, Gilles Louppe, Jörn-Henrik Jacobsen

Hybrid modelling reduces the misspecification of expert models by combining them with machine learning (ML) components learned from data.

Data Augmentation

Understanding and Mitigating Exploding Inverses in Invertible Neural Networks

1 code implementation · 16 Jun 2020 · Jens Behrmann, Paul Vicol, Kuan-Chieh Wang, Roger Grosse, Jörn-Henrik Jacobsen

For problems where global invertibility is necessary, such as applying normalizing flows on OOD data, we show the importance of designing stable INN building blocks.
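The stability issue described above can be illustrated with a minimal sketch. The affine coupling layer below is hypothetical (not from the paper's code); it shows how an invertible-by-construction block can still be checked for numerical invertibility via its reconstruction error, and why bounding the scale function keeps the inverse stable:

```python
import numpy as np

# Hypothetical affine coupling layer: split x into halves, then scale and
# shift the second half conditioned on the first. Analytically invertible,
# but the inverse amplifies floating-point error when the scale is large.
def coupling_forward(x, scale_fn, shift_fn):
    x1, x2 = np.split(x, 2)
    y2 = x2 * np.exp(scale_fn(x1)) + shift_fn(x1)
    return np.concatenate([x1, y2])

def coupling_inverse(y, scale_fn, shift_fn):
    y1, y2 = np.split(y, 2)
    x2 = (y2 - shift_fn(y1)) * np.exp(-scale_fn(y1))
    return np.concatenate([y1, x2])

# Bounded scale (tanh) -> stable inverse; an unbounded scale would not be.
scale_fn = lambda h: np.tanh(h)
shift_fn = lambda h: 0.5 * h

x = np.random.default_rng(0).normal(size=8)
y = coupling_forward(x, scale_fn, shift_fn)
x_rec = coupling_inverse(y, scale_fn, shift_fn)

# Reconstruction error as a proxy for numerical invertibility.
err = np.max(np.abs(x - x_rec))
print(err)  # tiny when the scale is bounded
```

Running the same check on out-of-distribution inputs with an unconstrained scale network is one way the "exploding inverse" phenomenon shows up in practice.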

On the Invertibility of Invertible Neural Networks

no code implementations · 25 Sep 2019 · Jens Behrmann, Paul Vicol, Kuan-Chieh Wang, Roger B. Grosse, Jörn-Henrik Jacobsen

Guarantees in deep learning are hard to achieve due to the interplay of flexible modeling schemes and complex tasks.

Residual Flows for Invertible Generative Modeling

4 code implementations · NeurIPS 2019 · Ricky T. Q. Chen, Jens Behrmann, David Duvenaud, Jörn-Henrik Jacobsen

Flow-based generative models parameterize probability distributions through an invertible transformation and can be trained by maximum likelihood.

Density Estimation · Image Generation
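The maximum-likelihood training mentioned in the snippet rests on the change-of-variables formula: for an invertible map f with inverse g, log p_x(x) = log p_z(g(x)) + log |det dg/dx|. A minimal 1-D sketch, using a toy affine flow chosen only for illustration (the parameters a, b are assumptions, not from the paper):

```python
import numpy as np

# Toy 1-D flow f(z) = a*z + b with a standard normal base distribution.
a, b = 2.0, 0.5

def log_prob_x(x):
    z = (x - b) / a                              # inverse transform g(x)
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))   # log N(z; 0, 1)
    log_det = -np.log(abs(a))                    # log |dg/dx|
    return log_pz + log_det                      # change of variables

# Maximum likelihood would maximize log_prob_x over training data; here we
# only sanity-check that the implied density integrates to ~1 on a grid.
xs = np.linspace(-10.0, 10.0, 20001)
dx = xs[1] - xs[0]
integral = np.sum(np.exp(log_prob_x(xs))) * dx
print(integral)  # ≈ 1.0
```

Residual Flows replace this closed-form Jacobian term with an unbiased estimator of log |det| for residual blocks, which is what makes maximum likelihood tractable for that architecture.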

Invariance and Inverse Stability under ReLU

no code implementations · ICLR 2019 · Jens Behrmann, Sören Dittmer, Pascal Fernsel, Peter Maass

We flip the usual approach to study invariance and robustness of neural networks by considering the non-uniqueness and instability of the inverse mapping.

Invertible Residual Networks

5 code implementations · 2 Nov 2018 · Jens Behrmann, Will Grathwohl, Ricky T. Q. Chen, David Duvenaud, Jörn-Henrik Jacobsen

We show that standard ResNet architectures can be made invertible, allowing the same model to be used for classification, density estimation, and generation.

Density Estimation · General Classification · +1
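The core idea behind making a residual block y = x + g(x) invertible is constraining g to be contractive (Lipschitz constant below 1), after which the inverse can be computed by fixed-point iteration. A minimal sketch with a hypothetical contractive g (the paper enforces the constraint on real networks via spectral normalization; the toy map below is an assumption for illustration):

```python
import numpy as np

# Hypothetical contractive residual function, Lipschitz constant 0.5.
def g(x):
    return 0.5 * np.tanh(x)

def residual_forward(x):
    return x + g(x)

def residual_inverse(y, n_iters=50):
    # Banach fixed-point iteration x <- y - g(x); since Lip(g) < 1 the
    # error shrinks geometrically with each iteration.
    x = y  # initial guess
    for _ in range(n_iters):
        x = y - g(x)
    return x

x = np.array([-1.0, 0.3, 2.0])
y = residual_forward(x)
x_rec = residual_inverse(y)
err = np.max(np.abs(x - x_rec))
print(err)  # converges geometrically; error is tiny after 50 iterations
```

Because the same block is invertible, one network can serve as a classifier (forward pass), a density estimator (change of variables), and a generator (inverse pass), which is the claim in the snippet above.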

Excessive Invariance Causes Adversarial Vulnerability

no code implementations · ICLR 2019 · Jörn-Henrik Jacobsen, Jens Behrmann, Richard Zemel, Matthias Bethge

Despite their impressive performance, deep neural networks exhibit striking failures on out-of-distribution inputs.

Analysis of Invariance and Robustness via Invertibility of ReLU-Networks

no code implementations · 25 Jun 2018 · Jens Behrmann, Sören Dittmer, Pascal Fernsel, Peter Maaß

Studying the invertibility of deep neural networks (DNNs) provides a principled approach to better understand the behavior of these powerful models.
