no code implementations • 5 Jun 2020 • Paul Sanzenbacher, Lars Mescheder, Andreas Geiger
In recent years, deep generative models have gained significance due to their ability to synthesize natural-looking images with applications ranging from virtual reality to data augmentation for training computer vision models.
3 code implementations • 27 Mar 2020 • Michael Oechsle, Michael Niemeyer, Lars Mescheder, Thilo Strauss, Andreas Geiger
In this work, we propose a novel implicit representation for capturing the visual appearance of an object in terms of its surface light field.
6 code implementations • ECCV 2020 • Songyou Peng, Michael Niemeyer, Lars Mescheder, Marc Pollefeys, Andreas Geiger
Recently, implicit neural representations have gained popularity for learning-based 3D reconstruction.
1 code implementation • CVPR 2020 • Michael Niemeyer, Lars Mescheder, Michael Oechsle, Andreas Geiger
In this work, we propose a differentiable rendering formulation for implicit shape and texture representations.
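The core geometric step behind rendering an implicit shape representation can be sketched as follows: march along a camera ray until the occupancy field crosses the 0.5 level set, then refine the crossing depth with the secant method. This is a hedged sketch in the spirit of the approach, not the paper's implementation; the occupancy function here is a hand-crafted stand-in (a smooth unit sphere), not a learned network.

```python
import math

def occupancy(p):
    # Hypothetical stand-in for a learned occupancy network: a smooth
    # occupancy of the unit sphere, > 0.5 inside and < 0.5 outside.
    r2 = sum(c * c for c in p)
    return 1.0 / (1.0 + math.exp(8.0 * (r2 - 1.0)))

def ray_surface_depth(origin, direction, d_max=5.0, n_steps=100, tau=0.5):
    """Depth along the ray where occupancy crosses tau, or None on a miss."""
    point = lambda d: [o + d * u for o, u in zip(origin, direction)]
    # Coarse search: find the first interval containing a sign change.
    prev_d, prev_v = 0.0, occupancy(point(0.0)) - tau
    for i in range(1, n_steps + 1):
        d = d_max * i / n_steps
        v = occupancy(point(d)) - tau
        if prev_v < 0.0 <= v or prev_v >= 0.0 > v:
            # Secant refinement of the crossing depth.
            for _ in range(10):
                d_new = d - v * (d - prev_d) / (v - prev_v)
                v_new = occupancy(point(d_new)) - tau
                if v_new * v < 0.0:
                    prev_d, prev_v = d, v
                d, v = d_new, v_new
            return d
        prev_d, prev_v = d, v
    return None  # ray never crosses the level set

d = ray_surface_depth([0.0, 0.0, -3.0], [0.0, 0.0, 1.0])
print(d)  # ≈ 2.0: the ray from z = -3 toward +z hits the unit sphere at z = -1
```

Because the surface point is defined implicitly as a level-set crossing, its depth can be differentiated with respect to network parameters via the implicit function theorem, which is what makes a formulation like this trainable end to end.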
1 code implementation • CVPR 2020 • Yiyi Liao, Katja Schwarz, Lars Mescheder, Andreas Geiger
We define the new task of 3D controllable image synthesis and propose an approach for solving it by reasoning both in 3D space and in the 2D image domain.
no code implementations • ICCV 2019 • Michael Oechsle, Lars Mescheder, Michael Niemeyer, Thilo Strauss, Andreas Geiger
A major reason for these limitations is that common representations of texture are inefficient or hard to interface for modern deep learning techniques.
7 code implementations • CVPR 2019 • Lars Mescheder, Michael Oechsle, Michael Niemeyer, Sebastian Nowozin, Andreas Geiger
With the advent of deep neural networks, learning-based approaches for 3D reconstruction have gained popularity.
9 code implementations • ICML 2018 • Lars Mescheder, Andreas Geiger, Sebastian Nowozin
In this paper, we show that the requirement of absolute continuity is necessary: we describe a simple yet prototypical counterexample showing that in the more realistic case of distributions that are not absolutely continuous, unregularized GAN training is not always convergent.
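The counterexample can be reproduced numerically in a few lines. In this toy setup (the "Dirac-GAN"), the data distribution is a Dirac at 0, the generator outputs a Dirac at theta, and the discriminator is linear, D(x) = psi * x, so neither distribution is absolutely continuous. The step size and iteration count below are arbitrary choices for illustration.

```python
import math

def f_prime(t):
    # Derivative of f(t) = -log(1 + exp(-t)), the standard GAN loss,
    # written to avoid overflow for large positive t.
    if t > 0:
        return math.exp(-t) / (1.0 + math.exp(-t))
    return 1.0 / (1.0 + math.exp(t))

def simultaneous_gd(theta, psi, h=0.1, steps=500):
    # Plain simultaneous gradient descent (generator) / ascent (discriminator)
    # on the unregularized objective f(theta * psi).
    for _ in range(steps):
        g = f_prime(theta * psi)
        theta, psi = theta - h * g * psi, psi + h * g * theta
    return theta, psi

theta0, psi0 = 1.0, 1.0
theta1, psi1 = simultaneous_gd(theta0, psi0)
print(math.hypot(theta0, psi0), math.hypot(theta1, psi1))
# The distance to the equilibrium (0, 0) grows: training circles the
# equilibrium instead of converging to it.
```

One can check by hand that each update multiplies the squared distance to the origin by (1 + h² f'(θψ)²) > 1, so the iterates spiral outward for any positive step size.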
no code implementations • 4 Aug 2017 • Hassan Abu Alhaija, Siva Karthik Mustikovela, Lars Mescheder, Andreas Geiger, Carsten Rother
Further, we demonstrate the utility of our approach for training standard deep models for semantic instance segmentation and object detection of cars in outdoor driving scenes.

4 code implementations • NeurIPS 2017 • Lars Mescheder, Sebastian Nowozin, Andreas Geiger
In this paper, we analyze the numerics of common algorithms for training Generative Adversarial Networks (GANs).
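The flavor of this analysis can be illustrated on a toy problem: near a stationary point, the behavior of simultaneous gradient methods is governed by the eigenvalues of the Jacobian of the gradient vector field. The bilinear game below, V(theta, psi) = theta * psi, is a hypothetical minimal example, not taken from the paper.

```python
import cmath

# For V(theta, psi) = theta * psi (generator minimizes, discriminator
# maximizes), the gradient vector field is v(theta, psi) = (-psi, theta),
# whose Jacobian at the equilibrium (0, 0) is the constant matrix:
J = [[0.0, -1.0],
     [1.0,  0.0]]

# Eigenvalues of a 2x2 matrix from the characteristic polynomial
# lambda^2 - trace * lambda + det = 0.
trace = J[0][0] + J[1][1]
det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
disc = cmath.sqrt(trace * trace - 4.0 * det)
eig1, eig2 = (trace + disc) / 2.0, (trace - disc) / 2.0
print(eig1, eig2)  # purely imaginary: 1j and -1j
```

Eigenvalues with zero real part mean plain gradient steps rotate around the equilibrium rather than converge to it; this kind of analysis motivates modified update rules that move the eigenvalues into the stable half-plane.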
1 code implementation • ICML 2017 • Lars Mescheder, Sebastian Nowozin, Andreas Geiger
We show that in the nonparametric limit our method yields an exact maximum-likelihood assignment for the parameters of the generative model, as well as the exact posterior distribution over the latent variables given an observation.
no code implementations • 21 Nov 2016 • Lars Mescheder, Sebastian Nowozin, Andreas Geiger
We present a new notion of probabilistic duality for random variables involving mixture distributions.