Search Results for author: Andreas Lehrmann

Found 10 papers, 5 papers with code

DynaConF: Dynamic Forecasting of Non-Stationary Time Series

1 code implementation • 17 Sep 2022 • SiQi Liu, Andreas Lehrmann

Deep learning has shown impressive results in a variety of time series forecasting tasks, where modeling the conditional distribution of the future given the past is the core problem.

Time Series • Time Series Forecasting
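For context on the abstract above: the forecasting target it refers to is the conditional distribution of a future window given the observed past, which in its generic autoregressive form (a standard formulation, not DynaConF's specific non-stationary decomposition) reads

p(x_{t+1:t+H} \mid x_{1:t}) = \prod_{h=1}^{H} p(x_{t+h} \mid x_{1:t+h-1}).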

Efficient CDF Approximations for Normalizing Flows

1 code implementation • 23 Feb 2022 • Chandramouli Shama Sastry, Andreas Lehrmann, Marcus Brubaker, Alexander Radovic

Instead, we build upon the diffeomorphic properties of normalizing flows and leverage the divergence theorem to estimate the CDF over a closed region in target space in terms of the flux across its \emph{boundary}, as induced by the normalizing flow.
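The identity the abstract leans on is the divergence theorem: if a vector field F is constructed so that its divergence equals the flow's density p (how such an F is obtained from the normalizing flow is the paper's contribution), then the probability mass inside a closed region B equals the flux of F through its boundary:

P(X \in B) = \int_{B} p(x)\,dx = \int_{B} \nabla \cdot F(x)\,dx = \oint_{\partial B} F(x) \cdot n(x)\,dS,

where n is the outward unit normal on \partial B.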

Agent Forecasting at Flexible Horizons using ODE Flows

no code implementations • ICML Workshop INNF 2021 • Alexander Radovic, JiaWei He, Janahan Ramanan, Marcus A Brubaker, Andreas Lehrmann

In this work we describe OMEN, a neural ODE based normalizing flow for the prediction of marginal distributions at flexible evaluation horizons, and apply it to agent position forecasting.

Position
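As a rough illustration of predicting at flexible evaluation horizons with an ODE-based flow, the sketch below integrates a learned ODE from base samples up to an arbitrary time t, so the same model can be queried at any horizon. It is a minimal PyTorch sketch with a fixed-step Euler solver and hypothetical names, not OMEN itself (which also provides marginal densities, omitted here).

# Minimal sketch (not OMEN): push base samples through a learned ODE up to an
# arbitrary horizon t. All class and function names here are hypothetical.
import torch
import torch.nn as nn

class ODEDynamics(nn.Module):
    """dz/dt = f(z, t): a small MLP conditioned on time."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim))

    def forward(self, z, t):
        t_col = torch.full_like(z[:, :1], float(t))   # broadcast scalar time
        return self.net(torch.cat([z, t_col], dim=-1))

def sample_at_horizon(dynamics, base_samples, horizon, steps=100):
    """Fixed-step Euler integration of the ODE from t=0 to t=horizon."""
    z, t = base_samples, 0.0
    dt = horizon / steps
    for _ in range(steps):
        z = z + dt * dynamics(z, t)
        t += dt
    return z

dyn = ODEDynamics(dim=2)
z0 = torch.randn(256, 2)                              # base-distribution samples
x_early = sample_at_horizon(dyn, z0, horizon=0.5)     # query an early horizon
x_late = sample_at_horizon(dyn, z0, horizon=3.0)      # query a later horizon, same model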

Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

1 code implementation • NeurIPS 2020 • Ruizhi Deng, Bo Chang, Marcus A. Brubaker, Greg Mori, Andreas Lehrmann

Normalizing flows transform a simple base distribution into a complex target distribution and have proved to be powerful models for data generation and density estimation.

Density Estimation • Irregular Time Series • +2
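The first sentence of the abstract refers to the standard change-of-variables identity behind normalizing flows: a diffeomorphism f maps base samples z \sim p_Z to x = f(z), and the model density is

p_X(x) = p_Z\big(f^{-1}(x)\big) \left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|.

Extending this construction continuously in time to model stochastic processes is the subject of the paper itself.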

Neural Volumes: Learning Dynamic Renderable Volumes from Images

1 code implementation • 18 Jun 2019 • Stephen Lombardi, Tomas Simon, Jason Saragih, Gabriel Schwartz, Andreas Lehrmann, Yaser Sheikh

Modeling and rendering of dynamic scenes is challenging, as natural scenes often contain complex phenomena such as thin structures, evolving topology, translucency, scattering, occlusion, and biological motion.

Variational Autoencoders with Jointly Optimized Latent Dependency Structure

no code implementations • ICLR 2019 • Jiawei He, Yu Gong, Joseph Marino, Greg Mori, Andreas Lehrmann

In particular, we express the latent variable space of a variational autoencoder (VAE) in terms of a Bayesian network with a learned, flexible dependency structure.
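A hypothetical sketch of the idea described above: latent variables arranged as a Bayesian network whose edges are scaled by learned gates, so the dependency structure can be optimized jointly with the rest of the VAE. This is an illustrative toy prior under assumed names, not the authors' implementation.

# Hypothetical sketch: latents z_1..z_N form a Bayesian network with learned edge gates.
import torch
import torch.nn as nn

class GatedLatentPrior(nn.Module):
    def __init__(self, num_latents, latent_dim):
        super().__init__()
        # gate_logits[i, j] controls the strength of the edge z_i -> z_j.
        self.gate_logits = nn.Parameter(torch.zeros(num_latents, num_latents))
        self.cond = nn.ModuleList(
            nn.Linear(num_latents * latent_dim, 2 * latent_dim)
            for _ in range(num_latents))
        self.num_latents, self.latent_dim = num_latents, latent_dim

    def forward(self, batch_size):
        gates = torch.sigmoid(self.gate_logits)
        zs = []
        for j in range(self.num_latents):
            # Only z_i with i < j may act as a parent of z_j (keeps the graph acyclic);
            # each candidate edge is weighted by its learned gate in [0, 1].
            parents = [gates[i, j] * zs[i] if i < j
                       else torch.zeros(batch_size, self.latent_dim)
                       for i in range(self.num_latents)]
            mu, logvar = self.cond[j](torch.cat(parents, dim=-1)).chunk(2, dim=-1)
            zs.append(mu + torch.randn_like(mu) * torch.exp(0.5 * logvar))
        return torch.stack(zs, dim=1)                  # (batch, num_latents, latent_dim)

prior = GatedLatentPrior(num_latents=4, latent_dim=8)
z = prior(batch_size=16)                               # ancestral sample through the gated graph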

Non-parametric Structured Output Networks

no code implementations • NeurIPS 2017 • Andreas Lehrmann, Leonid Sigal

End-to-end training methods for models with structured graphical dependencies on top of neural predictions have recently emerged as a principled way of combining these two paradigms (neural prediction and structured graphical modeling).

Visual Reference Resolution using Attention Memory for Visual Dialog

no code implementations • NeurIPS 2017 • Paul Hongsuck Seo, Andreas Lehrmann, Bohyung Han, Leonid Sigal

From this memory, the model retrieves the previous attention that is most relevant to the current question, taking recency into account, in order to resolve potentially ambiguous references.

Ranked #13 on Visual Dialog on VisDial v0.9 val (R@1 metric)

Parameter Prediction • Question Answering • +3
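As a rough illustration of the retrieval step described in the abstract, the sketch below addresses a memory of past attention maps with the current question embedding and adds a recency bias before reading out a weighted combination. It is a minimal PyTorch sketch with hypothetical names and shapes, not the paper's model.

# Illustrative sketch (not the paper's model): recency-aware retrieval from an attention memory.
import torch
import torch.nn.functional as F

def retrieve_previous_attention(question_emb, memory_keys, memory_attn, recency_scale=0.1):
    """
    question_emb: (d,)     embedding of the current question
    memory_keys:  (T, d)   embeddings of past questions, oldest first
    memory_attn:  (T, H, W) attention maps stored for those questions
    """
    T = memory_keys.shape[0]
    relevance = memory_keys @ question_emb                            # (T,) content match
    recency = recency_scale * torch.arange(T, dtype=torch.float32)    # newer entries get a bonus
    weights = F.softmax(relevance + recency, dim=0)                   # combine relevance and recency
    retrieved = (weights[:, None, None] * memory_attn).sum(dim=0)     # (H, W) retrieved attention
    return retrieved, weights

q = torch.randn(64)
keys = torch.randn(5, 64)          # five previous dialog turns
attn = torch.rand(5, 7, 7)         # their stored attention maps
prev_attn, w = retrieve_previous_attention(q, keys, attn)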
