Bézier Curve Gaussian Processes

3 May 2022  ·  Ronny Hug, Stefan Becker, Wolfgang Hübner, Michael Arens, Jürgen Beyerer ·

Probabilistic models for sequential data are the basis for a variety of applications concerned with processing temporally ordered information. The predominant approach in this domain is given by recurrent neural networks, implementing either an approximate Bayesian approach (e.g. Variational Autoencoders or Generative Adversarial Networks) or a regression-based approach, i.e. variations of Mixture Density Networks (MDNs). In this paper, we focus on the $\mathcal{N}$-MDN variant, which parameterizes (mixtures of) probabilistic Bézier curves ($\mathcal{N}$-Curves) for modeling stochastic processes. While favorable in terms of computational cost and stability, MDNs generally fall behind approximate Bayesian approaches in terms of expressiveness. We present an approach for closing this gap by enabling full Bayesian inference on top of $\mathcal{N}$-MDNs. To this end, we show that $\mathcal{N}$-Curves are a special case of Gaussian processes (denoted as $\mathcal{N}$-GP) and then derive corresponding mean and kernel functions for different modalities. Following this, we propose the use of the $\mathcal{N}$-MDN as a data-dependent generator for $\mathcal{N}$-GP prior distributions. We show the advantages granted by this combined model in an application context, using human trajectory prediction as an example.
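The core observation, that a Bézier curve with Gaussian control points is itself a Gaussian process, can be sketched in a few lines. The example below is an illustrative reconstruction, not the paper's implementation: it assumes one-dimensional, mutually independent Gaussian control points, so that a curve point $X(t) = \sum_i b_{i,n}(t) P_i$ is a linear combination of Gaussians, yielding a GP whose mean and kernel fall out directly from the Bernstein weights. The function names are hypothetical; the paper derives the exact mean and kernel functions for the $\mathcal{N}$-GP, including further modalities.

```python
import numpy as np
from math import comb


def bernstein(i, n, t):
    """Bernstein basis polynomial b_{i,n}(t) for t in [0, 1]."""
    return comb(n, i) * t**i * (1.0 - t) ** (n - i)


def n_curve_mean_kernel(t, s, mus, variances):
    """Mean m(t) and kernel k(t, s) of a 1-D probabilistic Bézier curve.

    mus:       (n+1,) means of the Gaussian control points
    variances: (n+1,) variances of the control points (assumed independent)

    X(t) = sum_i b_{i,n}(t) * P_i is a linear combination of independent
    Gaussians, hence Gaussian for every t -- i.e. X is a Gaussian process.
    """
    n = len(mus) - 1
    w_t = np.array([bernstein(i, n, t) for i in range(n + 1)])
    w_s = np.array([bernstein(i, n, s) for i in range(n + 1)])
    mean_t = w_t @ mus                        # GP mean function m(t)
    kern_ts = np.sum(w_t * w_s * variances)   # GP kernel k(t, s) under independence
    return mean_t, kern_ts


# Usage: a cubic curve with four Gaussian control points along a line.
mus = np.array([0.0, 1.0, 2.0, 3.0])
variances = np.array([0.1, 0.2, 0.2, 0.1])
m, k = n_curve_mean_kernel(0.5, 0.5, mus, variances)  # midpoint mean and variance
```

With control-point means spaced evenly, the curve mean at $t = 0.5$ lands at the midpoint (here $1.5$), while the variance is a Bernstein-weighted blend of the control-point variances. Correlated control points would add cross-covariance terms $b_{i,n}(t)\,b_{j,n}(s)\,\mathrm{Cov}(P_i, P_j)$ to the kernel.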
