Approximate Inference for Spectral Mixture Kernel

12 Jun 2020 · Yohan Jung, Kyungwoo Song, Jinkyoo Park

A spectral mixture (SM) kernel is a flexible kernel capable of modeling any stationary covariance function. Although it is useful for modeling data, learning the SM kernel is generally difficult: optimizing its large number of parameters typically induces over-fitting, particularly under gradient-based optimization, and requires long training times. To improve training, we propose an approximate Bayesian inference method for the SM kernel. Specifically, we place a variational distribution over the spectral points and approximate the SM kernel with random Fourier features. We optimize the variational parameters by applying sampling-based variational inference to the evidence lower bound (ELBO) estimator derived from the approximate kernel. To improve the inference, we further propose two strategies: (1) a sampling strategy for the spectral points that yields reliable estimates of the ELBO and its gradient, and (2) an approximate natural gradient that accelerates the convergence of the parameters. Combined with these two strategies, the proposed inference converges faster and yields better parameter estimates.
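The random Fourier feature approximation referenced above can be illustrated with a short sketch. The snippet below is not the authors' implementation; it is a minimal NumPy example, assuming a 1-D input space and fixed (non-variational) mixture parameters, showing how spectral points sampled from the SM kernel's Gaussian-mixture spectral density give cosine features whose inner product approximates the exact kernel. All function and variable names (`sm_kernel`, `rff_features`, `num_features`) are illustrative.

```python
import numpy as np

def sm_kernel(tau, weights, means, stds):
    """Exact 1-D SM kernel: k(tau) = sum_q w_q exp(-2 pi^2 tau^2 sigma_q^2) cos(2 pi mu_q tau)."""
    tau = np.asarray(tau)[..., None]
    return np.sum(weights * np.exp(-2 * np.pi**2 * tau**2 * stds**2)
                  * np.cos(2 * np.pi * means * tau), axis=-1)

def rff_features(x, weights, means, stds, num_features=2000, rng=None):
    """Random Fourier features: sample spectral points from the (symmetrized)
    Gaussian-mixture spectral density and build random-phase cosine features."""
    rng = np.random.default_rng(rng)
    comp = rng.choice(len(weights), size=num_features, p=weights / weights.sum())
    s = rng.normal(means[comp], stds[comp])                  # spectral points per sampled component
    s *= rng.choice([-1.0, 1.0], size=num_features)          # symmetrize the spectral density
    b = rng.uniform(0.0, 2 * np.pi, size=num_features)       # random phases
    scale = np.sqrt(2.0 * weights.sum() / num_features)
    return scale * np.cos(2 * np.pi * np.outer(x, s) + b)    # shape: (len(x), num_features)

# Monte Carlo check: phi(x) phi(x')^T approaches the exact kernel as num_features grows.
weights = np.array([1.0, 0.5]); means = np.array([0.3, 1.2]); stds = np.array([0.1, 0.05])
x = np.linspace(0, 5, 50)
phi = rff_features(x, weights, means, stds, rng=0)
k_approx = phi @ phi.T
k_exact = sm_kernel(x[:, None] - x[None, :], weights, means, stds)
print(np.max(np.abs(k_approx - k_exact)))  # small approximation error
```

In the paper's setting the spectral points would instead be drawn from a learned variational distribution, and the resulting feature-based kernel enters the ELBO that is optimized with sampling-based variational inference and an approximate natural gradient.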



