Integral Mixability: a Tool for Efficient Online Aggregation of Functional and Probabilistic Forecasts

15 Dec 2019 · Alexander Korotin, Vladimir V'yugin, Evgeny Burnaev

In this paper we extend the setting of online prediction with expert advice to function-valued forecasts. At each step of the online game several experts predict a function, and the learner has to efficiently aggregate these functional forecasts into a single forecast. We adapt basic mixable (and exponentially concave) loss functions to compare functional predictions and prove that these adaptations are also mixable (exp-concave). We call this phenomenon integral mixability (exp-concavity). As an application of our main result, we prove that various loss functions used for probabilistic forecasting are mixable (exp-concave). The considered losses include the Sliced Continuous Ranked Probability Score, the Energy-Based Distance, Optimal Transport Costs and the Sliced Wasserstein-2 distance, the Beta-2 and Kullback-Leibler divergences, and the Characteristic Function and Maximum Mean Discrepancies.
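
To make the setting concrete, below is a minimal illustrative sketch (not the paper's algorithm) of aggregating experts' probabilistic forecasts, represented as CDFs on a grid, under the Continuous Ranked Probability Score, which is the integral adaptation of the pointwise squared loss. It uses a simple exponentially weighted average of the expert forecasts, which is valid for exp-concave losses; general mixable losses require the substitution function of Vovk's Aggregating Algorithm instead. The grid resolution, the toy expert forecasts, and the learning rate ETA are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

# Sketch: aggregating functional (CDF) forecasts from several experts with an
# exponentially weighted average, scoring with CRPS on a discretized outcome
# space. ETA is a placeholder learning rate, not a constant from the paper.

rng = np.random.default_rng(0)
GRID = np.linspace(0.0, 1.0, 101)   # discretization of the outcome space [0, 1]
DX = GRID[1] - GRID[0]
N_EXPERTS, T, ETA = 3, 50, 0.5

def crps(cdf, outcome):
    """Continuous Ranked Probability Score: integral of the squared pointwise
    loss between the forecast CDF and the step function of the outcome."""
    step = (GRID >= outcome).astype(float)
    return np.sum((cdf - step) ** 2) * DX

weights = np.full(N_EXPERTS, 1.0 / N_EXPERTS)
learner_loss = 0.0
expert_losses = np.zeros(N_EXPERTS)

for t in range(T):
    # Each expert predicts a CDF (here: a toy logistic CDF with a random center).
    centers = rng.uniform(0.2, 0.8, size=N_EXPERTS)
    expert_cdfs = 1.0 / (1.0 + np.exp(-(GRID[None, :] - centers[:, None]) / 0.05))

    # Learner's forecast: weighted average of the expert CDFs (appropriate for
    # exp-concave losses; mixable losses need a substitution function).
    learner_cdf = weights @ expert_cdfs

    outcome = rng.uniform(0.0, 1.0)
    losses = np.array([crps(c, outcome) for c in expert_cdfs])
    learner_loss += crps(learner_cdf, outcome)
    expert_losses += losses

    # Multiplicative (exponential) weight update.
    weights *= np.exp(-ETA * losses)
    weights /= weights.sum()

print(f"learner CRPS: {learner_loss:.3f}, best expert CRPS: {expert_losses.min():.3f}")
```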
