Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis

11 Oct 2019 · Song Fang, Quanyan Zhu

In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points... In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
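The tightness condition in the abstract reflects a standard maximum-entropy fact that can be illustrated in the scalar case: among all distributions supported on [-a, a], the uniform distribution maximizes differential entropy, with h(e) = ln(2a), so any error e with entropy h(e) must satisfy max|e| ≥ exp(h(e))/2, with equality only for the uniform distribution. The sketch below is a minimal numerical illustration of this scalar relation, not the paper's sequential-prediction theorem; the function name `entropy_bound` and the example distributions are assumptions chosen for illustration.

```python
import math

def entropy_bound(h_nats):
    """Lower bound on max|e| implied by differential entropy h(e) in nats:
    max|e| >= exp(h(e)) / 2, tight iff e is uniform on [-a, a]."""
    return math.exp(h_nats) / 2.0

a = 1.0

# Uniform on [-a, a]: h = ln(2a), so the bound recovers a exactly.
h_uniform = math.log(2.0 * a)

# Symmetric triangular on [-a, a]: h = ln(a) + 1/2 < ln(2a),
# so the bound is strictly below the true maximum deviation a.
h_triangular = math.log(a) + 0.5

print(entropy_bound(h_uniform))     # equals a = 1.0 (bound is tight)
print(entropy_bound(h_triangular))  # ~0.824 < a (bound is strict)
```

This matches the abstract's achievability statement: the bound is attained exactly when the error is uniformly distributed, and is conservative otherwise.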
