Conference paper · Year: 2017

Universality of Bayesian mixture predictors

Daniil Ryabko
  • Role: Author

Abstract

The problem is that of sequential probability forecasting for finite-valued time series. The data is generated by an unknown probability distribution over the space of all one-way infinite sequences. It is known that this measure belongs to a given set C, but the latter is completely arbitrary (possibly uncountably infinite, with no structure given). The performance is measured with asymptotic average log loss. In this work it is shown that the minimax asymptotic performance is always attainable, and that it is attained by a convex combination of countably many measures from the set C (a Bayesian mixture). This was previously known only for the case when the best achievable asymptotic error is 0. It also contrasts with previous results showing that, in the non-realizable case, all Bayesian mixtures may be suboptimal even though some predictor achieves the optimal performance.
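A brief sketch of the standard setting, for orientation (the notation below is assumed, not taken from this page): for a predictor $\rho$ and a sequence $x_1,\dots,x_n$ over a finite alphabet, with the data generated by a measure $\mu$, the expected average log loss is

  L_n(\mu,\rho) \;:=\; -\tfrac{1}{n}\, \mathbb{E}_\mu \log \rho(x_1,\dots,x_n),

and a Bayesian mixture over a countable subset $\{\mu_k\}_{k\in\mathbb{N}} \subset C$ with prior weights $w_k > 0$, $\sum_k w_k = 1$, is the measure

  \nu(x_1,\dots,x_n) \;:=\; \sum_{k} w_k\, \mu_k(x_1,\dots,x_n).

In these terms, the result stated above says that some such mixture $\nu$ attains the minimax value $\inf_\rho \sup_{\mu\in C} \limsup_{n\to\infty} L_n(\mu,\rho)$.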

Dates and versions

hal-01627332, version 1 (01-11-2017)


Cite

Daniil Ryabko. Universality of Bayesian mixture predictors. ALT 2017 - 28th International Conference on Algorithmic Learning Theory, Oct 2017, Kyoto, Japan. pp.1-13. ⟨hal-01627332⟩