Nonparametric estimation of a multivariate density under Kullback-Leibler loss with ISDE
Abstract
In this paper, we propose a theoretical analysis of ISDE, an algorithm introduced in previous work. From a dataset, ISDE learns a density written as a product of marginal density estimators over a partition of the features. We show that, under suitable hypotheses, the Kullback-Leibler loss between the true density and the output of ISDE decomposes into a bias term plus the sum of two terms that go to zero as the number of samples goes to infinity. The rate of convergence indicates that ISDE tackles the curse of dimensionality by reducing the effective dimension from that of the ambient space to that of the largest block in the partition. The constants reflect a reduction in combinatorial complexity linked to the design of ISDE.
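For concreteness, here is a schematic sketch of the factorization and of the loss decomposition described above; the notation ($\mathcal{P}$ for the feature partition, $x_S$ for the coordinates in block $S$, $f^*$ for the true density, $\hat{f}_S$ for the marginal estimators) is ours and may differ from the paper's.

% Assumed notation (not taken from the paper): \mathcal{P} partitions
% \{1,\dots,d\}; x_S denotes the coordinates of x indexed by block S;
% f^* is the true density and f^*_S its marginal on block S.
\[
  \hat{f}(x) \;=\; \prod_{S \in \mathcal{P}} \hat{f}_S(x_S),
  \qquad
  \mathrm{KL}\bigl(f^* \,\|\, \hat{f}\bigr)
  \;=\;
  \underbrace{\mathrm{KL}\Bigl(f^* \,\Big\|\, \prod_{S \in \mathcal{P}} f^*_S\Bigr)}_{\text{bias of the partition}}
  \;+\;
  \underbrace{\varepsilon_{1,n} + \varepsilon_{2,n}}_{\to\, 0 \text{ as } n \to \infty}.
\]

In this sketch, the bias term vanishes exactly when $f^*$ factorizes over $\mathcal{P}$, i.e. when the blocks of features are mutually independent; the two vanishing terms $\varepsilon_{1,n}$ and $\varepsilon_{2,n}$ stand in for the sample-dependent terms whose precise form the paper establishes.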