An ℓ1-oracle inequality for the Lasso in finite mixture of multivariate Gaussian regression models.
Abstract
We consider a finite mixture of multivariate Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an ℓ1-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends to the multivariate case the ℓ1-oracle inequality established by Meynet. We focus on the Lasso for its ℓ1-regularization properties rather than as a variable selection procedure, as was done in Städler et al.
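For orientation, the display below sketches the statistical setting the abstract refers to; the notation (K components, covariates x in R^p, responses y in R^q, regularization parameter λ) is assumed here for illustration and may differ from the definitions used in the body of the paper.

% Conditional density of a K-component finite mixture of multivariate
% Gaussian regressions (illustrative notation, not the paper's own):
\[
  s_\theta(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k\,
  \frac{\exp\!\bigl(-\tfrac12 (y - \beta_k x)^\top \Sigma_k^{-1} (y - \beta_k x)\bigr)}
       {(2\pi)^{q/2}\,\det(\Sigma_k)^{1/2}},
  \qquad \pi_k \ge 0,\ \sum_{k=1}^{K} \pi_k = 1.
\]
% Lasso estimator: ℓ1-penalized maximum likelihood over the sample
% (x_i, y_i), i = 1, ..., n, with the ℓ1 norm acting on the regression
% coefficients beta_1, ..., beta_K:
\[
  \widehat{s}^{\,\mathrm{Lasso}}(\lambda) \;\in\;
  \operatorname*{arg\,min}_{\theta}
  \Bigl\{ -\tfrac{1}{n} \sum_{i=1}^{n} \log s_\theta(y_i \mid x_i)
          \;+\; \lambda \sum_{k=1}^{K} \|\beta_k\|_1 \Bigr\}.
\]
% An ℓ1-oracle inequality then bounds the Kullback-Leibler risk of
% \widehat{s}^{\,\mathrm{Lasso}}(\lambda) by the best penalized
% approximation error over the model, up to a remainder term.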