Optimal model selection in heteroscedastic regression using piecewise polynomials
Abstract
We consider the estimation of a regression function with random design and heteroscedastic noise in a nonparametric setting. More precisely, we address the problem of characterizing the optimal penalty when the regression function is estimated by a penalized least-squares model selection method. In this context, we show the existence of a minimal penalty, defined as the maximum level of penalization under which the model selection procedure totally misbehaves. The optimal penalty is shown to be twice the minimal one and to satisfy a non-asymptotic pathwise oracle inequality with leading constant almost one. Finally, since the ideal penalty is unknown in general, we propose a hold-out penalization procedure and show that the latter is asymptotically optimal.
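As a rough illustration of the procedure summarized above (the notation below is generic and not necessarily the paper's own): a penalized least-squares estimator selects, among a collection of models, the one minimizing the empirical risk plus a penalty, and the stated result relates the optimal penalty to the minimal one by a factor of two (the slope heuristics).

% Schematic penalized least-squares criterion (illustrative notation):
% for each model m in a collection \mathcal{M}, \widehat{s}_m denotes the
% least-squares estimator over m, and the selected model is
\[
  \widehat{m} \in \operatorname*{arg\,min}_{m \in \mathcal{M}}
  \Bigl\{ \frac{1}{n}\sum_{i=1}^{n} \bigl(Y_i - \widehat{s}_m(X_i)\bigr)^2
          + \operatorname{pen}(m) \Bigr\},
\]
% while the relation between optimal and minimal penalties stated in the
% abstract reads
\[
  \operatorname{pen}_{\mathrm{opt}}(m) \approx 2\,\operatorname{pen}_{\min}(m).
\]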