Oracle Inequalities for Convex Loss Functions with Non-Linear Targets

Publication: Working paper › Research

Documents

  • rp13_51

    Submitted manuscript, 640 KB, PDF document

This paper considers penalized empirical loss minimization of convex loss functions with an unknown non-linear target function. Using the elastic net penalty, we establish a finite-sample oracle inequality that bounds the loss of our estimator from above with high probability. If the unknown target is linear, this inequality also provides an upper bound on the estimation error of the estimated parameter vector. These results are new and generalize the econometrics and statistics literature. Next, we use the non-asymptotic results to show that the excess loss of our estimator is asymptotically of the same order as that of the oracle. If the target is linear, we give sufficient conditions for consistency of the estimated parameter vector. We then briefly discuss how a thresholded version of our estimator can be used to perform consistent variable selection. Finally, we give two examples of loss functions covered by our framework, show how penalized nonparametric series estimation is contained as a special case, and provide a finite-sample upper bound on the mean square error of the elastic net series estimator.
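To make the setting concrete, the sketch below solves the elastic-net-penalized problem for the simplest convex loss covered by the framework, the squared loss, via proximal gradient descent with soft-thresholding. This is a hypothetical illustration, not the paper's own implementation; the function name, step-size rule, and penalty weights `lam1`/`lam2` are assumptions.

```python
import numpy as np

def elastic_net_prox_gradient(X, y, lam1=0.1, lam2=0.1, step=None, n_iter=500):
    """Minimize (1/(2n))*||y - X b||^2 + lam1*||b||_1 + (lam2/2)*||b||_2^2
    by proximal gradient descent (ISTA). Illustrative sketch only."""
    n, p = X.shape
    if step is None:
        # Step size 1/L, where L bounds the Lipschitz constant of the
        # gradient of the smooth part (squared loss + ridge term).
        L = np.linalg.norm(X, 2) ** 2 / n + lam2
        step = 1.0 / L
    b = np.zeros(p)
    for _ in range(n_iter):
        # Gradient of the smooth part of the objective.
        grad = X.T @ (X @ b - y) / n + lam2 * b
        z = b - step * grad
        # Soft-thresholding: the proximal operator of the l1 penalty.
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam1, 0.0)
    return b
```

When the target really is linear and sparse, the estimated vector should be close to the true one, in line with the consistency conditions discussed in the paper; a thresholded version of the output (zeroing small coefficients) would then serve as the variable-selection step mentioned above.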
Original language: English
Place of publication: Aarhus
Publisher: Institut for Økonomi, Aarhus Universitet
Number of pages: 44
Status: Published - 20 Dec 2013
Series title: CREATES Research Papers
Number: 2013-51

    Research areas

  • Empirical loss minimization, Lasso, Elastic net, Oracle inequality, Convex loss function, Nonparametric estimation, Variable selection



ID: 68088072