Publication: Working paper › Research

- rp13_51
Submitted manuscript, 640 KB, PDF document

- Mehmet Caner, North Carolina State University, USA
- Anders Bredahl Kock

This paper considers penalized empirical loss minimization of convex loss functions with unknown non-linear target functions. Using the elastic net penalty, we establish a finite sample oracle inequality which bounds the loss of our estimator from above with high probability. If the unknown target is linear, this inequality also provides an upper bound on the estimation error of the estimated parameter vector. These results are new and generalize the econometrics and statistics literature. Next, we use the non-asymptotic results to show that the excess loss of our estimator is asymptotically of the same order as that of the oracle. If the target is linear, we give sufficient conditions for consistency of the estimated parameter vector. We then briefly discuss how a thresholded version of our estimator can be used to perform consistent variable selection. Finally, we give two examples of loss functions covered by our framework, show how penalized nonparametric series estimation is contained as a special case, and provide a finite sample upper bound on the mean square error of the elastic net series estimator.
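To fix ideas, the simplest instance of the framework is the squared loss with an elastic net penalty, i.e. minimizing (1/2n)||y - Xb||² + λ₁||b||₁ + (λ₂/2)||b||². The sketch below is an illustrative proximal-gradient (ISTA) solver for this special case, not the estimator studied in the paper; the function names, step-size choice, and penalty parameterization are my own assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1,
    # which produces the sparsity associated with the l1 part of the penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_ls(X, y, lam1=0.1, lam2=0.1, n_iter=500):
    """Illustrative proximal-gradient minimization of
       (1/2n) ||y - X b||^2 + lam1 * ||b||_1 + (lam2/2) * ||b||^2.
    The l2 term is smooth, so it is folded into the gradient step; the
    l1 term is handled by the soft-thresholding prox step."""
    n, p = X.shape
    b = np.zeros(p)
    # Step size 1/L, where L is a Lipschitz constant of the smooth part's gradient.
    L = np.linalg.norm(X, 2) ** 2 / n + lam2
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n + lam2 * b
        b = soft_threshold(b - grad / L, lam1 / L)
    return b
```

On data generated from a sparse linear target, the returned vector is typically sparse with the large coefficients shrunk toward zero; a hard threshold on its entries then gives the kind of variable selection rule discussed in the abstract.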

Original language | English
---|---
Place of publication | Aarhus
Publisher | Institut for Økonomi, Aarhus Universitet
Number of pages | 44
Status | Published - 20 Dec 2013
Series title | CREATES Research Papers
Number | 2013-51

- Empirical loss minimization, Lasso, Elastic net, Oracle inequality, Convex loss function, Nonparametric estimation, Variable selection


ID: 68088072