
Oracle inequalities for high-dimensional panel data models

Publication: Working paper/Preprint › Working paper › Research

Standard

Oracle inequalities for high-dimensional panel data models. / Kock, Anders Bredahl.

Aarhus: Institut for Økonomi, Aarhus Universitet, 2013.


Harvard

Kock, AB 2013, 'Oracle inequalities for high-dimensional panel data models', CREATES Research Papers, no. 2013-20, Institut for Økonomi, Aarhus Universitet, Aarhus.

APA

Kock, A. B. (2013). Oracle inequalities for high-dimensional panel data models. Institut for Økonomi, Aarhus Universitet. CREATES Research Papers No. 2013-20.

CBE

Kock AB. 2013. Oracle inequalities for high-dimensional panel data models. Aarhus: Institut for Økonomi, Aarhus Universitet.

MLA

Kock, Anders Bredahl. Oracle inequalities for high-dimensional panel data models. Aarhus: Institut for Økonomi, Aarhus Universitet, 2013. (CREATES Research Papers; No. 2013-20). 37 pp.

Vancouver

Kock AB. Oracle inequalities for high-dimensional panel data models. Aarhus: Institut for Økonomi, Aarhus Universitet; 2013 Jun 13.

Author

Kock, Anders Bredahl. / Oracle inequalities for high-dimensional panel data models. Aarhus: Institut for Økonomi, Aarhus Universitet, 2013. (CREATES Research Papers; No. 2013-20).

Bibtex

@techreport{ad96c08e708a47c585313c22945b26a1,
title = "Oracle inequalities for high-dimensional panel data models",
abstract = "This paper is concerned with high-dimensional panel data models where the number of regressors can be much larger than the sample size. Under the assumption that the true parameter vector is sparse we establish finite sample upper bounds on the estimation error of the Lasso under two different sets of conditions on the covariates as well as the error terms. Upper bounds on the estimation error of the unobserved heterogeneity are also provided under the assumption of sparsity. Next, we show that our upper bounds are essentially optimal in the sense that they can only be improved by multiplicative constants. These results are then used to show that the Lasso can be consistent in even very large models where the number of regressors increases at an exponential rate in the sample size. Conditions under which the Lasso does not discard any relevant variables asymptotically are also provided. In the second part of the paper we give lower bounds on the probability with which the adaptive Lasso selects the correct sparsity pattern in finite samples. These results are then used to give conditions under which the adaptive Lasso can detect the correct sparsity pattern asymptotically. We illustrate our finite sample results by simulations and apply the methods to search for covariates explaining growth in the G8 countries.",
keywords = "Panel data, Lasso, Adaptive Lasso, Oracle inequality, Nonasymptotic bounds, High-dimensional models, Sparse models, Consistency, Variable selection, Asymptotic sign consistency.",
author = "Kock, {Anders Bredahl}",
year = "2013",
month = jun,
day = "13",
language = "English",
series = "CREATES Research Papers",
publisher = "Institut for {\O}konomi, Aarhus Universitet",
number = "2013-20",
type = "WorkingPaper",
institution = "Institut for {\O}konomi, Aarhus Universitet",

}
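
For orientation, the penalized objective the abstract refers to can be written as follows for a fixed-effects panel model with N units, T periods, and p regressors. The notation is mine, and whether the unobserved heterogeneity is itself penalized is an assumption for this sketch, not a claim about the paper's exact formulation:

\[
(\hat{\beta}, \hat{\alpha}) \;=\; \arg\min_{\beta \in \mathbb{R}^{p},\, \alpha \in \mathbb{R}^{N}} \; \frac{1}{NT} \sum_{i=1}^{N} \sum_{t=1}^{T} \bigl( y_{it} - x_{it}'\beta - \alpha_i \bigr)^{2} \;+\; \lambda_{\beta} \sum_{j=1}^{p} \lvert \beta_j \rvert \;+\; \lambda_{\alpha} \sum_{i=1}^{N} \lvert \alpha_i \rvert .
\]

The adaptive Lasso discussed in the second part of the abstract replaces \lvert \beta_j \rvert by w_j \lvert \beta_j \rvert with data-driven weights, typically w_j = 1 / \lvert \hat{\beta}_j^{\text{init}} \rvert for some first-step estimator.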

RIS

TY - UNPB

T1 - Oracle inequalities for high-dimensional panel data models

AU - Kock, Anders Bredahl

PY - 2013/6/13

Y1 - 2013/6/13

N2 - This paper is concerned with high-dimensional panel data models where the number of regressors can be much larger than the sample size. Under the assumption that the true parameter vector is sparse we establish finite sample upper bounds on the estimation error of the Lasso under two different sets of conditions on the covariates as well as the error terms. Upper bounds on the estimation error of the unobserved heterogeneity are also provided under the assumption of sparsity. Next, we show that our upper bounds are essentially optimal in the sense that they can only be improved by multiplicative constants. These results are then used to show that the Lasso can be consistent in even very large models where the number of regressors increases at an exponential rate in the sample size. Conditions under which the Lasso does not discard any relevant variables asymptotically are also provided. In the second part of the paper we give lower bounds on the probability with which the adaptive Lasso selects the correct sparsity pattern in finite samples. These results are then used to give conditions under which the adaptive Lasso can detect the correct sparsity pattern asymptotically. We illustrate our finite sample results by simulations and apply the methods to search for covariates explaining growth in the G8 countries.

AB - This paper is concerned with high-dimensional panel data models where the number of regressors can be much larger than the sample size. Under the assumption that the true parameter vector is sparse we establish finite sample upper bounds on the estimation error of the Lasso under two different sets of conditions on the covariates as well as the error terms. Upper bounds on the estimation error of the unobserved heterogeneity are also provided under the assumption of sparsity. Next, we show that our upper bounds are essentially optimal in the sense that they can only be improved by multiplicative constants. These results are then used to show that the Lasso can be consistent in even very large models where the number of regressors increases at an exponential rate in the sample size. Conditions under which the Lasso does not discard any relevant variables asymptotically are also provided. In the second part of the paper we give lower bounds on the probability with which the adaptive Lasso selects the correct sparsity pattern in finite samples. These results are then used to give conditions under which the adaptive Lasso can detect the correct sparsity pattern asymptotically. We illustrate our finite sample results by simulations and apply the methods to search for covariates explaining growth in the G8 countries.

KW - Panel data
KW - Lasso
KW - Adaptive Lasso
KW - Oracle inequality
KW - Nonasymptotic bounds
KW - High-dimensional models
KW - Sparse models
KW - Consistency
KW - Variable selection
KW - Asymptotic sign consistency

M3 - Working paper

T3 - CREATES Research Papers

BT - Oracle inequalities for high-dimensional panel data models

PB - Institut for Økonomi, Aarhus Universitet

CY - Aarhus

ER -
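
To make the two-step procedure described in the abstract concrete, here is a minimal illustrative sketch in Python: a plain Lasso on within-demeaned simulated panel data, followed by an adaptive Lasso step that reweights the regressors by the first-step estimates. The simulated design, the within-transformation, the sklearn-based reweighting trick, and the tuning constants are assumptions made for illustration only; this is not the paper's implementation, and the penalty levels are not its recommended choices.

# Illustrative Lasso / adaptive Lasso on simulated fixed-effects panel data.
# All design choices below are assumptions for this sketch, not the paper's setup.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
N, T, p, s = 50, 10, 200, 5               # units, periods, regressors, nonzero coefficients
beta = np.zeros(p)
beta[:s] = 1.0                            # sparse true parameter vector

X = rng.standard_normal((N * T, p))
alpha_i = np.repeat(rng.standard_normal(N), T)   # unobserved heterogeneity (fixed effects)
y = X @ beta + alpha_i + rng.standard_normal(N * T)

# Within-transformation: demean y and X unit by unit to remove the fixed effects.
groups = np.repeat(np.arange(N), T)

def demean(v):
    out = v.astype(float)
    for g in range(N):
        idx = groups == g
        out[idx] -= out[idx].mean(axis=0)
    return out

y_dm, X_dm = demean(y), demean(X)

# Step 1: plain Lasso on the demeaned data.
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=10_000).fit(X_dm, y_dm)

# Step 2: adaptive Lasso -- rescale each column by a weight based on the first step,
# so coefficients estimated near zero in step 1 are penalized more heavily.
w = 1.0 / (np.abs(lasso.coef_) + 1e-6)
ada = Lasso(alpha=0.1, fit_intercept=False, max_iter=10_000).fit(X_dm / w, y_dm)
ada_coef = ada.coef_ / w                  # undo the rescaling

print("true support:    ", np.flatnonzero(beta))
print("adaptive support:", np.flatnonzero(np.abs(ada_coef) > 1e-8))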