Department of Economics and Business Economics

Model selection in kernel ridge regression

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Model selection in kernel ridge regression. / Exterkate, Peter.

In: Computational Statistics & Data Analysis, Vol. 68, No. December, 2013, p. 1-16.
Harvard

Exterkate, P 2013, 'Model selection in kernel ridge regression', Computational Statistics & Data Analysis, vol. 68, no. December, pp. 1-16. https://doi.org/10.1016/j.csda.2013.06.006

APA

Exterkate, P. (2013). Model selection in kernel ridge regression. Computational Statistics & Data Analysis, 68(December), 1-16. https://doi.org/10.1016/j.csda.2013.06.006

CBE

Exterkate P. 2013. Model selection in kernel ridge regression. Computational Statistics & Data Analysis. 68(December):1-16. https://doi.org/10.1016/j.csda.2013.06.006

MLA

Exterkate, Peter. "Model selection in kernel ridge regression". Computational Statistics & Data Analysis. 2013, 68(December). 1-16. https://doi.org/10.1016/j.csda.2013.06.006

Vancouver

Exterkate P. Model selection in kernel ridge regression. Computational Statistics & Data Analysis. 2013;68(December):1-16. https://doi.org/10.1016/j.csda.2013.06.006

Author

Exterkate, Peter. / Model selection in kernel ridge regression. In: Computational Statistics & Data Analysis. 2013 ; Vol. 68, No. December. pp. 1-16.

Bibtex

@article{1e5d76bc1be447a2a796d95e33ed7803,
title = "Model selection in kernel ridge regression",
abstract = "Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties, and the tuning parameters associated to all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels makes them widely applicable. Therefore, their use is recommended instead of the popular polynomial kernels in general settings, where no information on the data-generating process is available.",
keywords = "nonlinear forecasting, shrinkage estimation, kernel methods, high dimensionality",
author = "Peter Exterkate",
year = "2013",
doi = "10.1016/j.csda.2013.06.006",
language = "English",
volume = "68",
pages = "1--16",
journal = "Computational Statistics & Data Analysis",
issn = "0167-9473",
publisher = "Elsevier BV",
number = "December",

}

RIS

TY - JOUR

T1 - Model selection in kernel ridge regression

AU - Exterkate, Peter

PY - 2013

Y1 - 2013

N2 - Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties, and the tuning parameters associated with all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels make them widely applicable. Therefore, their use is recommended instead of the popular polynomial kernels in general settings, where no information on the data-generating process is available.

AB - Kernel ridge regression is a technique to perform ridge regression with a potentially infinite number of nonlinear transformations of the independent variables as regressors. This method is gaining popularity as a data-rich nonlinear forecasting tool, which is applicable in many different contexts. The influence of the choice of kernel and the setting of tuning parameters on forecast accuracy is investigated. Several popular kernels are reviewed, including polynomial kernels, the Gaussian kernel, and the Sinc kernel. The latter two kernels are interpreted in terms of their smoothing properties, and the tuning parameters associated with all these kernels are related to smoothness measures of the prediction function and to the signal-to-noise ratio. Based on these interpretations, guidelines are provided for selecting the tuning parameters from small grids using cross-validation. A Monte Carlo study confirms the practical usefulness of these rules of thumb. Finally, the flexible and smooth functional forms provided by the Gaussian and Sinc kernels make them widely applicable. Therefore, their use is recommended instead of the popular polynomial kernels in general settings, where no information on the data-generating process is available.

KW - nonlinear forecasting

KW - shrinkage estimation

KW - kernel methods

KW - high dimensionality

U2 - 10.1016/j.csda.2013.06.006

DO - 10.1016/j.csda.2013.06.006

M3 - Journal article

VL - 68

SP - 1

EP - 16

JO - Computational Statistics & Data Analysis

JF - Computational Statistics & Data Analysis

SN - 0167-9473

IS - December

ER -
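
Illustration

For readers who want to experiment with the approach described in the abstract, the sketch below is a minimal, self-contained Python illustration of kernel ridge regression with a Gaussian kernel, where the kernel bandwidth and the ridge penalty are selected from a small grid by cross-validation. The function names, the grid values, and the toy data are illustrative assumptions, not the paper's own implementation, recommended grid, or experimental design.

import numpy as np

def gaussian_kernel(X, Z, sigma):
    # Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / (2 * sigma^2))
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krr_fit_predict(X_train, y_train, X_test, sigma, lam):
    # Dual solution of kernel ridge regression: alpha = (K + lam * I)^{-1} y;
    # predictions are k(X_test, X_train) @ alpha.
    K = gaussian_kernel(X_train, X_train, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return gaussian_kernel(X_test, X_train, sigma) @ alpha

def cv_select(X, y, sigmas, lams, n_folds=5, seed=0):
    # Choose (sigma, lam) from a small grid by K-fold cross-validated MSE.
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    best, best_mse = None, np.inf
    for sigma in sigmas:
        for lam in lams:
            mse = 0.0
            for fold in folds:
                train = np.setdiff1d(np.arange(len(X)), fold)
                pred = krr_fit_predict(X[train], y[train], X[fold], sigma, lam)
                mse += ((y[fold] - pred) ** 2).mean() / n_folds
            if mse < best_mse:
                best, best_mse = (sigma, lam), mse
    return best, best_mse

if __name__ == "__main__":
    # Toy example: nonlinear signal plus noise; grid values are hypothetical.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
    (sigma, lam), mse = cv_select(X, y, sigmas=[0.5, 1.0, 2.0, 4.0], lams=[0.01, 0.1, 1.0])
    print("selected sigma =", sigma, "lambda =", lam, "cv mse =", round(mse, 4))

The closed-form dual solution keeps the example short; the paper's broader point, that a small, well-chosen grid for the tuning parameters combined with cross-validation is usually sufficient, is what the cv_select step is meant to convey.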