Abstract
Random forest regression (RF) is an extremely popular tool for the analysis of high-dimensional data. Nonetheless, its benefits may be lessened in sparse settings due to weak predictors, and a pre-estimation dimension reduction (targeting) step is required. We show that proper targeting controls the probability of placing splits along strong predictors, thus providing an important complement to RF’s feature sampling. This is supported by simulations using representative finite samples. Moreover, we quantify the immediate gain from targeting in terms of the increased strength of individual trees. Macroeconomic and financial applications show that the bias-variance trade-off implied by targeting, due to increased correlation among trees in the forest, is balanced at a medium degree of targeting, selecting the best 5–30% of commonly applied predictors.
Improvements in predictive accuracy of targeted RF relative to ordinary RF are considerable, up to 21%, occurring both in recessions and expansions, particularly at long horizons.
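The abstract does not spell out the targeting rule itself, only that a pre-estimation screening step precedes the forest. As a rough illustration of the general idea (not the authors' procedure), the sketch below screens predictors by absolute marginal correlation with the target and fits a random forest on the surviving columns; the `targeted_rf` helper, the `keep_frac` parameter, and the correlation screen are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def targeted_rf(X, y, keep_frac=0.2, **rf_kwargs):
    """Fit a random forest on a targeted subset of predictors.

    The targeting rule used here is simple marginal-correlation
    screening, an illustrative stand-in for a pre-estimation
    dimension reduction step; it is not taken from the paper.
    """
    # Score each predictor by its absolute correlation with the target.
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])

    # Keep the top fraction of predictors (e.g., the best 5-30%).
    k = max(1, int(keep_frac * X.shape[1]))
    selected = np.argsort(scores)[::-1][:k]

    # Fit an ordinary random forest on the targeted predictor set.
    rf = RandomForestRegressor(**rf_kwargs)
    rf.fit(X[:, selected], y)
    return rf, selected


# Usage: keep the best 20% of predictors, within the 5-30% range the
# abstract reports as balancing the bias-variance trade-off.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=200)
model, cols = targeted_rf(X, y, keep_frac=0.2, n_estimators=500, random_state=0)
```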
Original language | English |
---|---|
Journal | International Journal of Forecasting |
Volume | 39 |
Issue | 2 |
Pages (from-to) | 841-868 |
Number of pages | 28 |
ISSN | 0169-2070 |
Publication status | Published - Apr 2023 |
Keywords
- Random forests
- Targeted predictors
- High-dimensional forecasting
- Weak predictors
- Variable selection