Tomorrow is the 11th ECB Forecasting Conference.
I am excited to see so many top authors: Sims, Engle, Koop, Marcellino, Schorfheide, and many more.
It is fitting that so many progenitors of the innovative models of yesteryear and workhorse methods of today — VAR, ARCH, and Bayesian macroeconometrics — are here to oversee the next generation, who are forging tomorrow’s methods in the fires of machine learning.
One such youngster is Philippe Goulet Coulombe, author of Taste #3: “The Macroeconomy As A Random Forest.”
In short, his method uses the usual splitting structure of trees over bootstrapped samples, but a regularized linear equation (e.g., a ridge AR(1)) appears in each terminal node in place of the vanilla conditional sample mean. By allowing the regressors — or some super/sub-set of them — to dictate the splits of trees, we can capture a myriad of non-linearities. For example, a split on a trend variable can capture sharp structural change, and a split on the lagged dependent variable can capture regime-switching. Averaging over many such trees combines all of these dynamics, approximating the true underlying non-linear structure. In sum, it is a clever way to use a common machine learning algorithm to capture common time series dynamics, all in one package.
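To make the mechanics concrete, here is a toy sketch of my own (not Goulet Coulombe's implementation) of a single "linear-leaf" tree with one split: instead of the usual leaf mean, each terminal node carries its own ridge regression. The variable names and the closed-form ridge are my illustrative choices.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge estimate: (X'X + lam*I)^{-1} X'y
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

def linear_leaf_forecast(X, y, x_new, split_var, threshold, lam=1.0):
    """One tree with a single split on column `split_var`: each terminal
    node holds a ridge regression rather than a sample mean. Splitting on
    a trend column mimics structural change; splitting on the lagged
    dependent variable mimics regime-switching."""
    left = X[:, split_var] <= threshold
    if x_new[split_var] <= threshold:
        beta = ridge_fit(X[left], y[left], lam)
    else:
        beta = ridge_fit(X[~left], y[~left], lam)
    return float(x_new @ beta)
```

In the full method, the split points are chosen by the tree-growing algorithm, trees are grown on bootstrapped samples, and the forecast is the average across the forest.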
Given the recent chatter about a labor shortage and upward pressure on wages, I have been interested in forecasting wage inflation via the Employment Cost Index (Wages & Salaries) year-over-year.
This exercise reflects my personal interest, and does not represent the opinion of Cornerstone Macro.
I conduct a pseudo-out-of-sample experiment over the last 20 quarters. For each quarter, I fit a factor-augmented VAR (OLS) and factor-augmented MRF, then forecast ECI one-quarter ahead.
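The experiment's loop structure can be sketched generically. This is an expanding-window pseudo-out-of-sample harness of my own; the `fit` and `predict` callables are placeholders standing in for whatever model is used (a FAVAR or MRF in the actual exercise, neither of which is implemented here).

```python
import numpy as np

def pseudo_oos_mae(y, X, fit, predict, n_test=20):
    """Expanding-window pseudo-out-of-sample exercise: at each of the
    last `n_test` periods, re-fit the model on data through t-1 only,
    then forecast y_t one step ahead. Returns the MAE and the errors."""
    errors = []
    for t in range(len(y) - n_test, len(y)):
        model = fit(X[:t], y[:t])              # estimate on history only
        errors.append(y[t] - predict(model, X[t]))
    errors = np.asarray(errors)
    return np.mean(np.abs(errors)), errors
```

For example, plugging in plain least squares via `np.linalg.lstsq` for `fit` reproduces the OLS benchmark's role in this setup.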
OLS Mean Absolute Error = 0.12
MRF Mean Absolute Error = 0.07
Diebold–Mariano test, p = 0.003
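For readers unfamiliar with the test, here is a simple implementation of the Diebold–Mariano statistic under absolute loss. This is a generic textbook version (plain sample variance, no HAC correction — a common shortcut for one-step-ahead forecasts), not necessarily the exact variant behind the p-value above.

```python
import math
import numpy as np

def diebold_mariano(e1, e2):
    """DM test of equal forecast accuracy under absolute loss:
    d_t = |e1_t| - |e2_t|, with H0: E[d_t] = 0. A positive statistic
    means the second forecast is more accurate. Returns (dm, p) with a
    two-sided p-value from the standard normal."""
    d = np.abs(np.asarray(e1)) - np.abs(np.asarray(e2))
    dm = d.mean() / math.sqrt(d.var(ddof=1) / len(d))
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(dm) / math.sqrt(2.0))))
    return dm, p
```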
This is evidence in support of MRF’s forecasting capabilities.