Hodrick–Prescott filter
The Hodrick–Prescott filter is a mathematical tool used in macroeconomics, especially in real business cycle theory, to remove the cyclical component of a time series from raw data. It is used to obtain a smoothed-curve representation of a time series, one that is more sensitive to long-term than to short-term fluctuations. The sensitivity of the trend to short-term fluctuations is adjusted by modifying a multiplier $\lambda$.
The filter was popularized in the field of economics in the 1990s by economists Robert J. Hodrick and Nobel Memorial Prize winner Edward C. Prescott, though it was first proposed much earlier by E. T. Whittaker in 1923 (see Whittaker–Henderson smoothing). The Hodrick–Prescott filter is a special case of a smoothing spline.
The equation
The reasoning for the methodology uses ideas related to the decomposition of time series. Let $y_t$ for $t = 1, 2, \ldots, T$ denote the logarithms of a time series variable. The series $y_t$ is made up of a trend component $\tau_t$ and a cyclical component $c_t$ such that $y_t = \tau_t + c_t$. Given an adequately chosen, positive value of $\lambda$, there is a trend component that will solve

$$\min_{\tau} \left( \sum_{t=1}^{T} (y_t - \tau_t)^2 + \lambda \sum_{t=2}^{T-1} \left[ (\tau_{t+1} - \tau_t) - (\tau_t - \tau_{t-1}) \right]^2 \right).$$

The first term of the equation is the sum of the squared deviations $d_t = y_t - \tau_t$, which penalizes the cyclical component. The second term is a multiple $\lambda$ of the sum of the squares of the trend component's second differences. This second term penalizes variations in the growth rate of the trend component. The larger the value of $\lambda$, the higher is the penalty. Hodrick and Prescott suggest 1600 as a value for $\lambda$ for quarterly data. Ravn and Uhlig state that $\lambda$ should vary by the fourth power of the frequency observation ratio; thus, $\lambda$ should equal 6.25 for annual data and 129,600 for monthly data; in practice, $\lambda = 100$ for yearly data and $\lambda = 14{,}400$ for monthly data are commonly used, however.
The Hodrick–Prescott filter is explicitly given by

$$\tau_t = \left[ \lambda L^{2} - 4\lambda L + (1 + 6\lambda) - 4\lambda L^{-1} + \lambda L^{-2} \right]^{-1} y_t,$$

where $L$ denotes the lag operator, as can be seen from the first-order condition for the minimization problem.
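As a check on this operator form (a sketch of my own, not from the article): writing the operator $1 + \lambda(1-L)^2(1-L^{-1})^2$ as a finite matrix, its interior rows should carry exactly the pentadiagonal coefficients $\lambda$, $-4\lambda$, $1+6\lambda$, $-4\lambda$, $\lambda$ appearing in the bracketed expression.

```python
import numpy as np

lamb = 1600.0
T = 9
# (T-2) x T second-difference matrix, rows follow the pattern [1, -2, 1].
D = np.diff(np.eye(T), n=2, axis=0)
# Matrix form of the first-order condition operator 1 + lamb*(1-L)^2*(1-L^{-1})^2.
A = np.eye(T) + lamb * (D.T @ D)

# An interior row (away from the boundary rows) and the coefficients
# read off the lag-operator expression above.
row = A[4, 2:7]
expected = [lamb, -4 * lamb, 1 + 6 * lamb, -4 * lamb, lamb]
```

The boundary rows differ, which is why the operator expression holds only away from the sample's endpoints.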
Drawbacks to the Hodrick–Prescott filter
The Hodrick–Prescott filter will only be optimal when:
- Data exists in an I(2) trend.
- If one-time permanent shocks or split growth rates occur, the filter will generate shifts in the trend that do not actually exist.
- Noise in data is approximately normally distributed.
- Analysis is purely historical and static. The filter causes misleading predictions when used dynamically, since the algorithm changes the past state of the time series to adjust for the current state regardless of the size of $\lambda$ used.
Exact algebraic formulas are available for the two-sided Hodrick–Prescott filter in terms of its signal-to-noise ratio.
A working paper by James D. Hamilton at UC San Diego titled "Why You Should Never Use the Hodrick-Prescott Filter" presents evidence against using the HP filter. Hamilton writes that:
- The HP filter produces series with spurious dynamic relations that have no basis in the underlying data-generating process.
- A one-sided version of the filter reduces but does not eliminate spurious predictability and moreover produces series that do not have the properties sought by most potential users of the HP filter.
- A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice, e.g., a value for λ far below 1600 for quarterly data.
- There's a better alternative. A regression of the variable at date $t+h$ on the four most recent values as of date $t$ offers a robust approach to detrending that achieves all the objectives sought by users of the HP filter with none of its drawbacks.
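Hamilton's proposed regression can be sketched as follows (an illustration of my own; the function name is mine, and `h = 8` reflects the two-year horizon Hamilton suggests for quarterly data). The residuals from regressing $y_{t+h}$ on a constant and the four most recent values $y_t, \ldots, y_{t-3}$ serve as the cyclical component:

```python
import numpy as np

def hamilton_filter(y, h=8, p=4):
    """Regress y_{t+h} on a constant and the p most recent values
    y_t, ..., y_{t-p+1}; return the residuals (the cyclical component)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ts = range(p - 1, T - h)  # dates t with full regressor history and target
    X = np.column_stack(
        [np.ones(len(ts))] + [np.array([y[t - j] for t in ts]) for j in range(p)]
    )
    target = np.array([y[t + h] for t in ts])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta  # cycle for dates p-1+h, ..., T-1
```

For a purely linear series the target is an exact linear combination of the regressors, so the residual (cyclical) component is zero, mirroring the HP filter's behavior on trend-only data.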