Inducing Sparsity and Shrinkage in Models with Time-Varying Parameters

Reading time: 4 minutes
...

📝 Original Paper Info

- Title: Inducing Sparsity and Shrinkage in Time-Varying Parameter Models
- ArXiv ID: 1905.10787
- Date: 2019-12-18
- Authors: Florian Huber, Gary Koop, Luca Onorante

📝 Abstract

Time-varying parameter (TVP) models have the potential to be over-parameterized, particularly when the number of variables in the model is large. Global-local priors are increasingly used to induce shrinkage in such models. But the estimates produced by these priors can still have appreciable uncertainty. Sparsification has the potential to reduce this uncertainty and improve forecasts. In this paper, we develop computationally simple methods which both shrink and sparsify TVP models. In a simulated data exercise we show the benefits of our shrink-then-sparsify approach in a variety of sparse and dense TVP regressions. In a macroeconomic forecasting exercise, we find our approach to substantially improve forecast performance relative to shrinkage alone.

💡 Summary & Analysis

This paper addresses the issue of over-parameterization in time-varying parameter (TVP) models, a common problem when the number of variables in the model is large. Over-parameterization can lead to complex models with significant estimation uncertainty. While global-local priors are commonly used to induce shrinkage and reduce complexity, they still leave considerable uncertainty in estimates. The authors propose an approach that combines both shrinkage and sparsification to improve model performance and prediction accuracy.
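For concreteness, a standard TVP regression lets each coefficient evolve as a random walk (a common baseline specification; the paper's exact model may add features such as stochastic volatility):

```latex
y_t = x_t' \beta_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \sigma_t^2),
```
```latex
\beta_t = \beta_{t-1} + \eta_t, \qquad \eta_t \sim N(0, V).
```

With $K$ regressors and $T$ periods, the model carries $KT$ coefficient states, which is why shrinkage priors on $V$ and the initial states become essential as $K$ grows.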

The core of the paper involves a method for applying shrink-then-sparsify techniques to TVP models in a computationally efficient manner. This method simplifies the model by removing unnecessary variables, thereby reducing estimation uncertainty and enhancing predictive power. The effectiveness of this approach is demonstrated through both simulated data exercises and macroeconomic forecasting experiments.
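The sparsification step can be illustrated with signal-adaptive soft-thresholding applied to shrunk (posterior-mean) coefficient estimates, in the spirit of the paper's shrink-then-sparsify approach. This is a minimal sketch, not the authors' implementation: the function name and the adaptive penalty choice ($\mu_j = |\hat\beta_j|^{-2}$, so weak signals face large penalties and get zeroed) are illustrative assumptions.

```python
import numpy as np

def savs_sparsify(beta_hat, X):
    """Soft-threshold shrunk coefficient estimates so that weak
    signals are set exactly to zero (signal-adaptive sketch).

    beta_hat : shrunk point estimates, shape (K,)
    X        : regressor matrix, shape (T, K)
    """
    beta_hat = np.asarray(beta_hat, dtype=float)
    # Squared column norms ||x_j||^2 scale the threshold per regressor.
    col_norm2 = np.sum(np.asarray(X, dtype=float) ** 2, axis=0)
    # Adaptive penalty: tiny estimates get a huge penalty, large ones a small one.
    mu = 1.0 / np.maximum(beta_hat ** 2, 1e-12)
    # Soft-thresholding: coefficients whose signal fails to clear the
    # penalty are zeroed; the rest are mildly shrunk.
    thresholded = np.maximum(np.abs(beta_hat) * col_norm2 - mu, 0.0)
    return np.sign(beta_hat) * thresholded / col_norm2
```

In a fully Bayesian workflow this map would be applied draw-by-draw to the MCMC output, yielding a posterior over sparsified coefficient paths rather than a single thresholded point estimate.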

The results show that combining shrinkage with sparsification significantly improves forecast performance over using shrinkage alone in various sparse and dense TVP regressions. This improvement highlights the potential benefits of integrating both techniques to achieve better model efficiency and prediction accuracy.

This research has significant implications for fields requiring accurate economic predictions, such as financial analysis and macroeconomic forecasting. By simplifying models while improving their predictive capabilities, this approach could lead to more reliable forecasts in various applications.

📄 Full Paper Content (ArXiv Source)

|      |   DL | LASSO |   NG | horse | SSVS |   DL | LASSO |   NG | horse | SSVS |
|-----:|-----:|------:|-----:|------:|-----:|-----:|------:|-----:|------:|-----:|
| X1   | 1.00 |  1.04 | 1.00 |  1.01 | 1.01 | 1.01 |  1.02 | 1.01 |  1.03 | 1.02 |
| X2   | 1.03 |  1.06 | 1.04 |  1.04 | 1.04 | 1.06 |  1.09 | 1.07 |  1.08 | 1.06 |
| X3   | 1.01 |  1.05 | 1.02 |  1.02 | 1.04 | 0.99 |  1.00 | 1.00 |  1.00 | 1.00 |
| X1.1 | 0.95 |  0.97 | 0.95 |  0.95 | 0.95 | 1.02 |  1.01 | 1.02 |  1.02 | 1.03 |
| X2.1 | 1.01 |  1.02 | 1.01 |  1.01 | 1.01 | 1.02 |  1.02 | 1.02 |  1.02 | 1.01 |
| X3.1 | 0.92 |  0.94 | 0.91 |  0.90 | 0.94 | 0.94 |  0.91 | 0.91 |  0.92 | 0.94 |
|       |   DL | LASSO |   NG | horse | SSVS |   DL | LASSO |   NG | horse | SSVS |
|------:|-----:|------:|-----:|------:|-----:|-----:|------:|-----:|------:|-----:|
| X1    | 0.72 |  0.61 | 0.64 |  0.59 | 0.72 | 0.46 |  0.42 | 0.46 |  0.38 | 0.57 |
| X12   | 0.77 |  0.88 | 0.88 |  0.78 | 0.92 | 0.70 |  0.88 | 0.89 |  0.72 | 0.93 |
| X15   | 0.43 |  0.74 | 0.55 |  0.44 | 0.61 | 0.34 |  0.71 | 0.53 |  0.38 | 0.59 |
| X1.1  | 0.12 |  0.12 | 0.09 |  0.07 | 0.17 | 0.09 |  0.09 | 0.08 |  0.08 | 0.13 |
| X12.1 | 0.15 |  0.18 | 0.22 |  0.16 | 0.18 | 0.15 |  0.17 | 0.21 |  0.15 | 0.16 |
| X15.1 | 0.06 |  0.10 | 0.21 |  0.06 | 0.09 | 0.05 |  0.07 | 0.17 |  0.05 | 0.08 |
|      |   DL | LASSO |   NG | horse | SSVS |   DL | LASSO |   NG | horse | SSVS |
|-----:|-----:|------:|-----:|------:|-----:|-----:|------:|-----:|------:|-----:|
| X1   | 1.11 |  1.14 | 1.12 |  1.07 | 1.16 | 0.84 |  0.84 | 0.84 |  0.84 | 0.88 |
| X6   | 0.96 |  1.00 | 0.97 |  0.94 | 1.00 | 0.97 |  1.01 | 0.98 |  0.99 | 0.99 |
| X8   | 0.98 |  1.07 | 0.99 |  0.96 | 1.05 | 0.90 |  0.99 | 0.92 |  0.91 | 0.97 |
| X1.1 | 1.04 |  1.03 | 1.05 |  1.00 | 1.06 | 0.93 |  0.91 | 0.93 |  0.93 | 0.94 |
| X6.1 | 0.99 |  1.00 | 1.00 |  0.98 | 0.99 | 1.00 |  1.00 | 0.99 |  0.99 | 1.00 |
| X8.1 | 0.94 |  1.03 | 0.94 |  0.90 | 1.04 | 0.89 |  0.88 | 0.86 |  0.85 | 0.94 |

📊 Paper Figures

Figures 1–10 (images omitted from this extraction)

A Note of Gratitude

The copyright of this content belongs to the respective researchers. We deeply appreciate their hard work and contribution to the advancement of human civilization.
