2016
DOI: 10.1016/j.euroecorev.2015.07.013
Dynamic model averaging in large model spaces using dynamic Occam's window

Abstract: Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method f…
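As a rough illustration of the DMA recursion the abstract refers to, the sketch below updates model probabilities one time step at a time: a forgetting step flattens the previous probabilities so the favoured model can change over time, and an update step reweights each model by how well it predicted the latest observation. The forgetting factor alpha and the per-model predictive densities are illustrative assumptions, not the paper's exact specification.

# Minimal Python sketch of a DMA model-probability update (illustrative only).
import numpy as np

def dma_update(prev_probs, pred_densities, alpha=0.99):
    # prev_probs: posterior model probabilities at t-1 (sums to 1)
    # pred_densities: each model's predictive density for the new observation y_t
    # alpha: assumed forgetting factor in (0, 1]; alpha = 1 recovers static BMA
    pred_probs = prev_probs ** alpha          # forgetting (prediction) step
    pred_probs /= pred_probs.sum()
    post_probs = pred_probs * pred_densities  # update step
    return post_probs / post_probs.sum()

# Example with three candidate models, the second predicting y_t best:
probs = np.array([0.5, 0.3, 0.2])
densities = np.array([0.1, 0.8, 0.3])
print(dma_update(probs, densities))  # probability mass shifts toward model 2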

Cited by 41 publications (32 citation statements). References 34 publications (54 reference statements).
“…However, handling larger number of combinations can quickly become very cumbersome and impose technical limits on the software at hand, especially with regards to memory consumption, see for example, Koop and Korobilis (2012). In order to deal with this issue, Onorante and Raftery (2016) suggest a strategy that considers not the whole model space, but rather a subset of models and dynamically optimizes the choice of models at each point in time. However, Onorante and Raftery (2016) have to assume that models do not change too fast over time, which is not an ideal assumption when dealing with financial and in some cases monthly economic data.…”
Section: Introduction (mentioning; confidence: 99%)
“…In order to deal with this issue, Onorante and Raftery (2016) suggest a strategy that considers not the whole model space, but rather a subset of models and dynamically optimizes the choice of models at each point in time. However, Onorante and Raftery (2016) have to assume that models do not change too fast over time, which is not an ideal assumption when dealing with financial and in some cases monthly economic data. Furthermore, it is not clear to us how one can incorporate the modifications suggested in Dangl and Halling (2012) within the framework of Onorante and Raftery (2016).…”
Section: Introduction (mentioning; confidence: 99%)
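The subset strategy described in the quotations above, keeping only an active set of models and revising it each period, can be sketched roughly as follows. The cutoff ratio and the one-variable neighbourhood rule are assumptions made for illustration, not the exact algorithm of Onorante and Raftery (2016).

# Rough Python sketch of a dynamic Occam's-window-style update of the active
# model set (illustrative assumptions throughout).

def neighbours(model, variables):
    # Models obtained from `model` by adding or removing exactly one variable.
    out = set()
    for v in variables:
        if v in model:
            out.add(frozenset(model - {v}))
        else:
            out.add(frozenset(model | {v}))
    return out

def update_active_set(probs, variables, cutoff=0.001):
    # probs: dict mapping frozenset of variable names -> current model probability
    best = max(probs.values())
    # Occam's window: drop models far less probable than the current best.
    kept = {m for m, p in probs.items() if p >= cutoff * best}
    # Expansion: also carry the neighbours of surviving models into the next period.
    active = set(kept)
    for m in kept:
        active |= neighbours(m, variables)
    return active

Each model in the returned set would then be re-estimated and re-weighted at the next time step, so the number of models tracked at any point stays far smaller than the full model space.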
“…We can think of π(M) as a posterior distribution over M obtained by assuming a flat prior over M. Thus, the π(M)'s can be considered to be the Bayes model weights, and, in the sequel, these weights will be used to perform model averaging using Occam's window. This methodology originates in [36], and has been developed in numerous other contexts, e.g., dynamic linear models [44] and graphical models [35]. For a more in depth discussion, see Supplementary Materials, Section B.2.…”
Section: Loglinear Model Selection Methods (mentioning; confidence: 99%)
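Given Bayes model weights π(M) over a collection of candidate models, Occam's-window model averaging as described in the quotation amounts to restricting attention to models whose weight is not too far below the best one, renormalising, and averaging their predictions. The cutoff C and the inputs in the sketch below are illustrative assumptions.

# Minimal Python sketch of model averaging over an Occam's window (illustrative).
def occams_window_average(weights, predictions, C=20.0):
    # weights: posterior model weights pi(M); predictions: the matching model predictions
    best = max(weights)
    window = [(w, p) for w, p in zip(weights, predictions) if w >= best / C]
    total = sum(w for w, _ in window)
    return sum(w * p for w, p in window) / total

# Example: the third model is too improbable to enter the window.
print(occams_window_average([0.6, 0.35, 0.01], [1.0, 2.0, 10.0]))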
“…Both papers forecast inflation in the US. Onorante and Raftery (2016) nowcast GDP for the Euro area, employing 30 predictors. The curse of model space dimensionality is circumvented by introducing stochastic search over the space of the models.…”
Section: Pseudo Out-of-sample Forecasting (mentioning; confidence: 99%)