2020
DOI: 10.48550/arxiv.2010.13061
Preprint
Recurrent Conditional Heteroskedasticity

Abstract: We propose a new class of financial volatility models, which we call the REcurrent Conditional Heteroskedastic (RECH) models, to improve both the in-sample analysis and out-of-sample forecast performance of the traditional conditional heteroskedastic models. In particular, we incorporate auxiliary deterministic processes, governed by recurrent neural networks, into the conditional variance of the traditional conditional heteroskedastic models, e.g. the GARCH-type models, to flexibly capture the dynamics of the…

Cited by 2 publications (3 citation statements)
References 35 publications
“…Conditional heteroskedastic models, such as GARCH of Bollerslev (1986), represent σ 2 t as a deterministic function of the observations and conditional variances in the previous time steps. Nguyen et al (2020) recently propose a new class of conditional heteroskedastic models, namely the REcurrent Conditional Heteroskedastic (RECH) models, by combining recurrent neural networks (RNNs) and GARCH-type models, for flexible modelling of the volatility dynamics. The conditional variance in the RECH models is the sum of two components: the recurrent component modeled by an RNN, and the garch component modeled by a GARCH-type structure.…”
Section: Bayesian Deep Neural Network
confidence: 99%
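The additive decomposition quoted above can be sketched in a few lines. The following Python toy (not the paper's Matlab implementation) drives the recurrent component with a one-unit tanh RNN and the garch component with a standard GARCH(1,1) recursion; the function name and all coefficient values are illustrative assumptions, not estimates from the paper.

```python
import math

def rech_variance(returns, omega0=0.05, omega1=0.1,
                  alpha=0.1, beta=0.8, phi=0.7, w=0.5):
    """Sketch of a RECH-style conditional variance:
    sigma2_t = omega_t + alpha * eps_{t-1}^2 + beta * sigma2_{t-1},
    where the recurrent component omega_t is produced by a one-unit RNN
    and the remaining terms form the garch component.
    All constants here are illustrative, not estimated."""
    h = 0.0                                # RNN hidden state
    sigma2 = [returns[0] ** 2 or 1.0]      # crude initialization
    for t in range(1, len(returns)):
        # recurrent component: simple tanh RNN fed the lagged return
        h = math.tanh(phi * h + w * returns[t - 1])
        omega_t = omega0 + omega1 * h
        # garch component: GARCH(1,1) recursion on top of omega_t
        sigma2.append(omega_t + alpha * returns[t - 1] ** 2
                      + beta * sigma2[t - 1])
    return sigma2
```

With small positive coefficients as above, the recurrent component perturbs the GARCH intercept at each step while the GARCH terms keep the usual volatility-clustering behaviour.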
“…The distribution parameters must be stored in a Matlab 1D array. In this example, we use the same priors as suggested in Nguyen et al (2020). Similar to the other VB algorithm classes, the MGVB class stores the outputs in a Matlab structure, which can be used as shown in the following code to visualize the density of the variational distribution and the smoothed lower bound. The following code defines a function that computes both h(θ) and ∇θ h(θ) as in (27)-(28):

    % Compute gradient of the h(theta)
    h_func_grad = llh_grad + log_prior_grad;
    % h_func_grad must be a column
    h_func_grad = reshape(h_func_grad, length(h_func_grad), 1);
    end

There are some rules for defining a proper function to calculate h(θ) and ∇h(θ) so that it is compatible with the VB algorithm classes in the package.…”
Section: Bayesian Deep Neural Network
confidence: 99%
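The convention quoted above, h(θ) = log-likelihood + log-prior returned together with its gradient as a column vector, can be illustrated outside Matlab. The following pure-Python sketch uses a toy model (data ~ N(θ, 1) with a N(0, prior_var) prior); the function name, signature, and model are illustrative assumptions, not the VBLab package's API.

```python
def h_func(theta, data, prior_var=10.0):
    """Sketch of the h(theta) convention: return
    h(theta) = log-likelihood + log-prior and its gradient as a column.
    Toy model: data ~ N(theta, 1), theta ~ N(0, prior_var).
    Names and the model are illustrative, not the package's API."""
    mu = theta[0]
    resid = [y - mu for y in data]
    # log-likelihood and log-prior, each up to an additive constant
    llh = -0.5 * sum(r * r for r in resid)
    log_prior = -0.5 * mu * mu / prior_var
    h = llh + log_prior
    # gradient of h(theta) = llh gradient + log-prior gradient
    llh_grad = sum(resid)
    log_prior_grad = -mu / prior_var
    # the gradient must be a column (n x 1 nested list), mirroring
    # the reshape in the quoted Matlab snippet
    h_grad = [[llh_grad + log_prior_grad]]
    return h, h_grad
```

Returning the gradient explicitly as a column mirrors the compatibility rule the quoted passage alludes to: the VB algorithm consuming h(θ) expects a fixed orientation.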
“…Recently, several studies [11,12] have explored the heteroskedasticity of returns with recurrent NN architectures, but not by means of a modified LSTM cell. In [11], the authors propose the RECH model, in which the ω constant of the GARCH process is modeled by a particular RNN model.…”
Section: Introduction
confidence: 99%