2007
DOI: 10.1016/j.csda.2006.03.003
Forecasting nonlinear time series with neural network sieve bootstrap

Cited by 59 publications (22 citation statements)
References 19 publications
“…The basic idea of this bootstrap method is to approximate the error process by an AR model whose order increases with the sample size. The sieve bootstrap has been successfully employed to test for an autoregressive unit root (Psaradakis, 2001, 2003; Chang, 2004), to resample from cointegrating regressions (Chang et al., 2006), to conduct inference with VAR models (Inoue and Kilian, 2002a), and to construct prediction intervals for nonlinear time series using neural networks (Giordano et al., 2007). We develop two resampling algorithms that build both on the fixed regressor bootstrap of Hansen (2000) and on the restricted residuals approach of Nankervis and Savin (1996).…”
Section: Introduction
confidence: 99%
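The statement above describes the core of the sieve bootstrap: fit an AR(p) model (with p growing with the sample size), resample the centred residuals with replacement, and regenerate the series recursively. A minimal sketch of one such replicate is given below; the function `ar_sieve_bootstrap` and its least-squares AR fit are a hypothetical illustration under these assumptions, not code from the cited papers.

```python
import numpy as np

def ar_sieve_bootstrap(x, p, rng=None):
    """One sieve-bootstrap replicate of the series x.

    Fits an AR(p) model by least squares (in the theory, p increases
    with the sample size; here it is simply passed in), resamples the
    centred residuals with replacement, and rebuilds a series from
    them. Hypothetical helper, not the cited papers' code.
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)

    # Least-squares AR(p) fit: x_t = c + sum_j phi_j * x_{t-j} + e_t
    X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
    X = np.column_stack([np.ones(n - p), X])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    resid = resid - resid.mean()          # centre the residuals

    # Resample residuals i.i.d. and regenerate the series recursively
    e_star = rng.choice(resid, size=n, replace=True)
    x_star = list(x[:p])                  # initialise with observed values
    for t in range(p, n):
        lags = [x_star[t - j - 1] for j in range(p)]
        x_star.append(coef[0] + np.dot(coef[1:], lags) + e_star[t])
    return np.array(x_star)
```

Repeating this over many replicates gives the bootstrap distribution from which, for example, prediction intervals can be read off.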
“…One of these areas is time series prediction [6][7][8][9][10][11][12][13][14][15][16]. The major advantage of neural networks is their flexibility in nonlinear modeling [17].…”
Section: Introduction
confidence: 99%
“…However, such an assumption usually fails to be satisfied, given the poor generalization ability of a generic neural network under the uncertainty of practical data. In addition, network ensembles that combine a number of neural networks have been proposed based on the bootstrap technique (Lee et al., 2011; Shrivastava and Panigrahi, 2013; Giordano et al., 2007) and are claimed to be more stable than single networks. However, the modeling complexity of a network ensemble can lead to higher computational cost than a single network, and to uncertain modeling quality.…”
Section: Introduction
confidence: 99%
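The bootstrap-based network ensemble described above fits one model per bootstrap resample of the training pairs and averages their predictions. A minimal sketch follows; to keep it self-contained, each ensemble member is a plain linear least-squares fit standing in for a neural network, and the function name `bootstrap_ensemble_predict` is an assumption of this illustration, not the cited papers' code.

```python
import numpy as np

def bootstrap_ensemble_predict(x_train, y_train, x_test, n_models=25, rng=None):
    """Average the predictions of models fit on bootstrap resamples.

    Each member is trained on a resample (with replacement) of the
    training pairs; the ensemble prediction is the member average.
    Here the 'network' is a linear least-squares fit for brevity --
    in the cited work each member would be a neural network.
    """
    rng = np.random.default_rng(rng)
    n = len(x_train)
    X = np.column_stack([np.ones(n), x_train])            # design matrix
    Xt = np.column_stack([np.ones(len(x_test)), x_test])
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)                  # bootstrap resample
        coef, *_ = np.linalg.lstsq(X[idx], y_train[idx], rcond=None)
        preds.append(Xt @ coef)
    return np.mean(preds, axis=0)                         # ensemble average
```

Averaging over resamples reduces the variance contributed by any single fit, which is the stability benefit the citing paper refers to; the cost is training `n_models` members instead of one.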