The 2010 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2010.5596636
Naive Support Vector Regression and Multilayer Perceptron benchmarks for the 2010 neural network grand competition (NNGC) on time series prediction

Cited by 9 publications (10 citation statements)
References 32 publications
“…These models have been used successfully for modelling a broad range of time-series data [36, 38, 41, 42]. The task of the artificial neural network is to model the underlying data-generating process during training so that valid forecasts can be made when the parameterized model is subsequently presented with new input data [35]. The most widely used and often preferred models when building artificial neural network forecasting models are those with a Multilayer Perceptron architecture, given its computational efficiency and efficacy and its ability to be extended to deep learning (Figure 1) [1, 35, 36, 38, 41, 43, 44].…”
Section: Methods
confidence: 99%
“…The task of the artificial neural network is to model the underlying data-generating process during training so that valid forecasts can be made when the parameterized model is subsequently presented with new input data [35]. The most widely used and often preferred models when building artificial neural network forecasting models are those with a Multilayer Perceptron architecture, given its computational efficiency and efficacy and its ability to be extended to deep learning (Figure 1) [1, 35, 36, 38, 41, 43, 44]. Mathematically, a basic artificial neural network model (NNET) can be represented as follows: in which there are two critical hyperparameters that need to be chosen, the embedding dimension, m, which captures the autocorrelation structure of the time series, and the number of hidden units, D [35, 41, 43].…”
Section: Methods
confidence: 99%
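The equation referred to by "as follows" does not appear in the excerpt, but the two hyperparameters it names, the embedding dimension m (number of lagged inputs) and the number of hidden units D, describe a standard single-hidden-layer autoregressive network of the kind benchmarked in this paper. The sketch below builds such a model; the helper make_lag_matrix, the use of scikit-learn's MLPRegressor, the tanh activation, and the values m = 12 and D = 5 are illustrative assumptions, not details taken from the cited texts.

# Illustrative sketch, not the authors' code: a single-hidden-layer
# autoregressive MLP in which the embedding dimension m and the number
# of hidden units D are the two hyperparameters named in the excerpt.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lag_matrix(series, m):
    # Each row of X holds the m observations preceding the target value in y.
    X = np.array([series[t - m:t] for t in range(m, len(series))])
    y = np.array([series[t] for t in range(m, len(series))])
    return X, y

rng = np.random.default_rng(1)
series = np.sin(np.linspace(0.0, 20.0 * np.pi, 400)) + 0.1 * rng.standard_normal(400)

m, D = 12, 5                             # assumed hyperparameter values
X, y = make_lag_matrix(series, m)
nnet = MLPRegressor(hidden_layer_sizes=(D,), activation="tanh", max_iter=2000)
nnet.fit(X, y)

# One-step-ahead forecast from the most recent m observations.
print(nnet.predict(series[-m:].reshape(1, -1)))

Framed this way, forecasting reduces to supervised regression on a lag matrix, which is why the choice of m and D dominates the model's behaviour.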
“…Therefore, we present three novel CI approaches for multi-step seasonal TSF: the Automatic Design of Artificial Neural Networks (ADANN), which uses genetic algorithms to evolve ANN structures; the SVM with time lag selection based on a sensitivity analysis procedure; and the linguistic fuzzy approach to trend-cycle analysis and forecasts. The first two methods focus, from different perspectives, on the feature and model selection process for CI methods, which is often omitted [20]. The latter method focuses on the interpretability issue of fuzzy models.…”
Section: Introduction
confidence: 99%
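Since this excerpt contrasts SVM-based forecasting with benchmarks of the kind evaluated in this paper, a minimal sketch of support vector regression on a fixed window of lagged values (the naive setup) may help; the sliding_window helper, the RBF kernel, and the values of m, C, and epsilon are assumptions for illustration and do not implement the sensitivity-analysis lag selection the quoted passage describes.

# Minimal sketch, not the cited method: support vector regression trained
# on a fixed window of m lagged values, i.e. a naive SVR forecaster with
# no automatic lag selection.
import numpy as np
from sklearn.svm import SVR

def sliding_window(series, m):
    # Each row of X holds the m observations preceding the target value in y.
    X = np.array([series[t - m:t] for t in range(m, len(series))])
    y = np.array([series[t] for t in range(m, len(series))])
    return X, y

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0.0, 20.0 * np.pi, 400)) + 0.1 * rng.standard_normal(400)

m = 12                                         # assumed window length
X, y = sliding_window(series, m)
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01)  # illustrative hyperparameters
svr.fit(X, y)

# One-step-ahead forecast from the most recent m observations.
print(svr.predict(series[-m:].reshape(1, -1)))

Multi-step forecasts would then be produced either recursively, feeding predictions back in as inputs, or by training one model per horizon; the approaches in the quoted passage differ mainly in how the lags and the model structure are chosen.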