2013 10th IEEE International Conference on Control and Automation (ICCA)
DOI: 10.1109/icca.2013.6564876

Growing-type WASD for power-activation neuronet to model and forecast monthly time series

Abstract: In this paper, a novel WASD (weights and structure determination) algorithm is presented for the power-activation feed-forward neuronet (PFN) to solve monthly time series modeling and forecasting problems. In addition, a simple and effective data preprocessing approach is employed. Based on the WDD (weights direct determination) method and the relationship between the structure and the performance of the PFN, the WASD algorithm can determine the weights and the optimal structure (i.e., the optimal numbers of input-laye…
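As a rough illustration of the WDD step referred to in the abstract, the sketch below shows how the hidden-to-output weights of a power-activation feed-forward net can be obtained in a single pseudoinverse step rather than by iterative training. The function names, the toy series, and the fixed hidden-layer size are illustrative assumptions, not details from the paper; a growing-type WASD procedure would additionally vary the number of hidden neurons and keep the size that minimizes a validation error.

import numpy as np

def power_feature_matrix(x, n_hidden):
    # Hidden neuron j applies the power activation x -> x**j (j = 0, ..., n_hidden - 1).
    # x: 1-D array of normalized inputs; returns an array of shape (len(x), n_hidden).
    return np.column_stack([x ** j for j in range(n_hidden)])

def wdd_weights(x, y, n_hidden):
    # Weights direct determination: solve the linear least-squares problem
    # Psi @ w = y in one step via the Moore-Penrose pseudoinverse.
    psi = power_feature_matrix(np.asarray(x, dtype=float), n_hidden)
    return np.linalg.pinv(psi) @ np.asarray(y, dtype=float)

# Toy usage on a synthetic "monthly" series whose time index is scaled to [0, 1].
t = np.linspace(0.0, 1.0, 48)
y = 0.5 * t + 0.2 * np.sin(2.0 * np.pi * 12.0 * t)   # trend plus 12-point seasonality
w = wdd_weights(t, y, n_hidden=8)
y_fit = power_feature_matrix(t, 8) @ w               # in-sample reconstruction

Because the weight computation reduces to one linear-algebra step, searching over the hidden-layer size (the "growing" part of the algorithm) stays computationally cheap.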

Cited by 9 publications (4 citation statements). References 15 publications.
“…Note that we choose the level of the time series L_t (i.e., the average value of the time series) as R_t throughout the paper, and that the WASD (weights and structure determination) neuronet [11] will be exploited to estimate R_t in our further work for higher accuracy. In the rest of this section, the IHTPES, MSFD, and WCC methods are investigated for trend estimation, seasonality estimation, and global time series estimation, respectively.…”
Section: Methodology Description
Citation classification: mentioning (confidence: 99%)
“…Because we are dealing with time series, an approach similar to the neuronets in Zhang et al. (2013, 2019), which is based on the K-order Taylor series, is employed. Considering that y_t, y_{t-1}, y_{t-2}, …, y_{t-M} is a time series, where y_{t-1}, y_{t-2}, …, y_{t-M} correspond to the neuronet variables X_1, X_2, …, X_M, respectively, and y_t corresponds to the neuronet's output target Y, the neuronet model performs a nonlinear functional mapping from the past observations to the future value y_t.…”
Section: The MI-WASDTSN Model
Citation classification: mentioning (confidence: 99%)
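Written out, the mapping described in this excerpt takes the lagged values as inputs and the next value as the target. The expansion below is a hedged sketch of a K-order power (Taylor-type) approximation consistent with the quoted description; the basis terms and weight symbols are chosen here for illustration rather than taken from the cited model:

\[
Y = y_t \approx \hat{f}(X_1, \ldots, X_M) = \sum_{k=0}^{K} w_k \, \varphi_k\!\left(y_{t-1}, y_{t-2}, \ldots, y_{t-M}\right),
\qquad X_m = y_{t-m}, \; m = 1, \ldots, M,
\]

where the \varphi_k are power-series basis terms up to order K and the weights w_k are the quantities that a WASD/WDD-type procedure would determine.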
“…Furthermore, several activation functions, such as power, signum, sine, and square wave, are employed for the neuronets in Zhang et al. (2013, 2019) and Zeng et al. (2020). Here, our approach is based on the sigmoid function, since its output is bounded between 0 and 1.…”
Section: The MI-WASDTSN Model
Citation classification: mentioning (confidence: 99%)
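For concreteness, the activation functions named in this excerpt can be written down in a few lines. These are standard textbook forms (the square-wave definition in particular is one common choice), not code from the cited papers:

import numpy as np

def power_act(x, j):
    return x ** j                          # power activation of degree j

def signum_act(x):
    return np.sign(x)                      # signum activation

def sine_act(x):
    return np.sin(x)                       # sine activation

def square_wave_act(x):
    return np.sign(np.sin(x))              # one common square-wave form

def sigmoid_act(x):
    return 1.0 / (1.0 + np.exp(-x))        # sigmoid, output bounded in (0, 1)

The bounded (0, 1) range of the sigmoid is the property the citing authors point to when motivating their choice.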
“…mance of the NN. In previous research, activation functions have almost all been continuous functions, such as power functions [11], Gaussian functions [12], and Chebyshev polynomials [13]. NNs activated by these functions have been shown to approximate nonlinear continuous functions effectively.…”
Section: Introduction
Citation classification: mentioning (confidence: 99%)