2017
DOI: 10.2991/ijcis.2017.10.1.26
Time Series Forecasting using Chebyshev Functions based Locally Recurrent neuro-Fuzzy Information System

Abstract: The model proposed in this paper is a hybridization of a fuzzy neural network (FNN) and a functional link neural system for time series data prediction. The TSK-type feedforward fuzzy neural network does not take full advantage of the fuzzy rule base for accurate input-output mapping, and hence a hybrid model is developed using Chebyshev polynomial functions to construct the consequent part of the fuzzy rules. The model, known as the locally recurrent neuro-fuzzy information system (LRNFIS), is…
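As a rough illustration of the consequent construction the abstract describes, the sketch below expands a (pre-scaled) input vector with Chebyshev polynomials and uses the expanded vector to form the consequents of TSK-type rules. This is a minimal sketch only: the Gaussian memberships, parameter shapes, and function names (`chebyshev_expand`, `tsk_output`) are assumptions for illustration, not the paper's exact formulation, and the locally recurrent feedback of LRNFIS is omitted.

```python
import numpy as np

def chebyshev_expand(x, order=3):
    """Functional-link expansion of an input vector x (assumed scaled to [-1, 1])
    using the Chebyshev recurrence T0 = 1, T1 = x, T_{k+1} = 2x*T_k - T_{k-1}."""
    terms = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        terms.append(2.0 * x * terms[-1] - terms[-2])
    return np.concatenate(terms)                    # length (order + 1) * len(x)

def tsk_output(x, centers, sigmas, weights, order=3):
    """One inference step of a TSK-type rule base whose consequents are linear
    combinations of the Chebyshev-expanded input.
    Illustrative shapes: centers, sigmas -> (n_rules, n_inputs);
    weights -> (n_rules, (order + 1) * n_inputs)."""
    # Rule firing strengths: product of Gaussian memberships per rule, normalised.
    firing = np.prod(np.exp(-((x - centers) ** 2) / (2.0 * sigmas ** 2)), axis=1)
    firing /= firing.sum() + 1e-12
    phi = chebyshev_expand(x, order)                # Chebyshev consequent features
    rule_outputs = weights @ phi                    # one crisp consequent per rule
    return float(firing @ rule_outputs)             # weighted (defuzzified) output
```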

Cited by 9 publications (6 citation statements)
References 54 publications
“…After the function sequence is input, the time slice set S_T is converted to the feature sequence set S_F as follows: S_F_i = (f_e(i,1), f_e(i,2), ..., f_e(i,m)), where f_e(i,j) represents the feature value generated by the function F_j(x) in the time slice S_T_i. The shape of the feature sequence set S_F is ((n − Ws + 1), m).…”
Section: Step 3: Input Function Sequence
confidence: 99%
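The excerpt above describes turning a length-n series into window-based time slices S_T and then into a feature sequence set S_F of shape ((n − Ws + 1), m) by applying m functions F_j. A minimal NumPy sketch of that conversion follows; the specific feature functions (mean, standard deviation, range) are placeholders, since the excerpt does not say which F_j the citing paper uses.

```python
import numpy as np

def to_feature_sequences(series, window, functions):
    """Convert a length-n series into time slices of size `window` (the S_T set)
    and apply each of the m functions F_j to every slice, yielding the feature
    sequence set S_F with shape ((n - window + 1), m)."""
    slices = np.lib.stride_tricks.sliding_window_view(series, window)  # (n-Ws+1, Ws)
    return np.column_stack([[f(s) for s in slices] for f in functions])

# Placeholder feature functions; the excerpt does not specify the actual F_j.
series = np.sin(np.linspace(0.0, 6.0, 20))
S_F = to_feature_sequences(series, window=5, functions=[np.mean, np.std, np.ptp])
print(S_F.shape)   # (16, 3)
```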
“…CCI data is a time series, and many time series forecasting methods exist, including statistical methods, fuzzy forecasting methods [1,2,3], complex network methods [4,5,6], evidence theory methods [7], machine learning methods [8,9], and so on.…”
Section: Introduction
confidence: 99%
“…The effectiveness of the RNN in solving sequential problems has been validated in many research areas, yielding encouraging results in tasks such as image description [19], speech recognition [20], and machine translation [21]. In addition, the RNN is used for predicting events related to time series, such as stock prediction [22,23]. However, during training the RNN faces the problem of vanishing gradients, which prevents the network from converging normally, and it cannot overcome the impact of long-term dependencies [24].…”
Section: Gated Recurrent Unit Deep Neural Network (DNN)
confidence: 99%
“…Chiu and Chen [8] and Gonzalez et al. [9] described models that use support vector machines (SVM) together with fuzzy models and genetic algorithms. Another effective approach is to exploit the representational capabilities of wavelets (for instance, Bodyanskiy et al. [10], Chandar [11]) and recurrent connections, as in Parida et al. [12] and Atsalakis and Valavanis [13]. Cai et al. [14] used ant colony optimization.…”
Section: Introduction
confidence: 99%