2015
DOI: 10.1016/j.ins.2015.07.020
Generalized type-2 fuzzy weight adjustment for backpropagation neural networks in time series prediction

Cited by 59 publications (20 citation statements)
References 52 publications
“…Artificial neurons, the basic processing units of a neural network [23], connect with each other and accept weighted inputs to produce a corresponding output by means of an activation function [24]. At present, ANNs mostly adopt the M-P model, which was put forward by the psychologist McCulloch and the mathematical logician Pitts.…”
Section: Artificial Neuron Model (mentioning)
confidence: 99%
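As a minimal illustration of the weighted-input-plus-activation scheme described in this statement, the sketch below implements a single McCulloch-Pitts style neuron. The weights and threshold are arbitrary example values, not figures from the cited papers.

```python
import numpy as np

def mcculloch_pitts_neuron(inputs, weights, threshold):
    """Single M-P neuron: weighted sum of inputs passed through a step activation."""
    weighted_sum = np.dot(inputs, weights)
    # Step (Heaviside) activation: fire (1) if the weighted sum reaches the threshold.
    return 1 if weighted_sum >= threshold else 0

# Illustrative values only.
x = np.array([1, 0, 1])
w = np.array([0.5, 0.3, 0.4])
print(mcculloch_pitts_neuron(x, w, threshold=0.8))  # -> 1, since 0.9 >= 0.8
```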
“…The proposed method in the present research differs from that of other papers, such as Gaxiola et al. [7,8], where the fuzzy weights for the connections between the layers are obtained using interval type-2 fuzzy inference systems in [7] and generalized type-2 fuzzy inference systems in [8], without modifying how the change of the weights is obtained for each epoch of the backpropagation algorithm.…”
Section: Introduction (mentioning)
confidence: 93%
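The fuzzy-weight idea referenced here can be pictured with a much simplified sketch: each connection carries an interval (lower/upper) weight rather than a single crisp value, and both endpoints receive a delta-rule update in each epoch. This is only an assumption-laden approximation of the interval type-2 approach in [7]; the function names, the sigmoid activation, and the averaging type reduction are illustrative choices, not the cited papers' formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def interval_forward(x, w_lower, w_upper):
    """Forward pass with interval weights; output is type-reduced by averaging."""
    y_lower = sigmoid(np.dot(x, w_lower))
    y_upper = sigmoid(np.dot(x, w_upper))
    return 0.5 * (y_lower + y_upper), y_lower, y_upper

def interval_update(x, w_lower, w_upper, target, lr=0.1):
    """One backpropagation-style step applied to both interval endpoints."""
    y, y_lower, y_upper = interval_forward(x, w_lower, w_upper)
    error = target - y
    # Both bounds receive a delta-rule update scaled by the sigmoid derivative.
    w_lower = w_lower + lr * error * y_lower * (1 - y_lower) * x
    w_upper = w_upper + lr * error * y_upper * (1 - y_upper) * x
    return w_lower, w_upper

# Illustrative values only.
x = np.array([0.2, 0.7])
w_lo, w_hi = np.array([0.1, 0.3]), np.array([0.2, 0.5])
w_lo, w_hi = interval_update(x, w_lo, w_hi, target=1.0)
```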
“…They are widely used in many applications due to their ability to learn from input-output data. The combination of an FLC and an ANN, called a fuzzy neural network (FNN), fuses the reasoning ability of the FLC to handle uncertain information with the training capability of the ANN to learn from the controlled process (Gaxiola et al., 2015). FNNs have shown promising results because they combine the advantages of both FLC and ANN (Wang et al., 2015; Kim and Chwa, 2015; Gaxiola et al., 2014).…”
Section: Accepted Manuscript (mentioning)
confidence: 99%