2021
DOI: 10.1109/tetci.2020.2998919

Consensus Learning for Distributed Fuzzy Neural Network in Big Data Environment

Abstract: Uncertainty and a distributed nature are inherent in big data environments. A distributed fuzzy neural network (D-FNN), which not only employs fuzzy logic to alleviate the uncertainty problem but also handles data in a distributed manner, is both effective and crucial for big data. Existing D-FNNs have avoided consensus for their antecedent layer due to computational difficulty; hence such D-FNNs are not truly distributed, since a single model cannot be agreed upon by multiple agents. This paper proposes a truly distributed D-FNN…
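As a rough illustration of the consensus idea in the abstract, here is a minimal sketch (an assumption, not the authors' algorithm: the paper builds its antecedent consensus on distributed clustering, while this sketch uses plain average consensus). Each agent holds local antecedent parameters, e.g. Gaussian membership-function centers, and repeatedly mixes them with its neighbors through a doubly stochastic weight matrix until all agents agree on one shared model.

import numpy as np

def consensus_average(local_params, W, n_iters=50):
    # Iterative average consensus: row i of local_params is agent i's
    # flattened antecedent parameters; W is a doubly stochastic mixing
    # matrix over the communication graph (hypothetical helper).
    params = np.asarray(local_params, dtype=float)
    for _ in range(n_iters):
        params = W @ params  # each agent averages with its neighbors
    return params  # all rows converge to the network-wide average

# Toy example: 4 agents on a ring, each with 3 membership-function centers.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
local_centers = np.random.randn(4, 3)
shared_centers = consensus_average(local_centers, W)

Because W is doubly stochastic and the ring is connected, every row converges to the same average, which is the sense in which a single model is "agreed upon by multiple agents".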

Cited by 16 publications (10 citation statements). References 36 publications.
“…Their models assume that all clients share the information in the antecedent layers, making them, technically, not truly distributed methods. To avoid this problem, a fully distributed FNN (DFNN) [10] model was proposed by adopting consensus learning in both the antecedent and consequent layers. As its subsequent variant, a semisupervised DFNN model [12] was presented to enable the DFNN to leverage unlabeled samples by using the fuzzy C-means method and distributed interpolation-based consistency regularization.…”
Section: Related Work, A. Distributed Fuzzy Neural Network (mentioning)
confidence: 99%
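The fuzzy C-means step named in the excerpt above can be sketched in a few lines. This is the standard single-node algorithm, not the distributed semisupervised variant of [12]; the function name and hyperparameters are illustrative assumptions.

import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iters=100, eps=1e-9):
    # Plain fuzzy C-means: alternate soft-membership and center updates.
    n = X.shape[0]
    U = np.random.dirichlet(np.ones(n_clusters), size=n)  # (n, c) memberships
    for _ in range(n_iters):
        Um = U ** m
        centers = (Um.T @ X) / (Um.sum(axis=0)[:, None] + eps)
        # distances from every sample to every cluster center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + eps
        U = 1.0 / d ** (2.0 / (m - 1.0))  # u_ik proportional to d_ik^(-2/(m-1))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Toy usage: soft-cluster 200 two-dimensional samples into 3 clusters.
X = np.random.randn(200, 2)
centers, U = fuzzy_c_means(X, n_clusters=3)

In a DFNN, the resulting soft cluster centers would seed the antecedent membership functions; the distributed, consistency-regularized machinery of [12] sits on top of this basic update.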
“…DFNN: This is the fully distributed FNN algorithm proposed in [10], which adopts consensus learning in both the parameter learning and structure learning procedures and achieves state-of-the-art performance among distributed fuzzy models. As mentioned before, this model learns a single consensus FNN for all clients, which limits its applicability in non-IID scenarios.…”
Section: Dataset (mentioning)
confidence: 99%
“…Additionally, the algorithms are applied only in the consequent layers of the FNN [44], [45], which means that, strictly speaking, this decentralized FNN model is only partially distributed. A more recent proposal by Shi et al. [46] involves a distributed FNN with a consensus learning strategy. A novel method of distributed clustering optimizes the parameters in the antecedent layer, while a similar method of distributed parameter learning does the same for the consequent layer.…”
Section: A. Distributed Learning (mentioning)
confidence: 99%
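One common way to realize the consequent-layer step described in the excerpt above, assuming a linear TSK consequent fitted by least squares (an assumption; [46] may use a different consensus scheme), is to run consensus over the local least-squares sufficient statistics instead of the raw data:

import numpy as np

def consensus_least_squares(Phi_list, y_list, W, n_iters=100, lam=1e-3):
    # Agent i holds firing-strength-weighted regressors Phi_i and targets
    # y_i. Agents average the sufficient statistics (Phi^T Phi, Phi^T y)
    # by consensus, then each solves the same normal equations locally.
    A = np.stack([P.T @ P for P in Phi_list])                  # (agents, d, d)
    b = np.stack([P.T @ y for P, y in zip(Phi_list, y_list)])  # (agents, d)
    for _ in range(n_iters):
        A = np.einsum('ij,jkl->ikl', W, A)  # mix Gram matrices with neighbors
        b = W @ b                           # mix right-hand sides
    d = A.shape[1]
    return np.stack([np.linalg.solve(Ai + lam * np.eye(d), bi)
                     for Ai, bi in zip(A, b)])

# Toy usage: 4 fully connected agents, 5 consequent parameters each.
rng = np.random.default_rng(0)
W = np.full((4, 4), 0.25)  # uniform mixing matrix (illustrative topology)
Phi_list = [rng.normal(size=(30, 5)) for _ in range(4)]
y_list = [rng.normal(size=30) for _ in range(4)]
theta = consensus_least_squares(Phi_list, y_list, W)

Since only d-by-d statistics are exchanged, no agent reveals its raw samples, which is what makes the consequent learning distributed rather than merely parallel.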
“…T2FL has shown better performance than T1FL (Deng et al. 2020c; He et al. 2019; Mohammadzadeh and Kumbasar 2020a; Shi et al. 2020; Son et al. 2020). T2F-NNs fall into several categories: feedforward versus recurrent, Mamdani (linguistic) (Ayala et al. 2020) versus TSK (Heinz 2017), and interval versus general.…”
Section: T2F-NNs (mentioning)
confidence: 99%
“…In addition to structure, the learning method also affects the estimation performance of FNNs. Various optimization methods have been applied to the tuning of both parameters and rules, such as particle swarm optimization (Deng et al. 2020a; Kacimi et al. 2020), quantum-inspired differential evolution (Su and Yang 2011; Deng et al. xxxx; Deng et al. 2020b), differential evolution (Deng et al. 2020c), the extreme learning approach (He et al. 2019), fractional-order learning rules (Mohammadzadeh and Kumbasar 2020a), and consensus learning (Shi et al. 2020).…”
Section: Introduction (mentioning)
confidence: 99%
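As a concrete instance of one method from the list in the excerpt above, here is a minimal particle swarm optimization sketch for tuning a Gaussian membership-function center; the fitness function, bounds, and hyperparameters are illustrative assumptions, not taken from the cited works.

import numpy as np

def pso_minimize(fitness, dim, n_particles=20, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-3.0, 3.0)):
    # Vanilla PSO: particles track personal bests and the global best.
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))  # positions (e.g., MF centers)
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([fitness(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g

# Toy fitness: fit the center of one Gaussian membership function.
grid = np.linspace(-2, 2, 50)
target = np.exp(-(grid - 0.5) ** 2)
fit = lambda c: np.sum((np.exp(-(grid - c[0]) ** 2) - target) ** 2)
best_center = pso_minimize(fit, dim=1)  # should land near 0.5

The same loop extends to membership widths and consequent weights by enlarging dim; the evolutionary variants in the list above differ mainly in how candidate solutions are perturbed and recombined.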