2020
DOI: 10.1109/tfuzz.2019.2911492

Reinforced Fuzzy Clustering-Based Ensemble Neural Networks


Cited by 25 publications (10 citation statements)
References: 42 publications
“…The between-cluster variation (Vb) is used to assess how the data are distributed between clusters: the greater the Vb value, the better the clustering (Gan, 2019) (Kim et al, 2020). The variation over all clusters can be read from the V value; the smaller the V value, the better the clustering.…”
Section: II.3 Cluster Testing
confidence: 96%
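The quoted validity check compares within- and between-cluster variation. A minimal Python sketch, assuming V is taken as the ratio of within-cluster to between-cluster variation Vw/Vb (the excerpt does not give the exact definition; the function name and toy data are illustrative):

```python
import numpy as np

def cluster_variance(X, labels):
    """Within-cluster variation (Vw), between-cluster variation (Vb),
    and the overall value V = Vw / Vb (illustrative definition).
    A larger Vb and a smaller V indicate a better clustering."""
    overall_mean = X.mean(axis=0)
    vw, vb = 0.0, 0.0
    for c in np.unique(labels):
        members = X[labels == c]
        centroid = members.mean(axis=0)
        vw += np.sum((members - centroid) ** 2)                       # spread inside the cluster
        vb += len(members) * np.sum((centroid - overall_mean) ** 2)   # spread between clusters
    return vw, vb, vw / vb

# Two well-separated blobs: expect a large Vb and a small V.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)
vw, vb, v = cluster_variance(X, labels)
print(f"Vw = {vw:.2f}, Vb = {vb:.2f}, V = {v:.4f}")
```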
“…Another ensemble method decomposes the input space into Voronoi partitions (i.e., non-intersecting, convex partitions whose union is the input space) and associates a distinct sub-network with each partition. In [24], fuzzy C-means clustering is applied to partition the input space. In [25], a Boltzmann parameter has been used to reduce the degree of randomness of the genetic algorithm operators used in training.…”
Section: Deep Neural Network Ensemble Approach
confidence: 99%
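The quoted approach assigns one sub-network to each fuzzy C-means partition of the input space. A minimal numpy sketch of the fuzzy C-means step and the routing of each sample to its highest-membership partition (the update rule is the standard one; the function, data, and parameters are illustrative, not the cited implementation):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and the
    membership matrix U of shape (n_samples, c)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                    # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))                 # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Route each sample to the sub-network of its highest-membership partition.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
centers, U = fuzzy_cmeans(X, c=3)
partition = U.argmax(axis=1)   # index of the sub-network that trains on each sample
```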
“…The softmax output neuron functions as a switch, selecting the appropriate sub-network for input x_i, which is hereafter referred to as the 'winner' sub-network. Intermediate values of T have the effect of Voronoi-partitioning X with 'fuzzy' boundaries as in [24]. At the end of each training epoch, the Boltzmann parameter is geometrically decreased by a factor γ ∈ (0, 1) (referred to as the 'cooling' rate) such that,…”
Section: Partitioning Using Unsupervised Learning
confidence: 99%
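The excerpt describes a softmax switch whose sharpness is controlled by a Boltzmann parameter T, cooled geometrically after every epoch. A hedged sketch, assuming the gating logits are negative distances to the partition centers (the excerpt does not specify the logits; centers, the cooling rate, and the input are illustrative values):

```python
import numpy as np

def gating_weights(x, centers, T):
    """Softmax (Boltzmann) switch over sub-networks. The logits here are
    the negative distances to the partition centers (an assumption, not
    the cited paper's exact gating). Small T approaches a hard
    winner-take-all switch; larger T blurs the Voronoi boundaries."""
    logits = -np.linalg.norm(centers - x, axis=1) / T
    logits -= logits.max()                 # numerical stability
    w = np.exp(logits)
    return w / w.sum()

centers = np.array([[0.0, 0.0], [3.0, 3.0], [0.0, 4.0]])  # illustrative partition centers
T, gamma = 1.0, 0.9                        # gamma in (0, 1) is the 'cooling' rate
for epoch in range(5):
    w = gating_weights(np.array([1.0, 1.0]), centers, T)
    winner = int(w.argmax())               # the 'winner' sub-network for this input
    print(f"epoch {epoch}: T = {T:.3f}, weights = {np.round(w, 3)}, winner = {winner}")
    T *= gamma                             # geometric cooling at the end of each epoch
```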
“…Term weighting aims to identify the value or weight of a term based on its level of importance in the document. Term Frequency-Inverse Document Frequency (TF-IDF) is a weighting that is often used in information retrieval and text mining [19] [20]. TF is the frequency of occurrence of a term in the document.…”
Section: B Text Preprocessing
confidence: 99%
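The excerpt summarizes TF-IDF weighting: TF counts how often a term occurs in a document, while IDF down-weights terms that appear in many documents. A minimal, self-contained sketch using length-normalized TF and log(N/df) IDF, one common convention rather than necessarily the cited papers' exact formula (documents are toy examples):

```python
import math
from collections import Counter

def tf_idf(docs):
    """TF-IDF weights per document: tf(t, d) = count of t in d divided by
    document length, idf(t) = log(N / df(t)), weight = tf * idf."""
    tokenized = [d.lower().split() for d in docs]
    n = len(tokenized)
    df = Counter(t for doc in tokenized for t in set(doc))   # document frequency of each term
    out = []
    for doc in tokenized:
        tf = Counter(doc)
        out.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return out

docs = [
    "fuzzy clustering based ensemble neural networks",
    "term weighting with tf idf for text mining",
    "information retrieval and text mining",
]
for weights in tf_idf(docs):
    print(weights)
```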