2020 International Symposium on Community-Centric Systems (CcS)
DOI: 10.1109/ccs49175.2020.9231474
Divisive Hierarchical Clustering Based on Adaptive Resonance Theory

Cited by 12 publications (16 citation statements)
References 15 publications
“…Break, go to step-1
(11) else
(12) If density > crowd factor and better food consistency
(13) Next move ← Sigmoid(Prey Move)
(14) else
(15) Next Move ← Random(Sigmoid(Swarm Move or Follow Move))
(16) end
(17) end
(18) final result → apply modularity
(19) End Algorithm
ALGORITHM 1: Sigmoid Fish Swarm Optimization (SiFSO) algorithm.…”
Section: Methods (mentioning)
confidence: 99%
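The gating logic in steps (12)–(15) of the quoted listing is compact enough to sketch directly. Below is a minimal Python rendering of just that branch; the names prey_move, swarm_move, follow_move, and food_improves are assumptions standing in for the usual artificial-fish-swarm quantities, not identifiers from the cited paper:

import math
import random

def sigmoid(x):
    # Standard logistic function; squashes a raw move value into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def next_move(density, crowd_factor, food_improves,
              prey_move, swarm_move, follow_move):
    # Move selection as in the quoted listing: in a crowded region with
    # better food consistency, exploit via the prey move; otherwise pick
    # randomly between the swarm and follow moves.
    if density > crowd_factor and food_improves:
        return sigmoid(prey_move)
    return random.choice([sigmoid(swarm_move), sigmoid(follow_move)])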
“…In the hierarchical-based approach, the network is divided into several hierarchies, each level representing a different part of the network. Hierarchical clustering techniques can be further divided into two classes, i.e., divisive algorithms [12,20,21] and agglomerative algorithms [18,22,23]. In the divisive method, the network is repeatedly split into two subgraphs until no clear cluster structure remains to be split.…”
Section: Related Work (mentioning)
confidence: 99%
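To illustrate the divisive (top-down) scheme described in that passage, here is a hedged sketch that recursively bisects a point set until subsets become small. k-means is used only as a stand-in split criterion; the cited divisive algorithms operate on networks and use their own split rules:

import numpy as np
from sklearn.cluster import KMeans

def divisive_clustering(X, max_depth=3, min_size=5):
    # Recursively bisect the data top-down; leaves are arrays of sample indices,
    # and the nesting of the returned lists mirrors the cluster hierarchy.
    def split(idx, depth):
        if depth >= max_depth or len(idx) < 2 * min_size:
            return idx  # stop: depth budget spent or subset too small
        labels = KMeans(n_clusters=2, n_init=10,
                        random_state=0).fit_predict(X[idx])
        return [split(idx[labels == 0], depth + 1),
                split(idx[labels == 1], depth + 1)]
    return split(np.arange(len(X)), 0)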
“…This section presents quantitative comparisons of the information extraction performance of CAEA, HCAEA, GHNG [2], GH-EXIN [30], and HFTCA [14], based on their classification performance. In general, the evaluation of clustering performance is subjective if a dataset has no label information (i.e., each pattern in the dataset has no class label).…”
Section: Simulation Experiments (mentioning)
confidence: 99%
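One common way to turn clustering output into a classification score when ground-truth labels are available is majority-vote mapping; the exact protocol the compared papers use may differ, so this is an assumed stand-in:

import numpy as np
from collections import Counter

def cluster_accuracy(cluster_ids, true_labels):
    # Assign each cluster its majority class, then score the fraction of
    # samples whose own label matches their cluster's majority class.
    cluster_ids = np.asarray(cluster_ids)
    true_labels = np.asarray(true_labels)
    correct = 0
    for c in np.unique(cluster_ids):
        members = true_labels[cluster_ids == c]
        correct += Counter(members.tolist()).most_common(1)[0][1]
    return correct / len(true_labels)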
“…In our previous study [14], inspired by GHNG [12], we introduced Hierarchical FTCA (HFTCA) to improve the clustering performance of FTCA [12]. Although HFTCA shows clustering performance superior to GHSOM and GHNG, it has a parameter (i.e., a similarity threshold) that significantly affects its clustering performance.…”
Section: Introduction (mentioning)
confidence: 99%
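To make concrete why a similarity threshold is so influential in ART-family methods, here is a generic vigilance-style sketch (not HFTCA's actual update rules) in which the threshold alone decides between adapting an existing node and creating a new one; raising it fragments the network into more nodes, lowering it merges clusters:

import numpy as np

def assign_or_create(x, prototypes, similarity_threshold, lr=0.1):
    # Match the input to its most similar prototype; create a new node when
    # the best similarity falls below the threshold.
    x = np.asarray(x, dtype=float)
    if not prototypes:
        prototypes.append(x.copy())
        return 0
    # Similarity here is a simple Gaussian kernel of Euclidean distance
    # (an assumption; ART variants define their own similarity measures).
    sims = [np.exp(-np.linalg.norm(x - p) ** 2) for p in prototypes]
    best = int(np.argmax(sims))
    if sims[best] >= similarity_threshold:
        prototypes[best] += lr * (x - prototypes[best])  # resonance: adapt winner
        return best
    prototypes.append(x.copy())  # mismatch: grow the network
    return len(prototypes) - 1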
“…[Figure 3: Topological network generation process of FTCA on the two-dimensional artificial dataset (Fig. 2(a)); panels (a)–(e) show the network after n = 5,000, 10,000, 15,000, 45,000, and 90,000 inputs.]
[Figure 4: Topological network generation process of ASOINN on the two-dimensional artificial dataset (Fig. 2(a)); panels (a)–(e) as in Fig. 3.]
[Figure 5: Clustering results of FTCA, ASOINN, and CA; (a) entire dataset (10% noise), (b) FTCA, (c) ASOINN, (d) CA.]
Figures 3 and 4 show the topological networks of FTCA and ASOINN after 15,000, 45,000, and 90,000 inputs. FTCA and ASOINN are applied to the dataset in Fig. 5(a), which was created by randomly selecting 10% of the data in each distribution of Fig. 2(a) and replacing them with uniform random numbers. Fig. 5 shows the clustering results of FTCA, ASOINN, and, for reference, CA. Since CA has no noise-removal mechanism, it generates nodes even for the noise data. ASOINN can capture the shape of each distribution, but noise … The dataset shown in (a) was created by randomly selecting 45% of the data in each distribution of Fig. 2(a) and replacing them with uniform random numbers. Table 5 lists the parameter settings of MFTCA. Fig. 6 shows the four-layer … Hierarchical FTCA (HFTCA) [19], which introduces a hierarchical structure into …”
Section: unclassified
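The noise protocol described in that passage (replace 10% or 45% of the samples with uniform random values) is simple to reproduce. A minimal sketch, assuming the data live in the unit square; the low/high bounds are an assumption, not taken from the paper:

import numpy as np

def add_uniform_noise(X, noise_ratio=0.10, low=0.0, high=1.0, seed=0):
    # Replace a random fraction of the samples with uniform random points,
    # as in the experiment above (noise_ratio = 0.10 or 0.45).
    rng = np.random.default_rng(seed)
    X = np.array(X, dtype=float)  # copy so the original data stay intact
    n_noise = int(len(X) * noise_ratio)
    idx = rng.choice(len(X), size=n_noise, replace=False)
    X[idx] = rng.uniform(low, high, size=(n_noise, X.shape[1]))
    return X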