2011
DOI: 10.1007/s10489-011-0295-y
A two-leveled symbiotic evolutionary algorithm for clustering problems

Abstract: Because of its unsupervised nature, clustering is one of the most challenging problems and is regarded as an NP-hard grouping problem. Recently, several evolutionary algorithms (EAs) for clustering problems have been presented because of their efficiency in solving NP-hard problems of high complexity. Most previous EA-based algorithms, however, have dealt with clustering problems in which the number of clusters (K) is given in advance. Although some researchers have suggested EA-based algorithms for unk…
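The setting the abstract describes, an EA that must discover the number of clusters K rather than receive it, can be illustrated with a toy variable-length encoding: each individual is a list of candidate centroids, and fitness trades within-cluster scatter off against K. The encoding, the penalty weight, and the (1+1)-EA search loop below are illustrative assumptions, not the paper's actual two-leveled symbiotic scheme.

```python
import random

random.seed(7)

# toy 1-D data: three well-separated groups
data = ([random.gauss(0, 0.2) for _ in range(30)]
        + [random.gauss(5, 0.2) for _ in range(30)]
        + [random.gauss(10, 0.2) for _ in range(30)])

def fitness(centroids):
    """Within-cluster scatter plus a penalty on K (illustrative)."""
    sse = sum(min((x - c) ** 2 for c in centroids) for x in data)
    return sse + 2.0 * len(centroids)      # penalty discourages large K

def mutate(centroids):
    """Perturb a centroid, or insert/delete one (variable-length genome)."""
    cs = centroids[:]
    r = random.random()
    if r < 0.2 and len(cs) > 1:
        cs.pop(random.randrange(len(cs)))                 # delete
    elif r < 0.4:
        cs.append(random.uniform(min(data), max(data)))   # insert
    else:
        cs[random.randrange(len(cs))] += random.gauss(0, 0.5)  # perturb
    return cs

# (1+1)-EA: keep the mutant only if it is at least as fit (lower is better)
best = [random.uniform(min(data), max(data))]
for _ in range(3000):
    cand = mutate(best)
    if fitness(cand) <= fitness(best):
        best = cand

print("evolved K =", len(best))
```

Because the insert move pays a fixed penalty of 2.0, a new centroid survives only when it cuts the scatter by at least that much, which is what lets K settle near the true number of groups.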

Cited by 18 publications (4 citation statements)
References 43 publications
“…The literature [22], [23], [24], [25], [26], [27] proposes that deep belief networks can be used to reduce the dimensionality of data and to classify it, and that features can be extracted from the input progressively, from low to high levels, to improve classification accuracy. The literature [28], [29], [30] introduced a greedy unsupervised training method [31], [32], [33], [34] that divides the learning process of a DBN into two steps: unsupervised learning, which extracts input information layer by layer, and supervised learning, which fine-tunes the entire network with fixed labels, thereby solving the difficulty of learning the parameters of multiple hidden layers. Some works have investigated both deep belief network-based methods and deep learning methods based on active learning [35], [36] in greater depth and proposed a series of deep belief network-based classification methods to improve classification performance.…”
Section: Introduction (mentioning)
confidence: 99%
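The two-step DBN training procedure described in the statement above (greedy layer-by-layer unsupervised pretraining, then supervised fine-tuning) can be sketched in miniature. This is an illustrative toy, not the method of any cited paper: the RBM uses a bias-free one-step contrastive-divergence update, and the "fine-tuning" step is reduced to a logistic head on the pretrained features rather than full backpropagation through the stack.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=50, lr=0.1):
    """Unsupervised step: fit one bias-free RBM with CD-1."""
    W = 0.01 * rng.standard_normal((data.shape[1], n_hidden))
    for _ in range(epochs):
        h_prob = sigmoid(data @ W)                     # positive phase
        h_samp = (h_prob > rng.random(h_prob.shape)).astype(float)
        v_recon = sigmoid(h_samp @ W.T)                # reconstruction
        h_recon = sigmoid(v_recon @ W)                 # negative phase
        W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
    return W

def pretrain_dbn(data, layer_sizes):
    """Greedy stacking: each layer's hidden activations feed the next."""
    weights, x = [], data
    for n_hidden in layer_sizes:
        W = train_rbm(x, n_hidden)
        weights.append(W)
        x = sigmoid(x @ W)    # features extracted layer by layer
    return weights, x

# toy binary data: two noisy 6-bit prototypes, 50 samples each
proto = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], float)
X = np.repeat(proto, 50, axis=0)
X = np.abs(X - (rng.random(X.shape) < 0.05))          # flip 5% of bits
y = np.repeat([0, 1], 50)

weights, feats = pretrain_dbn(X, [4, 2])

# supervised step, reduced here to a logistic head on the features
w, b = np.zeros(feats.shape[1]), 0.0
for _ in range(200):
    p = sigmoid(feats @ w + b)
    w -= 0.5 * feats.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)
acc = np.mean((sigmoid(feats @ w + b) > 0.5) == y)
print(f"train accuracy: {acc:.2f}")
```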
“…García et al. (2006) presented a parallel scatter search meta-heuristic algorithm in which a set of solutions evolves based on mechanisms of combination between solutions. Jirapech-Umpai and Aitken (2005) applied an EA to microarray classification in order to search for the optimal or near-optimal set of predictive genes, while Shin et al. (2012) proposed a two-leveled symbiotic EA for high-dimensional clustering problems. In addition, Oh et al. (2004) proposed a hybrid GA for feature selection, which embeds an SFFS local search operation into the conventional GA, leading to significant improvement in the final performance.…”
Section: Introduction (mentioning)
confidence: 99%
“…EVALUATE new candidates; 5. SELECT individuals for the next generation; UNTIL (termination condition is satisfied) END. The limitations of pure meta-heuristic algorithms such as EA and GA have been uncovered in many applications (Oh et al., 2004; Shin et al., 2012). An effective way to overcome a drawback of conventional GA, namely slow fine-tuning near local optima with long running times, is to hybridize the GA by embedding SFFS local search operations (Oh et al., 2004).…”
Section: Introduction (mentioning)
confidence: 99%
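The evolutionary loop that the excerpt abbreviates (evaluate candidates, select survivors, repeat until a termination condition holds) can be sketched as a minimal generational GA. The one-max objective, truncation selection, and parameter values here are stand-ins, not details of the cited papers.

```python
import random

random.seed(42)

def one_max(bits):
    """Toy fitness: count of ones (a stand-in objective)."""
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60,
           p_crossover=0.9, p_mutation=0.02):
    # INITIALIZE population with random candidate solutions
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):                    # REPEAT
        # EVALUATE new candidates
        scored = sorted(pop, key=one_max, reverse=True)
        # SELECT individuals (truncation selection: keep the top half)
        parents = scored[:pop_size // 2]
        # RECOMBINE and MUTATE to build the next generation
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            if random.random() < p_crossover:
                cut = random.randrange(1, n_bits)   # one-point crossover
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            child = [bit ^ (random.random() < p_mutation)  # bit-flip
                     for bit in child]
            children.append(child)
        pop = children
        # UNTIL termination condition: optimum reached
        if one_max(max(pop, key=one_max)) == n_bits:
            break
    return max(pop, key=one_max)

best = evolve()
print("best fitness:", one_max(best))
```

Hybrid schemes of the kind the statement mentions would insert a local search step (e.g. SFFS) on selected individuals between the selection and recombination lines above.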
“…They are also used in the data clustering process, as well as for probability density estimation. The clustering procedure represents one of the most important components of any recognition task [4]-[6]. Hierarchical Clustering of Gaussian Mixture Models (HGMMC) [7], [8] is a key component of the Gaussian selection method [9]-[12], which is applied to increase the speed of Gaussian-mixture-model-based systems for recognizing speech (or speakers, or emotions), especially in applications for Continuous Speech Recognition (CSR), voice verification [3], and speaker adaptation [13].…”
Section: Chapter (unclassified)
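The Gaussian selection idea the statement above refers to, clustering a GMM's components so that only the components in the cluster nearest an observation are evaluated, can be sketched under simplifying assumptions: spherical unit-variance Gaussians with equal weights, and plain k-means over the component means. All names and the toy mixture below are illustrative, not from the cited works.

```python
import math
import random

random.seed(1)

def sqdist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=25):
    """Plain k-means over the Gaussian component means."""
    centers = random.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: sqdist(p, centers[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                centers[j] = [sum(c) / len(members) for c in zip(*members)]
    return centers, assign

# a toy GMM: 8 spherical unit-variance components, equal weights
means = [[0, 0], [0.5, 0.2], [0.3, -0.4],        # group near the origin
         [5, 5], [5.3, 4.8], [4.7, 5.2],         # group near (5, 5)
         [-5, 5], [-4.8, 5.3]]                   # group near (-5, 5)

centers, assign = kmeans(means, k=3)

def selected_log_likelihood(x):
    """Score x against only the components of the nearest cluster."""
    nonempty = [j for j in range(len(centers)) if j in assign]
    best = min(nonempty, key=lambda j: sqdist(x, centers[j]))
    shortlist = [m for m, a in zip(means, assign) if a == best]
    # log-sum-exp over the shortlist (equal weights, unit variance)
    logs = [-0.5 * sqdist(x, m) - math.log(len(means)) for m in shortlist]
    mx = max(logs)
    return mx + math.log(sum(math.exp(v - mx) for v in logs)), len(shortlist)

ll, n_eval = selected_log_likelihood([5.1, 5.0])
print(n_eval, "of", len(means), "Gaussians evaluated")
```

The speedup comes from `n_eval` being a fraction of the mixture size; the price is a small likelihood error from the components that were skipped.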