2016 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec.2016.7744260

Large scale continuous EDA using mutual information


Cited by 8 publications (5 citation statements) · References 12 publications
“…It concurrently evolves all the variable subsets with different probabilistic models and shows superior performance over traditional EDAs and some other efficient algorithms on a set of benchmark functions. Xu et al. [44] further improved EDA-MCC by replacing the linear correlation coefficient with mutual information, so that nonlinear dependencies among variables can be detected. Yang et al. [18] proposed a self-evaluation evolution (SEE) algorithm by combining the DC strategy with the surrogate model technique.…”
Section: Literature Review, 2.1 DC-based Methods
confidence: 99%
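The improvement credited to [44] — replacing the linear correlation coefficient with mutual information so that nonlinear dependencies are detected — can be illustrated with a small self-contained sketch. The histogram-based MI estimator below is purely illustrative (it is not the estimator used in the cited paper), as are the sample sizes and bin count:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x**2 + 0.01 * rng.normal(size=10_000)  # strong but nonlinear dependence

# Pearson correlation is near zero for this even-symmetric relationship,
# so a correlation-threshold scheme would wrongly treat x and y as independent.
r = np.corrcoef(x, y)[0, 1]

def mutual_information(a, b, bins=20):
    """Crude histogram-based MI estimate in nats (illustrative only)."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b, shape (1, bins)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

mi = mutual_information(x, y)
print(f"|Pearson r| = {abs(r):.3f}, MI estimate = {mi:.3f} nats")
```

Mutual information comes out clearly positive while |r| stays near zero, which is exactly the failure mode of linear correlation that motivates the substitution.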
“…In this subsection, the performance of EDC was assessed on 14 test functions from the CEC'2005 test suite. As EDC employs GSM-GEDA as the optimizer, three EDA-based algorithms for large-scale optimization, namely EDA-MCC [43], EDA-MCC-MI [44], and RP-EDA [49], were selected for comparison. The canonical GSM-GEDA was also included in this experiment to serve as a baseline.…”
Section: Experiments on the CEC'2005 Test Suite
confidence: 99%
“…Dong et al. [78] propose an EDA with model complexity control: a threshold applied to the global Pearson correlation matrix identifies weakly correlated dimensions, which are modeled with univariate distributions, while the remaining strongly correlated variables are partitioned into a set of lower-dimensional spaces, each modeled with a multivariate distribution. To alleviate the deficiencies of Pearson correlation with non-Gaussian samples, Xu et al. [79] proposed using mutual information instead to detect variable dependence. A major downside of mutual information, however, is its high computational cost [22].…”
Section: Estimation of Distribution Algorithms
confidence: 99%
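The model-complexity-control scheme described in the statement above (threshold the dependence matrix, model weakly dependent variables univariately, group the rest into small multivariate blocks) can be sketched as follows. The threshold `theta`, `group_size`, and the synthetic population are illustrative assumptions, not the settings used in [78] or [79]:

```python
import numpy as np

def partition_variables(samples, theta=0.3, group_size=3, seed=0):
    """Sketch of EDA-MCC-style model complexity control.

    Dimensions whose |correlation| with every other dimension is below
    theta are modeled univariately; the remaining dimensions are randomly
    partitioned into groups of at most group_size, each of which would be
    modeled with a multivariate distribution.  theta and group_size are
    illustrative defaults, not the paper's settings.
    """
    corr = np.abs(np.corrcoef(samples, rowvar=False))
    np.fill_diagonal(corr, 0.0)
    n = corr.shape[0]
    weak = [i for i in range(n) if corr[i].max() < theta]
    strong = [i for i in range(n) if corr[i].max() >= theta]
    perm = np.random.default_rng(seed).permutation(strong)
    groups = [sorted(int(v) for v in perm[i:i + group_size])
              for i in range(0, len(perm), group_size)]
    return weak, groups

# Synthetic population: dims 0 and 1 strongly coupled, dim 2 independent.
rng = np.random.default_rng(1)
z = rng.normal(size=500)
pop = np.column_stack([z,
                       z + 0.1 * rng.normal(size=500),
                       rng.normal(size=500)])
weak, groups = partition_variables(pop)
print(weak, groups)  # dim 2 is treated univariately; dims 0 and 1 are grouped
```

Swapping the `np.corrcoef` call for a pairwise mutual-information matrix is the modification attributed to [79]; the partitioning logic itself is unchanged.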
“…Along this research direction, some researchers suggested treating all the variables as random ones and dividing them according to the correlation coefficient between each pair [25], [26]. Xu et al. [27] showed that the commonly used Pearson correlation coefficient cannot properly capture nonlinear variable interdependencies, and replaced it with mutual information. Recently, Yang et al. [28] proposed a new decomposition algorithm named affinity propagation assisted and evolution consistency based decomposition (APEC).…”
confidence: 99%