2020 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec48606.2020.9185548
Covariance Local Search for Memetic Frameworks: A Fitness Landscape Analysis Approach

Abstract: The design of each agent composing a Memetic Algorithm (MA) is a delicate task which often requires prior knowledge of the problem to be effective. This paper proposes a method to analyse one feature of the fitness landscape, that is the epistasis, with the aim of designing efficient local search algorithms for Memetic Frameworks. The proposed Analysis of Epistasis performs a sampling of points within the basin of attraction and builds a data set containing those candidate solutions whose objective function va…

Cited by 3 publications (7 citation statements)
References 41 publications
“…When the optimum of a multivariate function is searched, a set of candidate solutions can be interpreted as a multivariate distribution, see [4,8]. If only those points whose objective function value is below a threshold (in a minimisation problem) are saved in the data set, then this distribution describes the geometry of the optimisation problem, see [31]. As in the case of PCA, the diagonalisation of the associated covariance matrix, that is, the detection of its eigenvectors, provides the optimisation algorithm with a set of preferential search directions to perform the search for the optimum, see [30].…”
Section: How To Keep the Full Computer Science Cohort Engaged And Int…
confidence: 99%
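The statement above can be illustrated with a minimal NumPy sketch: given a set of below-threshold candidate solutions, diagonalising their covariance matrix yields eigenvectors that can serve as preferential search directions, exactly as in PCA. The function name and interface are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def preferential_directions(samples):
    """Diagonalise the covariance matrix of a set of candidate solutions.

    Returns the eigenvalues and eigenvectors of the sample covariance
    matrix; the eigenvectors (columns of the returned matrix) act as
    preferential search directions, as in Principal Component Analysis.
    """
    C = np.cov(np.asarray(samples), rowvar=False)   # sample covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(C)   # symmetric matrix -> eigh
    return eigenvalues, eigenvectors
```

The eigenvectors returned by `eigh` are orthonormal, so together they form an alternative basis for the search space, oriented along the geometry of the sampled distribution.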
“…In [25], [28], this logic has been coupled with an FLA. After setting a problem-specific threshold thre, several points are sampled within D and their objective function values are calculated. Those points whose objective function value satisfies f(x) < thre are stored in a data structure V:…”
Section: Background: Covariance Pattern Search
confidence: 99%
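The sampling step described above can be sketched as follows. This is a hedged illustration only: the function name, the representation of the domain D as box bounds, and the parameters `thre` and `n_samples` are assumptions for the example, not details taken from the cited papers.

```python
import numpy as np

def sample_below_threshold(f, bounds, thre, n_samples=1000, rng=None):
    """Sample points uniformly within the domain D (given as box bounds)
    and store in V only those whose objective value is below thre."""
    rng = np.random.default_rng(rng)
    lower, upper = np.asarray(bounds[0]), np.asarray(bounds[1])
    V = []
    for _ in range(n_samples):
        x = rng.uniform(lower, upper)   # candidate solution sampled in D
        if f(x) < thre:                 # keep only below-threshold points
            V.append(x)
    return np.array(V)
```

The resulting array V is the data structure from which the covariance matrix is later computed.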
“…This concept is broadly used in other contexts, especially in Data Science, and is closely related to Principal Component Analysis [33]. Furthermore, it was shown in [28] that, if the sampling of points in V describes the geometry of the basins of attraction, the directions of the eigenvectors identify the maximum and minimum directional derivatives. Numerical results in [25] and [28] show that CPS consistently outperforms the PS that uses the standard basis B_e.…”
Section: Algorithm 1 Covariance Pattern Search
confidence: 99%
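To make the comparison with the standard basis B_e concrete, here is a minimal pattern-search sketch that polls along the supplied eigenvector directions rather than the coordinate axes, halving the step on an unsuccessful poll. The signature, step-control parameters, and stopping rule are illustrative assumptions, not the CPS algorithm as published.

```python
import numpy as np

def covariance_pattern_search(f, x0, directions, step=1.0, tol=1e-8, max_iter=200):
    """Minimal pattern-search sketch (illustrative, not the published CPS).

    Polls +/- each supplied direction (columns of `directions`, e.g. the
    covariance eigenvectors) instead of the standard basis B_e, and
    contracts the step size whenever no trial point improves.
    """
    x, fx = np.asarray(x0, dtype=float), f(x0)
    for _ in range(max_iter):
        improved = False
        for d in directions.T:                     # iterate over direction vectors
            for trial in (x + step * d, x - step * d):
                ft = f(trial)
                if ft < fx:                        # accept any improving trial
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                            # contract on unsuccessful poll
            if step < tol:
                break
    return x, fx
```

Passing `directions=np.eye(n)` recovers a plain pattern search over B_e, which is the baseline the quoted statement says CPS outperforms.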
“…This concept is broadly used in other contexts, especially in Data Science, and is closely related to Principal Component Analysis [9]. Furthermore, it was shown in [15] that, if the sampling of points in V describes the geometry of the basins of attraction, the directions of the eigenvectors identify the maximum and minimum directional derivatives. Numerical results in [14] and [15] show that CPS consistently outperforms the standard pattern search.…”
confidence: 99%