2017
DOI: 10.1007/978-3-319-55849-3_41
Large Scale Problems in Practice: The Effect of Dimensionality on the Interaction Among Variables

Abstract: This article studies the correlation between pairs of variables as a function of problem dimensionality. Two tests, based on the Pearson and Spearman coefficients, have been designed and used in this work. In total, 86 test problems ranging between 10 and 1000 variables have been studied. Under the most commonly used experimental conditions, the correlation between pairs of variables appears, from the perspective of the search algorithm, to consistently decrease. This effect is not due …
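The two coefficients at the core of the study can be computed directly from sampled variable values. The following is a minimal sketch, not the paper's actual test protocol: function names are illustrative, and the rank computation assumes no tied values.

```python
import numpy as np

def pearson(x, y):
    """Pearson coefficient: linear association between two variables."""
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    """Spearman coefficient: Pearson applied to ranks (monotone association).
    Simplified rank via double argsort; assumes no ties."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

def mean_pairwise_correlation(samples):
    """Average absolute Pearson and Spearman coefficients over all
    variable pairs of a (points x dimensions) sample matrix."""
    n_dims = samples.shape[1]
    p_vals, s_vals = [], []
    for i in range(n_dims):
        for j in range(i + 1, n_dims):
            p_vals.append(abs(pearson(samples[:, i], samples[:, j])))
            s_vals.append(abs(spearman(samples[:, i], samples[:, j])))
    return float(np.mean(p_vals)), float(np.mean(s_vals))
```

Averaging the absolute pairwise coefficients, as sketched in `mean_pairwise_correlation`, is one simple way to summarise inter-variable interaction for a whole problem instance.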

Cited by 22 publications (16 citation statements). References 48 publications.
“…Techniques that perturb the variables separately, like the one used in this article, are known to be effective for large scale problems, see [24,17,15]. This observation was reported in the experimental study in [2]. Large scale problems are by no means easier than low-dimensional problems.…”
Section: Motivation of the Proposed Design (mentioning)
confidence: 72%
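A "technique that perturbs the variables separately" can be illustrated with a toy greedy coordinate search: one variable is moved at a time, and the step is shrunk when no single-coordinate move improves the objective. This is only an illustrative stand-in for the separable operators discussed in the cited works, not any of those algorithms.

```python
def coordinate_search(f, x, step=0.5, iters=50):
    """Toy greedy minimiser that perturbs one variable at a time
    (separable moves), halving the step when a full sweep over all
    coordinates yields no improvement."""
    x = list(x)
    best = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)        # perturb only coordinate i
                trial[i] += delta
                val = f(trial)
                if val < best:
                    x, best, improved = trial, val, True
        if not improved:
            step *= 0.5                # refine resolution
    return x, best
```

Even on a non-separable objective such as a rotated quadratic, these single-coordinate moves make steady progress, which matches the observation quoted above.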
“…Under these experimental conditions, the algorithm "sees" the problem as separable: the average Pearson and Spearman coefficients of the variables approach zero, independently of the problem, as the dimensionality grows, see [2].…”
Section: Motivation of the Proposed Design (mentioning)
confidence: 99%
“…When dealing with some particular optimization problems, the gradient information and Hessian matrix are requirements for a deterministic optimization method to converge to the optimal solution [47]. Furthermore, in some scenarios [48]- [51], we find that the use of deterministic algorithms does not require gradient information. As for swarm intelligence algorithms, it is hard to obtain an accurate Hessian matrix of the objective function [52].…”
Section: B. OBL and Covariance (mentioning)
confidence: 88%
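As context for this remark: when the objective is available only as a black box, the gradient and Hessian required by such deterministic methods can be estimated by finite differences, at the cost of many extra function evaluations. A minimal central-difference sketch (names and step sizes are illustrative, not taken from the cited works):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-6):
    """Central-difference estimate of the gradient of a scalar function f
    at point x; costs 2n evaluations for n variables."""
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def numerical_hessian(f, x, h=1e-4):
    """Central-difference estimate of the Hessian of f at x; costs
    4*n^2 evaluations, which becomes prohibitive in high dimensions."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n)
            ei[i] = h
            ej = np.zeros(n)
            ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H
```

The quadratic growth of the Hessian's evaluation cost with dimensionality is one reason derivative-free and swarm methods avoid it in large scale settings.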
“…Quite surprisingly though, recent evidence (Caraffini et al 2012) has shown that even non-separable functions can be handled by perturbing each variable separately, one at a time: while this approach does not necessarily lead to the detection of the optimum, it may still be able to detect promising areas of the decision space. In the same context, another study has shown that the correlation between pairs of variables appears to consistently decrease when the problem dimensionality increases (Caraffini et al 2017). In other words, non-separable problems in high dimensionalities can be tackled as if they were separable.…”
Section: Introduction (mentioning)
confidence: 94%
“…Based on the aforementioned evidence on the effect of separable operators on non-separable problems (Caraffini et al 2012, 2017), and being motivated by the results concerning the non-uniform mutation (Tang and Tseng 2013) in Genetic Algorithms, in the present work we aim to address the following research questions:…”
Section: Introduction (mentioning)
confidence: 99%