2016
DOI: 10.1007/s00158-016-1441-2
Investigation on parallel algorithms in efficient global optimization based on multiple points infill criterion and domain decomposition

Cited by 31 publications
(14 citation statements)
References 30 publications
“…The LS-SVM binary classifier is finally extended to a probabilistic classification, relating h in (17) to the probability of the class C + , denoted P(C + |x). A comparison of several probability models for the LS-SVM classification is provided in [36].…”
Section: Classification Methods
confidence: 99%
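The excerpt above describes relating the LS-SVM decision value h to a class probability P(C+|x). One common way to do this, sketched here under the assumption of a Platt-style sigmoid map (the coefficients A and B are hypothetical, assumed already fitted by maximum likelihood):

```python
import math

def class_probability(h, A=-1.0, B=0.0):
    """Sigmoid map from an LS-SVM decision value h to P(C+|x).

    A and B are hypothetical coefficients, assumed already fitted
    (e.g. by maximum likelihood, as in Platt scaling). With A < 0,
    larger h gives a probability closer to 1.
    """
    return 1.0 / (1.0 + math.exp(A * h + B))
```

With A = -1 and B = 0, a decision value of 0 maps to probability 0.5, and confident decision values saturate toward 0 or 1.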
“…To do this, a first optimization problem is solved for the parameters (γ,λ ) (line 3). Then, the system (16) is solved, at line 4, to determine (α n ,b) in (17). Finally, the coefficients A and B in (20) are computed by Newton's method [38].…”
Section: EGO-LS-SVM Method, 2.3.1 Extended Merit Functions
confidence: 99%
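The system (16) and expansion (17) referred to in the excerpt are not reproduced here, but the standard LS-SVM training step solves a linear KKT system for the dual coefficients alpha and the bias b. A minimal sketch, assuming the textbook form of that system (the paper's equation (16) may differ in detail):

```python
import numpy as np

def solve_lssvm_system(K, y, gamma):
    """Solve the standard LS-SVM KKT linear system for (alpha, b):

        [ 0    1^T     ] [ b     ]   [ 0 ]
        [ 1    K + I/g ] [ alpha ] = [ y ]

    K is the n x n kernel matrix, y the +/-1 labels, and gamma (g)
    the regularization parameter. A sketch of the common formulation,
    not necessarily the exact system solved in the cited method.
    """
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(M, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]  # alpha, b
```

The resulting decision function is h(x) = sum_i alpha_i k(x, x_i) + b, whose value can then be mapped to a class probability as described above.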
“…They compared several approaches, and concluded that a multi-objective treatment of the objective and constraints (also known as the filter method) works best. Zhu et al (2015) extended Kriging Believer to robust design with expensive constraints. Li et al (2016) proposed a method of adding multiple points by combining maximization of EI and minimization of mutual information between the points to be added.…”
Section: EGO - Expected Improvement
confidence: 99%
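The multi-point infill criteria discussed in the excerpt build on the single-point Expected Improvement used in EGO. For reference, a sketch of the standard EI formula for minimization, given a surrogate (e.g. Kriging) prediction with mean mu and standard deviation sigma at a candidate point:

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Standard Expected Improvement for minimization.

    mu, sigma: surrogate mean and standard deviation at the candidate
    point; f_best: best objective value observed so far. This is the
    textbook single-point EI, not the multi-point criterion of the
    cited paper.
    """
    if sigma <= 0.0:
        # No predictive uncertainty: improvement is deterministic.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    return (f_best - mu) * cdf + sigma * pdf
```

Multi-point extensions such as Kriging Believer or the EI/mutual-information combination mentioned above select several maximizers of criteria derived from this quantity per iteration.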
“…A weak point of multiple-point EI sampling is that it requires more experiments, or more expensive data, than single-point sampling [17], because less data is available per added point for the surrogate model to predict the function. The key to increasing the efficiency of multiple additional sampling points in EGO is therefore to increase the accuracy of the surrogate model.…”
Section: Introduction
confidence: 99%