Statistical Decision Theory and Related Topics IV (1988)
DOI: 10.1007/978-1-4612-3818-8_32

Spatial Designs

Cited by 44 publications (28 citation statements)
References 8 publications
“…Direct optimization can be performed using classical optimal design tools, such as simulated annealing (Sacks and Schiller, 1988;Morris and Mitchell, 1995), genetic algorithms (Hamada et al, 2001) or subset selection (Gramacy and Lee, 2009). However, when r × d is large, finding a good trade-off between exploration and local convergence is challenging.…”
Section: Nuclear Safety Test Case (mentioning)
confidence: 99%
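
The statement above cites simulated annealing (Sacks and Schiller, 1988; Morris and Mitchell, 1995) as a classical tool for direct design optimization. Below is a minimal Python sketch of that idea, swapping candidate points under a geometric cooling schedule to improve a maximin-distance criterion; the function names, candidate-set formulation, and all parameter values are illustrative assumptions, not the cited authors' implementation.

```python
# Sketch only: simulated-annealing search over n-point subsets of a candidate
# set, scoring a design by its smallest pairwise distance (maximin criterion).
import numpy as np

def min_pairwise_dist(X):
    """Smallest Euclidean distance between any two rows of X."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return d[np.triu_indices(len(X), k=1)].min()

def anneal_design(candidates, n, n_iter=5000, t0=1.0, cooling=0.999, seed=0):
    """Anneal over n-point subsets of `candidates` to raise the maximin score."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(candidates), size=n, replace=False)
    cur = best = min_pairwise_dist(candidates[idx])
    best_idx, t = idx.copy(), t0
    for _ in range(n_iter):
        # Propose swapping one selected point for one unused candidate.
        prop = idx.copy()
        pool = np.setdiff1d(np.arange(len(candidates)), idx)
        prop[rng.integers(n)] = rng.choice(pool)
        val = min_pairwise_dist(candidates[prop])
        # Always accept improvements; accept deteriorations with Boltzmann probability.
        if val > cur or rng.random() < np.exp((val - cur) / t):
            idx, cur = prop, val
            if cur > best:
                best, best_idx = cur, idx.copy()
        t *= cooling  # geometric cooling schedule
    return candidates[best_idx], best

# Example: pick a 10-point maximin design from 200 random candidates in [0, 1]^2.
cands = np.random.default_rng(1).random((200, 2))
design, score = anneal_design(cands, n=10)
```

The cooling rate is where the trade-off mentioned in the statement shows up: a slow schedule keeps accepting deteriorating swaps for longer (exploration), while a fast one settles quickly into a local optimum.
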
“…In the case of (3) our robust optimality criterion is then analogous to the classical notion of I-optimality. In a similar vein, Sacks and Schiller (1988) considered the construction of designs, assuming that $f_0(\cdot)$ and $g_0(\cdot,\cdot)$ were correctly specified and that $E[X(t)]$ was known and 0. They used the loss function $\max_t E[(a_t^T y - X(t))^2]$, and remarked upon the lack of robustness, to changes in $g_0$, of their procedures.…”
Section: Introduction (mentioning)
confidence: 99%
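
To make the quoted loss function $\max_t E[(a_t^T y - X(t))^2]$ concrete: when $X(\cdot)$ has zero mean and a known covariance, the best linear predictor's weights $a_t$ reduce this expectation to the kriging variance at $t$, so a design is scored by its largest predictive variance over the prediction sites. The sketch below illustrates that computation; the squared-exponential covariance, the length-scale, the notation $K$ (covariance matrix of the observations) and $k(t)$ (covariances with $X(t)$), and all function names are assumptions for illustration, not taken from the cited work.

```python
# Sketch only: worst-case expected squared prediction error of a design under a
# zero-mean process with an assumed known covariance (simple-kriging setting).
import numpy as np

def sq_exp_cov(a, b, theta=0.3):
    """Assumed squared-exponential covariance; any known covariance would do."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.exp(-(d / theta) ** 2)

def max_prediction_error(design, sites, theta=0.3, nugget=1e-10):
    """max over sites t of E[(a_t^T y - X(t))^2] with BLUP weights a_t = K^{-1} k(t)."""
    K = sq_exp_cov(design, design, theta) + nugget * np.eye(len(design))
    k = sq_exp_cov(design, sites, theta)                     # n x M cross-covariances k(t)
    quad = np.einsum('ij,ij->j', k, np.linalg.solve(K, k))   # k(t)^T K^{-1} k(t) per site
    return (1.0 - quad).max()                                # k(t, t) = 1 for this covariance

# Example: a spread-out 5-point design beats a clustered one on this criterion.
sites = np.linspace(0.0, 1.0, 101)[:, None]
spread = np.linspace(0.1, 0.9, 5)[:, None]
clustered = np.linspace(0.0, 0.2, 5)[:, None]
print(max_prediction_error(spread, sites), max_prediction_error(clustered, sites))
```
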
“…Several classic optimization techniques have already been employed to solve similar problems for optimal designs, for example simulated annealing [34], genetic algorithms [22], or treed optimization [20]. In our case such global approaches lead to an m × d dimensional problem and, since we do not rely on analytical gradients, the full optimization would be very slow.…”
Section: Algorithm (mentioning)
confidence: 99%
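
The dimensionality remark in the last statement can be made concrete: optimizing the coordinates of m design points in d dimensions directly means searching over an m·d-dimensional vector. The sketch below flattens such a design and hands it to a derivative-free optimizer, reflecting the "no analytical gradients" setting; the optimizer choice, the maximin objective, and all settings are illustrative assumptions rather than the cited paper's setup.

```python
# Sketch only: direct coordinate optimization of an m x d design as one flattened vector.
import numpy as np
from scipy.optimize import minimize

m, d = 8, 3  # 8 design points in 3 dimensions -> a 24-dimensional search space

def neg_min_pairwise_dist(x):
    """Objective for the flattened design: negate the smallest pairwise distance."""
    X = x.reshape(m, d)
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return -dists[np.triu_indices(m, k=1)].min()

x0 = np.random.default_rng(2).random(m * d)        # random initial design, flattened
res = minimize(neg_min_pairwise_dist, x0,
               method='Nelder-Mead',                # derivative-free search
               bounds=[(0.0, 1.0)] * (m * d),       # keep points in the unit cube (SciPy >= 1.7)
               options={'maxiter': 20000, 'xatol': 1e-6, 'fatol': 1e-8})
design = res.x.reshape(m, d)
```

Even this small example shows why the search space grows quickly with m and d, which is the slowness the quoted authors point to.
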