2010
DOI: 10.1007/s10182-010-0145-y
Optimal Latin hypercube designs for the Kullback–Leibler criterion

Keywords: Computer experiments, Space-filling designs, Optimal Latin hypercube designs, Kullback–Leibler information

Cited by 39 publications (13 citation statements)
References 18 publications (19 reference statements)

“…The adaptive construction of the neural network performs well, but it would be interesting to start with a space-filling design, such as a Wootton-Sergent-Phan-Tan-Luu design (Sergent 1989), a maximin Latin hypercube sample (Johnson et al. 1990), or a design that is optimal for the promising Kullback–Leibler information criterion (Jourdan and Franco 2010), instead of the low-discrepancy sequence terms. Moreover, for our chosen scenario, there are only 28 uncertain input variables, and among them ten have a non-negligible impact on the IRS variability; other configurations, however, involve about 60 uncertain factors, possibly with up to 30 influential ones.…”
Section: Adaptive Metamodel
Citation type: mentioning
confidence: 99%
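As a concrete illustration of the maximin Latin hypercube sampling mentioned in this statement, the Python sketch below builds random Latin hypercube designs on the unit cube and keeps the one whose smallest pairwise distance is largest. It is a minimal random-search stand-in, not the algorithm of Johnson et al. (1990) or of the citing paper; the function names are illustrative.

import numpy as np
from scipy.spatial.distance import pdist

def random_lhs(n, d, rng):
    # One random Latin hypercube design on [0, 1]^d: each column pairs a random
    # permutation of the n strata with a jitter inside each stratum.
    perms = np.array([rng.permutation(n) for _ in range(d)]).T   # shape (n, d)
    return (perms + rng.random((n, d))) / n

def maximin_lhs(n, d, n_candidates=500, seed=0):
    # Random-search approximation of the maximin criterion: among random LHDs,
    # keep the design whose minimum inter-point distance is largest.
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_candidates):
        x = random_lhs(n, d, rng)
        score = pdist(x).min()
        if score > best_score:
            best, best_score = x, score
    return best, best_score

design, min_dist = maximin_lhs(n=30, d=10)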
“…[17] Two kinds of discrepancy are selected, the L2-discrepancy (DL2) and the centred L2-discrepancy (DC2). Jourdan and Franco [23] introduced the KL criterion to measure the difference between the density function of the design points and the uniform density function. This criterion (to be maximized) is given by a nearest-neighbour estimate of the entropy of the design points. The connections between these criteria are studied in [24].…”
Section: Appendix: Space-filling Criteria
Citation type: mentioning
confidence: 99%
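The estimator itself is not reproduced in the statement above; assuming the nearest-neighbour entropy estimate that Jourdan and Franco (2010) build on, the maximized quantity takes a form along the lines of

\[
\hat{H} \;=\; \frac{d}{n}\sum_{i=1}^{n}\log\rho_{i} \;+\; \log\!\bigl(V_{d}\,(n-1)\bigr) \;+\; \gamma,
\]

where \(\rho_{i}\) is the distance from design point \(x_{i}\) to its nearest neighbour, \(V_{d}\) is the volume of the unit ball in dimension \(d\), and \(\gamma\) is Euler's constant. Maximizing \(\hat{H}\) amounts to minimizing the estimated Kullback–Leibler divergence between the empirical distribution of the design points and the uniform density on \([0,1]^{d}\); the exact constants should be checked against [23].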
“…Petelet et al. (2010) propose an algorithm that constructs LHDs that are able to uphold certain inequality constraints between the sample variables. Jourdan and Franco (2010) build a new class of LHDs, using the Kullback–Leibler information as a criterion to ensure that the design gives effective coverage of the factor space.…”
Section: Latin Hypercube Designs
Citation type: mentioning
confidence: 99%
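A rough sketch of the idea behind the KL-criterion designs described in this statement: score candidate Latin hypercube designs by a nearest-neighbour (Kozachenko–Leonenko-type) entropy estimate and keep the best one. This random-search selection is only a stand-in under that assumption, not Jourdan and Franco's actual construction algorithm.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def nn_entropy(x):
    # Nearest-neighbour entropy estimate of the design x in [0, 1]^d; larger values
    # mean the empirical distribution is closer to the uniform density.
    n, d = x.shape
    dist, _ = cKDTree(x).query(x, k=2)   # column 0 is the zero distance to the point itself
    rho = dist[:, 1]                     # distance to the nearest other design point
    log_unit_ball = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))
    return d * np.mean(np.log(rho)) + log_unit_ball + np.log(n - 1) + EULER_GAMMA

rng = np.random.default_rng(1)

def random_lhs(n, d):
    # Random Latin hypercube design: one point per stratum in every dimension.
    perms = np.array([rng.permutation(n) for _ in range(d)]).T
    return (perms + rng.random((n, d))) / n

# Keep, among random LHDs, the design with the largest estimated entropy, i.e. the
# smallest estimated Kullback-Leibler divergence from the uniform density.
best = max((random_lhs(20, 5) for _ in range(200)), key=nn_entropy)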