2009 IEEE Congress on Evolutionary Computation
DOI: 10.1109/cec.2009.4983074

Comparing design of experiments and evolutionary approaches to multi-objective optimisation of sensornet protocols

Cited by 7 publications (5 citation statements). References 9 publications.
“…DoE has been successfully applied to control simulations of parameterized systems, for various verification purposes: screening [13], sensitivity analysis [6], robust design [1], or multi-objective optimization [12]. Previous work found DoE able to detect important effects and response extremes, for static responses in automotive applications [7], [8].…”
Section: Related Work
confidence: 99%
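The statement above lists screening among the verification uses of DoE. A minimal sketch of two-level full-factorial screening follows: every combination of factor levels is simulated, and each factor's main effect is estimated as the difference in mean response between its high and low levels. The factor names, their levels, and the simulate() stand-in are illustrative assumptions, not taken from the cited papers.

```python
import itertools

# Hypothetical sensornet protocol parameters, each screened at two
# levels (low, high); names and values are illustrative only.
factors = {
    "tx_power_dbm":      (-10, 0),
    "beacon_interval_s": (1, 10),
    "retry_limit":       (1, 5),
}

def full_factorial(factors):
    """Yield every combination of factor levels (2^k runs for k factors)."""
    names = list(factors)
    for levels in itertools.product(*(factors[n] for n in names)):
        yield dict(zip(names, levels))

def main_effect(runs, responses, factors, name):
    """Screening estimate: mean response at the high level of `name`
    minus the mean response at its low level."""
    low, high = factors[name]
    hi_vals = [r for run, r in zip(runs, responses) if run[name] == high]
    lo_vals = [r for run, r in zip(runs, responses) if run[name] == low]
    return sum(hi_vals) / len(hi_vals) - sum(lo_vals) / len(lo_vals)

def simulate(run):
    # Stand-in for a real sensornet simulation run.
    return run["tx_power_dbm"] * 0.5 + run["retry_limit"] * 2.0

runs = list(full_factorial(factors))
responses = [simulate(run) for run in runs]
for name in factors:
    print(f"{name}: main effect = {main_effect(runs, responses, factors, name):+.2f}")
```

Factors with main effects near zero would be screened out before more expensive optimisation runs.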
“…In section 3.3 we apply a similar approach with direct interpolation on the data. Alternative approaches to fill the parameter space are Factorial Designs [30] and Latin Hypercube Designs [31], which avoid redundant parameter sets and construct an RSM of similar accuracy in fewer simulation runs. In such DoEs, all parameter sets are typically defined before the first simulation is run.…”
Section: Design-of-Experiments Studies
confidence: 99%
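The quoted passage notes that Latin Hypercube Designs avoid redundant parameter sets and that every run is fixed before the first simulation. A minimal sketch of Latin Hypercube sampling follows, assuming hypothetical parameter ranges: each parameter's range is split into as many strata as there are runs, and each stratum is sampled exactly once per parameter, so no two runs share a stratum in any dimension.

```python
import random

def latin_hypercube(n_runs, bounds, seed=0):
    """Latin Hypercube design: split each parameter range into n_runs
    equal strata and sample every stratum exactly once, pairing strata
    across parameters via independent random permutations."""
    rng = random.Random(seed)
    # One shuffled stratum ordering per parameter.
    columns = {}
    for name in bounds:
        strata = list(range(n_runs))
        rng.shuffle(strata)
        columns[name] = strata
    design = []
    for i in range(n_runs):
        run = {}
        for name, (lo, hi) in bounds.items():
            stratum = columns[name][i]
            u = (stratum + rng.random()) / n_runs  # jitter within stratum
            run[name] = lo + u * (hi - lo)
        design.append(run)
    return design

# Illustrative parameter ranges (assumed, not from the cited paper).
bounds = {"tx_power_dbm": (-10.0, 0.0), "beacon_interval_s": (1.0, 10.0)}
# The whole design exists before any simulation is run, as the quote notes.
design = latin_hypercube(n_runs=8, bounds=bounds)
for run in design:
    print(run)
```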
“…reduce the set of factors to the decisive ones (Kleijnen, 2008; Trocine, Malone, 2001); sensitivity analysis, to offer quantitative, predictive understanding of the impact of factors on system performance (Nookala, Ying Chen, Sapatnekar, 2005; Srinivasaiah, Bhat, 2004); to find optimum implementations (Sheldon, Vahid, Lonardi, 2007; Tate et al., 2009); for robust design, to choose factors that ensure minimum response variability (Ayeb, Theuerkauf, Winsel, 2006). Previous work evaluates experimental designs to cover the verification space better or to reduce the number of simulation points while keeping reasonable prediction models (Sanchez, 2007).…”
Section: Introduction
confidence: 99%
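The last statement mentions reducing simulation points while keeping reasonable prediction models. A minimal response-surface sketch follows: a first-order polynomial model y ≈ b0 + b1·x1 + b2·x2 is fitted by least squares to responses at a handful of design points and then used as a cheap predictor. The design points and the simulate() stand-in are assumptions for illustration.

```python
import numpy as np

# Corner points of a two-factor design plus a centre point; values are
# illustrative, not from the cited papers.
design = np.array([
    [-10.0,  1.0],
    [-10.0, 10.0],
    [  0.0,  1.0],
    [  0.0, 10.0],
    [ -5.0,  5.5],   # centre point, useful for checking model adequacy
])

def simulate(x):
    # Stand-in for an expensive simulation run, with a little noise.
    return 3.0 + 0.4 * x[0] - 0.8 * x[1] + np.random.normal(0, 0.05)

y = np.array([simulate(x) for x in design])
X = np.hstack([np.ones((len(design), 1)), design])  # [1, x1, x2] regressors
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", coef)                  # roughly [3.0, 0.4, -0.8]
print("prediction at (-2, 3):", coef @ np.array([1.0, -2.0, 3.0]))
```

Once fitted, the model replaces further simulation runs over the covered region, which is the trade-off the quoted work evaluates.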