2016
DOI: 10.1016/j.ress.2016.07.012
Global sensitivity analysis using low-rank tensor approximations

Abstract: In the context of global sensitivity analysis, the Sobol' indices constitute a powerful tool for assessing the relative significance of the uncertain input parameters of a model. We herein introduce a novel approach for evaluating these indices at low computational cost, by postprocessing the coefficients of polynomial meta-models belonging to the class of low-rank tensor approximations. Meta-models of this class can be particularly efficient in representing responses of high-dimensional models, because the nu…
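The "postprocessing" step described in the abstract can be sketched for the closely related case of an orthonormal polynomial chaos expansion (PCE), where first-order Sobol' indices follow directly from the squared expansion coefficients. The multi-indices and coefficients below are illustrative placeholders, not values from the paper:

```python
# Minimal sketch: first-order Sobol' indices from the coefficients of an
# orthonormal PCE. Each multi-index records the polynomial degree per input;
# by orthonormality, the variance is the sum of squared non-constant coefficients.
def sobol_from_pce(multi_indices, coeffs):
    # total variance: all terms with at least one nonzero degree
    var = sum(a * a for idx, a in zip(multi_indices, coeffs) if any(idx))
    d = len(multi_indices[0])
    s = []
    for i in range(d):
        # partial variance V_i: terms depending on input i only
        vi = sum(a * a for idx, a in zip(multi_indices, coeffs)
                 if idx[i] > 0 and all(idx[j] == 0 for j in range(d) if j != i))
        s.append(vi / var)
    return s
```

The same coefficient-based shortcut is what makes surrogate-based sensitivity analysis cheap: no extra model runs are needed once the meta-model is built.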

Cited by 73 publications (55 citation statements)
References 73 publications
“…For instance, in sampling-based techniques, the number of simulation samples may increase exponentially as d increases. Some tensor solvers have been developed to address this fundamental challenge:
[38], [39] high-dimensional stochastic collocation: tensor completion to estimate unknown simulation data
[37] hierarchical uncertainty quantification: tensor-train decomposition for high-dimensional integration
[33] uncertainty analysis with non-Gaussian correlated uncertainty: functional tensor train to compute basis functions
[40] spatial variation pattern prediction: statistical tensor completion to predict variation pattern
The resulting generalized polynomial chaos expansion is sparse. This technique has been successfully applied to electronic ICs, photonics and MEMS with up to 57 random parameters.…”
Section: B. Tensor Methods For Uncertainty Propagation
confidence: 99%
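The curse of dimensionality this excerpt refers to can be made concrete with a back-of-the-envelope storage comparison between a full tensor-product grid and a tensor-train (TT) representation. The grid size n = 10 and rank r = 5 are illustrative assumptions; d = 57 echoes the parameter count mentioned in the quote:

```python
# Illustrative arithmetic (not from the cited works): storage cost of a full
# tensor-product grid versus a rank-r tensor-train decomposition.
def full_grid_size(n, d):
    # one entry per point of the d-dimensional tensor-product grid
    return n ** d

def tensor_train_size(n, d, r):
    # boundary cores are n x r; the (d - 2) interior cores are r x n x r
    return 2 * n * r + (d - 2) * n * r * r

full = full_grid_size(10, 57)       # 10**57 entries: infeasible to store
tt = tensor_train_size(10, 57, 5)   # tens of thousands of entries
```

The linear-in-d scaling of the TT format is exactly why low-rank tensor methods remain tractable where tensor-product grids explode.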
“…, c_M}^T are non-negative constants. In this application, we chose M = 20 and the constants c given by Konakli and Sudret (2016a); Kersaudy et al. (2015): 2, 5, 10, 20, 50, 100, 500, 500, …”
Section: Sobol' Function
confidence: 99%
“…A possible redemption is provided by PLS-PCE, which couples a nonlinear partial least squares-based (PLS) order-reduction method with a PCE surrogate. For all of the above approaches, the sensitivity indices can be obtained immediately by postprocessing the model coefficients (see Sudret for classical and sparse PCEs, for LRAs, and Ehre et al. for PLS-PCEs).…”
Section: Surrogate Models For Uncertain Data
confidence: 99%