2020 IEEE 29th Conference on Electrical Performance of Electronic Packaging and Systems (EPEPS)
DOI: 10.1109/epeps48591.2020.9231388

High-Dimensional Uncertainty Quantification via Active and Rank-Adaptive Tensor Regression

Abstract: Fabrication process variations can significantly influence the performance and yield of nano-scale electronic and photonic circuits. Stochastic spectral methods have achieved great success in quantifying the impact of process variations, but they suffer from the curse of dimensionality. Recently, low-rank tensor methods have been developed to mitigate this issue, but two fundamental challenges remain open: how to automatically determine the tensor rank and how to adaptively pick the informative simulation samp…
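The curse of dimensionality that the abstract refers to can be made concrete with a small sketch. This is not the paper's algorithm (which is an active, rank-adaptive tensor regression); it only illustrates, under assumed dimensions `d`, basis size `n`, and rank `R`, why a rank-R CP (CANDECOMP/PARAFAC) representation keeps the parameter count linear in the number of variables instead of exponential:

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration:
# d = 20 process-variation parameters, n = 4 basis functions per
# parameter, CP rank R = 3.
d, n, R = 20, 4, 3
rng = np.random.default_rng(0)

# One n-by-R factor matrix per dimension; together these replace a
# full coefficient tensor with n**d entries.
factors = [rng.standard_normal((n, R)) for _ in range(d)]

def cp_eval(factors, idx):
    """Evaluate the CP tensor at multi-index idx without ever forming
    the full n**d coefficient tensor."""
    prod = np.ones(factors[0].shape[1])   # length-R accumulator
    for U, i in zip(factors, idx):
        prod *= U[i]                      # pick row i, multiply rank-wise
    return prod.sum()                     # sum over rank components

full_params = n ** d   # entries in the full tensor: 4**20, ~1.1e12
cp_params = d * n * R  # parameters in the CP format: 240
value = cp_eval(factors, [0] * d)
print(f"CP parameters: {cp_params} vs full tensor entries: {full_params}")
```

The gap between 240 stored parameters and roughly 10^12 full-tensor entries is the motivation for the tensor-regression approach; the paper's contribution is choosing the rank and the simulation samples automatically, which this sketch does not attempt.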

Cited by 8 publications (6 citation statements)
References 45 publications (63 reference statements)
“…47 Dimension-reducing surrogate methods leverage ideas from Gaussian process (GP) regression, 48,49 active subspaces, [50][51][52] kernel-PCA, [53][54][55] polynomial chaos expansions (PCEs), 56 and other numerical schemes based on low-rank tensor decomposition. [57][58][59][60][61] Deep learning motivates another class of methods in dimensionality reduction. Hinton and Salakhutdinov proposed autoencoders to learn a latent representation of training examples in a data-driven way.…”
Section: High-Dimensional Uncertainty Quantification
confidence: 99%
“…A framework based on kernel-PCA in conjunction with Kriging or PCE is proposed by Lataniotis et al [73]. Low-rank tensor-based schemes have also been proposed to address the issue of high-dimensional input [74,75].…”
Section: Dimension Reduction for High-Dimensional UQ
confidence: 99%
“…46 Dimension-reducing surrogate methods leverage ideas from Gaussian process (GP) regression 47,48 ; active subspaces 49,50,51 ; kernel-PCA 52,53,54 ; polynomial chaos expansions (PCEs) 55 ; and other numerical schemes based on low-rank tensor decomposition. 56,57,58,59,60 Deep learning motivates another class of methods in dimensionality reduction. Hinton and Salakhutdinov proposed autoencoders to learn a latent representation of training examples in a data-driven way.…”
Section: High-Dimensional Uncertainty Quantification
confidence: 99%