2021
DOI: 10.48550/arxiv.2107.03066
Preprint

Probabilistic partition of unity networks: clustering based deep approximation

Nat Trask,
Mamikon Gulian,
Andy Huang
et al.

Abstract: Partition of unity networks (POU-Nets) have been shown capable of realizing algebraic convergence rates for regression and solution of PDEs, but require empirical tuning of training parameters. We enrich POU-Nets with a Gaussian noise model to obtain a probabilistic generalization amenable to gradient-based minimization of a maximum likelihood loss. The resulting architecture provides spatial representations of both noiseless and noisy data as Gaussian mixtures with closed form expressions for variance which p…
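As a concrete reading of the abstract, here is a minimal sketch, assuming a softmax-normalized MLP for the partition of unity, one affine expert and one learnable noise variance per partition, and gradient-based minimization of the Gaussian-mixture negative log-likelihood. The class name PPOUNet, the layer sizes, and the training data are illustrative assumptions, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn

class PPOUNet(nn.Module):
    """Hypothetical sketch: softmax partition of unity with one affine
    expert and one Gaussian noise variance per partition."""
    def __init__(self, dim=1, n_parts=8):
        super().__init__()
        # MLP whose softmax output phi(x) forms a partition of unity.
        self.pou = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(),
                                 nn.Linear(64, n_parts))
        self.experts = nn.Linear(dim, n_parts)              # output k: affine fit on partition k
        self.log_var = nn.Parameter(torch.zeros(n_parts))   # per-partition noise level

    def forward(self, x):
        log_phi = torch.log_softmax(self.pou(x), dim=-1)    # log of partition functions
        return log_phi, self.experts(x), self.log_var

def nll(log_phi, means, log_var, y):
    # Mixture likelihood: y | x ~ sum_k phi_k(x) N(mean_k(x), sigma_k^2).
    log_comp = -0.5 * ((y - means) ** 2 * torch.exp(-log_var)
                       + log_var + math.log(2 * math.pi))
    return -torch.logsumexp(log_phi + log_comp, dim=-1).mean()

# Usage: fit noisy 1-D data by maximum likelihood with a generic optimizer.
x = torch.linspace(0, 1, 200).unsqueeze(-1)
y = torch.sin(4 * x) + 0.05 * torch.randn_like(x)
model = PPOUNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    nll(*model(x), y).backward()
    opt.step()
```

In this sketch the closed-form predictive variance the abstract alludes to follows from standard mixture identities: Var[y|x] = Σ_k φ_k(σ_k² + μ_k²) − (Σ_k φ_k μ_k)².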

Cited by 5 publications (17 citation statements: 1 supporting, 16 mentioning, 0 contrasting)
References 30 publications
“…However, several approaches can be used to bridge this gap, chief among which is a local spline-based interpolation method. In a somewhat related spirit, one can also make use of the recent partition of unity networks proposed in [23,37] that solve the regression problem on complex geometries by computing a partition of unity coupled with high-order polynomial expansions supported on the different partitions. Yet another approach is to make use of an extension algorithm to extend the eigenfunctions to larger, more regular domains, where Legendre expansions are available.…”
Section: Construction Of Custom-made Basis Functions Using Operator N… (mentioning)
confidence: 99%
“…An exception to the use of multiple ANNs is the method presented in Kharazmi et al (2021), which approximates the PDE with a single ANN but makes use of a separate set of test functions per subdomain. An hp-element-like approximation method based on partition of unity networks is suggested in Lee et al (2020) and Trask et al (2021); however, this method concerns a purely data-driven, supervised learning context, where ground-truth values are assumed to exist for the function to be approximated or, equivalently, for the solution of the PDE. Thus, the variational neural solver proposed in this paper is distinctly different from the ones presented in the aforementioned works.…”
Section: Introduction (mentioning)
confidence: 99%
“…[12] Moreover, although uncertainty quantification is required in many applications, it may not be naturally produced by the approximation scheme itself [1]. Instead, this work explores a general framework based on combining DNNs with polynomial approximation schemes that provides confidence regions in addition to point estimates.…”
Section: Introduction (mentioning)
confidence: 99%
“…[2] Inspired by the deterministic model [2] and multigrid methods [24], Trask et al. showed that a natural extension of POU-Net, the probabilistic partition of unity network (PPOU-Net), can be interpreted as a mixture of experts (MoE) model, and proposed an expectation-maximization (EM) training strategy as well as a hierarchical architecture to accelerate and improve the conditioning of the training process [1,25]. Classical approximation methods enjoy the advantages of computational efficiency and convergence guarantees in solving local, low-dimensional regression problems, but they often struggle in high dimensions or as global approximants [26]. Examples of such classical methods include truncated expansions in orthogonal polynomials (e.g., Chebyshev polynomials, Legendre polynomials, Hermite polynomials) [27], Fourier basis functions [28], rational functions [29], radial basis functions [30], splines [31], wavelets [32], kernel methods [33], sparse grids [34], etc.…”
Section: Introduction (mentioning)
confidence: 99%
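To make the quoted comparison concrete, here is a minimal sketch of one of the classical methods listed above: a truncated Chebyshev expansion fit by least squares via NumPy's polynomial module. The target function and truncation degree are illustrative assumptions.

```python
import numpy as np

# Truncated Chebyshev expansion fit by least squares (classical baseline).
x = np.linspace(-1.0, 1.0, 400)
f = np.exp(-x) * np.sin(5 * x)                 # smooth 1-D target (illustrative)

coeffs = np.polynomial.chebyshev.chebfit(x, f, deg=15)   # 16 Chebyshev modes
approx = np.polynomial.chebyshev.chebval(x, coeffs)

# For smooth 1-D targets the truncated expansion converges rapidly, but a
# tensor-product extension to d dimensions needs on the order of 16**d
# coefficients, which is the high-dimensional scaling the quoted passage
# contrasts with network-based approximants.
print(f"max error: {np.abs(approx - f).max():.2e}")
```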