Neural SDEs as Infinite-Dimensional GANs

Preprint, 2021
DOI: 10.48550/arxiv.2102.03657

Cited by 9 publications (11 citation statements)
References 0 publications
“…This refines many universal approximation theorems in the literature. For instance, it refines [148], which concerns functions defined on the unit cube; it provides a quantitative version of [85]; and it parallels the findings of [129] beyond the case where K is a differentiable sub-manifold of R^n, all while allowing for trainable activation functions. Note that in the special case where R^m = R, K = [0,1]^n, and σ is of singular-ReLU type, we recover [130, Theorem 1].…”
Section: Static Case - Universal Approximation into QAS Spaces
mentioning
confidence: 95%
“…Other than direct path generation, neural networks have been employed as function approximators for the drift and diffusion of the modelled SDE system, enabling robust and data-driven model selection, as done e.g. in [7,42,54,146,39,85,138].…”
mentioning
confidence: 99%
“…which is approximated with the conditional GAN. The approximation of S_{t+Δt} | S_t is then obtained by inverting Equation (22). Since values of the process can get arbitrarily close to zero, the generator may output negative values very close to 0.…”
Section: Data Pre-processing
mentioning
confidence: 99%
“…Another approach is to apply the 'neural ODEs' of Chen et al. [21] to SDEs. Kidger et al. [22] 'fit' SDEs to time series data, where the SDE coefficients are given by NNs. A GAN architecture is used here as well, where the solution of the SDE defines the output of the 'neural SDE'.…”
Section: Introduction
mentioning
confidence: 99%
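The citation above describes the core construction: an SDE whose drift and diffusion coefficients are neural networks, with a sample path of the SDE serving as the generator's output. The following is a minimal illustrative sketch of that idea, not the implementation of Kidger et al.: it uses untrained toy MLPs with random weights for the drift f and diffusion g, and an Euler-Maruyama discretization to sample a path of dX = f(X) dt + g(X) dW. All names (`mlp`, `init_params`, `sample_path`) and sizes are assumptions made for this example; a real neural SDE GAN would train these networks adversarially against a discriminator.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(d_in, d_hidden, d_out):
    # Random, untrained weights: purely illustrative stand-ins
    # for the learned drift/diffusion networks.
    return (rng.normal(0.0, 0.5, (d_in, d_hidden)),
            np.zeros(d_hidden),
            rng.normal(0.0, 0.5, (d_hidden, d_out)),
            np.zeros(d_out))

def mlp(params, x):
    # Tiny two-layer MLP: tanh hidden layer, linear output.
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

drift = init_params(1, 8, 1)      # f in dX = f(X) dt + g(X) dW
diffusion = init_params(1, 8, 1)  # g in dX = f(X) dt + g(X) dW

def sample_path(x0, n_steps=100, dt=0.01):
    """Euler-Maruyama sample of the neural SDE starting at x0."""
    x = np.array([x0], dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=x.shape)  # Brownian increment
        x = x + mlp(drift, x) * dt + mlp(diffusion, x) * dW
        path.append(x.copy())
    return np.stack(path)  # shape (n_steps + 1, 1)

path = sample_path(0.0)
```

Each call to `sample_path` draws fresh Brownian increments, so repeated calls yield different paths from the same (here untrained) generator, which is exactly the source of randomness a neural SDE GAN exploits.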
“…The task of solving random partial differential equations was recently addressed with deep learning using physics-informed neural networks [17-19]. Unsupervised generative modelling often requires adversarial training [20,21] and a generative adversarial network (GAN) architecture [22]. Moreover, in cases where information about system parameters is limited, or exemplary datasets are generated from measurements rather than known initial conditions, we arrive at the task of equation discovery [23-27].…”
Section: Introduction
mentioning
confidence: 99%