2018
DOI: 10.48550/arxiv.1809.02362
Preprint
A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations

Abstract: Artificial neural networks (ANNs) have very successfully been used in numerical simulations for a series of computational problems ranging from image classification/image recognition, speech recognition, time series analysis, game intelligence, and computational advertising to numerical approximations of partial differential equations (PDEs). Such numerical simulations suggest that ANNs have the capacity to

Cited by 77 publications (142 citation statements)
References 57 publications
“…Remark. • There is a common belief that machine-learning-based PDE solvers can break the curse of dimensionality [15, 24, 40]. However, we obtained an $n^{-(2s-2)/(2s-4+d)}$ convergence rate, which can become super slow in high dimension.…”
Section: Final Upper Bound
confidence: 76%
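The quoted remark can be made concrete with a short sketch (an illustration of the quoted rate only, not code from the cited work): the exponent $(2s-2)/(2s-4+d)$ in the convergence rate tends to zero as the dimension $d$ grows, which is exactly the "super slow in high dimension" behavior.

```python
def rate_exponent(s: float, d: int) -> float:
    """Exponent in the quoted rate n^(-(2s-2)/(2s-4+d)).

    s is a smoothness parameter, d the spatial dimension; both names
    follow the quoted statement, not any particular implementation.
    """
    return (2 * s - 2) / (2 * s - 4 + d)

# With smoothness s = 3 the exponent shrinks rapidly with dimension:
# d = 1   -> 4/3  ~ 1.333
# d = 10  -> 4/12 ~ 0.333
# d = 100 -> 4/102 ~ 0.039
exponents = {d: rate_exponent(3, d) for d in (1, 10, 100)}
```

So even a moderately smooth problem loses almost all of its rate by $d = 100$, which is the contrast the remark draws against the curse-of-dimensionality-free bounds in [15, 24, 40].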
“…Since that time there have appeared many other versions of the Kolmogorov-Chentsov theorem that allow one to treat more general sets Θ. We mention [14, Theorem 2.1], [5, Theorem 3.9], [7, Lemma 2.19], [9, Proposition 3.9] for several recent formulations, where Θ is a subset of R^m. Sometimes only the existence of a continuous or Hölder-continuous modification is claimed, which is substantially weaker than (1.4).…”
Section: Introduction and Main Results
confidence: 99%
“…Several theoretical works have been devoted to the above representation question. It has been established in [14, 15, 20] that deep neural networks can approximate solutions to certain classes of parabolic equations and the Poisson equation without the CoD. The major limitation of those works is that the PDEs they consider must admit a stochastic representation, such as the Feynman-Kac formula, and it seems difficult to generalize the proof techniques to broader classes of PDEs with no probabilistic interpretation.…”
Section: Main Theorem (Informal Version)
confidence: 99%
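The stochastic representation this excerpt refers to can be sketched in a few lines (a minimal textbook illustration of the Feynman-Kac idea, not the cited papers' method): for the heat equation u_t = (1/2) u_xx with initial data g, the solution is u(t, x) = E[g(x + W_t)] with W_t ~ N(0, t), so a grid-free Monte Carlo average approximates u pointwise. The function name and sample count below are illustrative choices.

```python
import numpy as np

def feynman_kac_mc(g, t, x, n_samples=200_000, seed=0):
    """Monte Carlo estimate of u(t, x) = E[g(x + W_t)], W_t ~ N(0, t),
    which solves u_t = (1/2) u_xx with u(0, .) = g.

    No spatial grid is built, which is the mechanism that lets
    representation-based solvers scale to high dimension.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, np.sqrt(t), size=n_samples)  # samples of W_t
    return g(x + w).mean()

# Sanity check against a closed form: for g(x) = x^2,
# u(t, x) = E[(x + W_t)^2] = x^2 + t.
approx = feynman_kac_mc(lambda y: y ** 2, t=0.5, x=1.0)
exact = 1.0 ** 2 + 0.5
```

PDEs without such a representation offer no expectation to sample, which is exactly the limitation the excerpt points out.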