1999 | DOI: 10.1090/S0025-5718-99-01177-1

Optimal approximation of stochastic differential equations by adaptive step-size control

Abstract: We study the pathwise (strong) approximation of scalar stochastic differential equations with respect to the global error in the L_2-norm. For equations with additive noise we establish a sharp lower error bound in the class of arbitrary methods that use a fixed number of observations of the driving Brownian motion. As a consequence, higher-order methods do not exist if the global error is analyzed. We introduce an adaptive step-size control for the Euler scheme which performs asymptotically optimal…
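For intuition, here is a minimal sketch of an Euler-Maruyama scheme with a heuristic step-size control for a scalar SDE dX = a(t, X) dt + sigma(t) dW with additive noise. The step rule below (shrinking the step where the drift is large) is an illustrative assumption, not the asymptotically optimal control constructed in the paper.

import numpy as np

def adaptive_euler(a, sigma, x0, T, h_max=1e-2, h_min=1e-5, tol=1e-3, rng=None):
    # Euler-Maruyama for dX = a(t, X) dt + sigma(t) dW with a heuristic
    # step-size control: take smaller steps (hence spend more Brownian
    # increments) where the drift is large. Illustrative only.
    rng = np.random.default_rng() if rng is None else rng
    t, x = 0.0, float(x0)
    ts, xs = [t], [x]
    while t < T:
        h = min(h_max, max(h_min, tol / (1.0 + abs(a(t, x)))))
        last = h >= T - t
        if last:
            h = T - t                          # final step lands exactly on the horizon
        dW = rng.normal(0.0, np.sqrt(h))       # Brownian increment over [t, t + h]
        x = x + a(t, x) * h + sigma(t) * dW    # Euler-Maruyama update
        t = T if last else t + h
        ts.append(t)
        xs.append(x)
    return np.array(ts), np.array(xs)

# Example: additive-noise SDE dX = -X dt + 0.5 dW on [0, 1]
ts, xs = adaptive_euler(a=lambda t, x: -x, sigma=lambda t: 0.5, x0=1.0, T=1.0)

The number of Brownian increments used by such a scheme is random; the lower bound in the paper is stated for methods using a fixed number of observations.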

Cited by 47 publications (31 citation statements, published 2000–2024); references 19 publications.
“…Becker and Rannacher [5,6], Eriksson et al [15], Johnson and Szepessy [22], and Moon et al [30], but the theoretical understanding of convergence rates of adaptive algorithms is not as well developed; there are, however, recent important contributions. The work of Hofmann et al [19,20] and Müller-Gronbach [34] prove optimal convergence rates for strong approximation of stochastic differential equations. DeVore studies in [12] the efficiency of adaptive approximation of functions, including wavelet expansions, based on smoothness conditions in Besov spaces.…”
Section: Convergence Rates for Adaptive Approximation (mentioning, confidence: 99%)
“…For example in [17], where the step-size depends on the size of the diffusion coefficient for a MSE Euler-Maruyama adaptive algorithm; in [23], the step-size is controlled by the variation in the size of the drift coefficient in the constructed Euler-Maruyama adaptive algorithm, which preserves the long-term ergodic behavior of the true solution for many SDE problems; and in [19], a local error based adaptive Milstein algorithm is developed for solving multi-dimensional chemical Langevin equations.…”
Section: Uniform Time-Stepping MLMC Error and Computational (mentioning, confidence: 99%)
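The step-size mechanisms described in the excerpt above (step size tied to the size of the diffusion coefficient, or to the variation of the drift) can be illustrated with a small hypothetical rule. The function below is an assumption for illustration only, not the algorithm of [17], [23], or [19].

import numpy as np

def drift_variation_step(a, t, x, x_prev, h_prev, h_min=1e-5, h_max=1e-2, tol=1e-2):
    # Hypothetical rule: compare the drift at the current and previous states;
    # a large change suggests the path is entering a region where the drift
    # varies quickly, so the step is halved, otherwise it may grow.
    variation = abs(a(t, x) - a(t, x_prev))
    h = 0.5 * h_prev if variation > tol else 2.0 * h_prev
    return float(np.clip(h, h_min, h_max))

# e.g. h = drift_variation_step(lambda t, x: -x**3, t=0.1, x=0.8, x_prev=1.0, h_prev=1e-3)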
“…This includes, but is not restricted to, the condition that, for any n and i, t_{i+1}^{(n)} must be F(t_i^{(n)})-adapted (which requires that the length of each interval would need to be fully determined by the starting time of that step), described in [11, p. 321]. Indeed, there are variable step size numerical schemes for SDEs which implement nonrandom step size selection (see, for instance, [18]) or nonanticipating random-step selection (see, for instance, [8] or [9]). More generally, we may use spatial discretisation schemes such as given in [1] or [16], since first hitting times for a continuous diffusion process are stopping times.…”
Section: Y(t_j^{(n)})(W(t_{j+1}^{(n)}) − W(t_j^{(n)})) (mentioning, confidence: 99%)
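Written out, the adaptedness condition in the excerpt above and the Euler increment appearing in the quoted section heading read as follows; this is a reconstruction of the notation from the excerpt, offered for readability, with the filtration F and the approximation Y as in the cited paper.

% Nonanticipating step selection: the next node depends only on information
% available at the current node.
t^{(n)}_{i+1} \;\text{is}\; \mathcal{F}\!\bigl(t^{(n)}_{i}\bigr)\text{-measurable for all } n, i,

% and the Euler increment appearing in the quoted section heading:
Y\!\bigl(t^{(n)}_{j}\bigr)\Bigl(W\bigl(t^{(n)}_{j+1}\bigr) - W\bigl(t^{(n)}_{j}\bigr)\Bigr).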