2020
DOI: 10.48550/arxiv.2005.06034
Preprint

An adaptive Euler-Maruyama scheme for McKean-Vlasov SDEs with super-linear growth and application to the mean-field FitzHugh-Nagumo model

Abstract: In this paper, we introduce a fully implementable, adaptive Euler-Maruyama scheme for McKean SDEs with non-globally Lipschitz continuous drifts. We prove moment stability of the discretised processes and a strong convergence rate of 1/2. We present several numerical examples centred around a mean-field model for FitzHugh-Nagumo neurons, which illustrate that the standard uniform scheme fails and that the adaptive scheme in most cases shows superior performance compared to tamed approximation schemes. In additi…
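
As a rough illustration of the type of method the abstract describes, the sketch below applies an adaptive Euler-Maruyama step to a particle approximation of a one-dimensional McKean-Vlasov SDE with a cubic (super-linearly growing) drift and a linear mean-field term. The drift, the step-size rule h = h_max / (1 + max|b|), and all constants are illustrative assumptions, not the specific scheme analysed in the paper.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's exact scheme):
# adaptive Euler-Maruyama for an N-particle approximation of the 1-d MV-SDE
#   dX_t = ( X_t - X_t^3 + a * (E[X_t] - X_t) ) dt + sigma dW_t,
# i.e. a cubic drift plus a linear interaction through the mean.
# Idea: shrink the time step where the drift is large, so h * max|b| stays
# of order h_max despite the super-linear growth of the drift.

def drift(x, mean_x, a=1.0):
    """Cubic one-particle drift plus mean-field attraction (illustrative choice)."""
    return x - x**3 + a * (mean_x - x)

def simulate(n_particles=1000, T=2.0, h_max=1e-2, h_min=1e-6,
             sigma=0.5, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.full(n_particles, float(x0))
    t = 0.0
    while t < T:
        mean_x = x.mean()                        # empirical mean-field term
        b = drift(x, mean_x)
        # One possible adaptive rule: the smallest step demanded by any particle.
        h = np.clip(h_max / (1.0 + np.abs(b).max()), h_min, h_max)
        h = min(h, T - t)                        # do not overshoot the horizon
        dW = rng.standard_normal(n_particles) * np.sqrt(h)
        x = x + b * h + sigma * dW               # Euler-Maruyama update
        t += h
    return x

if __name__ == "__main__":
    samples = simulate()
    print("empirical mean / variance at T:", samples.mean(), samples.var())
```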

Cited by 2 publications (19 citation statements)
References 26 publications

“…To the best of our knowledge, this has not yet been discussed for MV-SDE schemes in general. The stability of the SSM provides a theoretical foundation for carrying out simulations with larger time steps, and we point to positive results by way of numerical simulation with the Cucker-Smale flocking model [22], where the SSM outperforms both taming [18] and adaptive time-stepping [45] algorithms.…”
Section: Motivation
confidence: 78%
“…We prove its convergence and recover the 1/2-convergence rate in rMSE under the same general assumptions as the tamed method [18] or the adaptive time-stepping method [45]. No differentiability or non-degeneracy assumptions are imposed, and stopping-time arguments are fully avoided.…”
Section: Motivation
confidence: 79%