2014
DOI: 10.1007/s00211-014-0684-3

Automatic integration using asymptotically optimal adaptive Simpson quadrature

Abstract: We present a novel theoretical approach to the analysis of adaptive quadratures, and of adaptive Simpson quadratures in particular, which leads to the construction of a new algorithm for automatic integration. For a given function f with f^(4) > 0 and possible endpoint singularities, the algorithm produces an approximation to ∫_a^b f to within a given ε asymptotically as ε → 0. Moreover, it is optimal among all adaptive Simpson quadratures, i.e., it needs the minimal number of function evaluations to obtain an ε-approximation and runs in time p…
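For readers unfamiliar with the construction, the following is a minimal Python sketch of the classical recursive adaptive Simpson scheme with tolerance eps. It is textbook material included only to illustrate the general idea; it is not the asymptotically optimal algorithm described in the abstract, and all names are illustrative.

# Minimal sketch of classical recursive adaptive Simpson quadrature
# (textbook scheme, not the asymptotically optimal algorithm of the paper).
def _simpson(f, a, b, fa, fm, fb):
    """Simpson rule on [a, b] given f(a), f((a+b)/2), f(b)."""
    return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

def _adaptive(f, a, b, fa, fm, fb, whole, eps):
    m = 0.5 * (a + b)
    lm, rm = 0.5 * (a + m), 0.5 * (m + b)
    flm, frm = f(lm), f(rm)
    left = _simpson(f, a, m, fa, flm, fm)
    right = _simpson(f, m, b, fm, frm, fb)
    # Richardson-type estimate: the factor 15 comes from Simpson's O(h^4) order.
    if abs(left + right - whole) <= 15.0 * eps:
        return left + right + (left + right - whole) / 15.0
    return (_adaptive(f, a, m, fa, flm, fm, left, 0.5 * eps) +
            _adaptive(f, m, b, fm, frm, fb, right, 0.5 * eps))

def adaptive_simpson(f, a, b, eps=1e-8):
    """Approximate the integral of f over [a, b] to (roughly) tolerance eps."""
    fa, fm, fb = f(a), f(0.5 * (a + b)), f(b)
    whole = _simpson(f, a, b, fa, fm, fb)
    return _adaptive(f, a, b, fa, fm, fb, whole, eps)

if __name__ == "__main__":
    import math
    # Example with an endpoint singularity in the higher derivatives:
    print(adaptive_simpson(lambda x: math.sqrt(x), 0.0, 1.0))  # exact value: 2/3

The recursion subdivides only where the local error estimate is large, so for the sqrt example the mesh automatically clusters near the singular endpoint x = 0.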

Cited by 12 publications (14 citation statements); references 13 publications.
“…and the computational cost for minimization is significantly smaller than that for function approximation. By a similar proof, the minimization problem (MIN) for functions in the whole Sobolev space W^{2,∞} has a lower complexity bound analogous to (19) for the function approximation problem. However, for functions only in the cone C, we have not yet derived a lower bound on the complexity of the minimization problem (MIN) for functions in C.…”
Section: The Computational Cost Of M
confidence: 89%
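For context on what a lower complexity bound for the function approximation problem looks like, the display below is the standard worst-case bound from information-based complexity for functions with bounded second derivative on [0, 1]; it is a hedged, textbook statement included only as background, not the bound (19) of the quoted paper.

% Standard worst-case lower bound for approximating f from n function values
% when only ||f''||_inf <= 1 is known (textbook bound, shown for [0,1];
% included purely as context, not taken from the cited paper):
\[
  \inf_{A_n}\;\sup_{\substack{f \in W^{2,\infty}([0,1]) \\ \|f''\|_\infty \le 1}}
  \bigl\| f - A_n(f) \bigr\|_\infty \;\ge\; \frac{c}{n^{2}},
  \qquad c > 0,
\]
% where A_n ranges over algorithms using n (possibly adaptively chosen) samples of f.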
“…where L*_{m,r} f denotes the approximation corresponding to (8), are equal. Observe that such a partition exists since the local errors continuously depend on the points x_i.…”
Section: Optimal Partition
confidence: 99%
“…On the other hand, if the function is smooth in the whole domain, then adaptive algorithms can improve the error only by a constant compared to nonadaptive algorithms. The exact asymptotic constants for quadratures of degree of exactness r − 1 and for functions f ∈ C^r([a, b]) with f^(r) > 0 were obtained in [8] for r = 4 and in [2,3] for arbitrary r. Procedures corresponding to the optimal strategies for automatic integration were also proposed.…”
Section: Introduction
confidence: 99%
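For background on the r = 4 case referenced above (Simpson's rule has degree of exactness 3), the classical local error formula below shows the constant that adaptive subdivision strategies work to optimize; this is standard textbook material, not a formula quoted from [8], [2], or [3].

% Classical local error of Simpson's rule on [x_{i-1}, x_i], h_i = x_i - x_{i-1}
% (standard textbook formula, included here only as background for r = 4):
\[
  \int_{x_{i-1}}^{x_i} f(x)\,dx
  \;-\;
  \frac{h_i}{6}\Bigl( f(x_{i-1}) + 4 f\bigl(\tfrac{x_{i-1}+x_i}{2}\bigr) + f(x_i) \Bigr)
  \;=\;
  -\,\frac{h_i^{5}}{2880}\, f^{(4)}(\xi_i),
  \qquad \xi_i \in (x_{i-1}, x_i).
\]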
“…In particular, we rigorously discuss the accuracy and cost of an adaptive process for a precisely specified class of problems, not only for a number of computational examples. For the integration of scalar C^4 functions, similar questions have recently been addressed for the Simpson rule in [9], where it is shown that adaptive mesh selection allows us to reduce the error by reducing the asymptotic constant of the method. Adaptive mesh points for the approximation of univariate W^{2,∞} functions are discussed in [1].…”
Section: Introduction
confidence: 95%