2010
DOI: 10.1111/j.1365-2966.2010.17265.x
Microlensing with an advanced contour integration algorithm: Green's theorem to third order, error control, optimal sampling and limb darkening

Abstract: Microlensing light curves are typically computed either by ray-shooting maps or by contour integration via Green's theorem. We present an improved version of the second method that includes a parabolic correction in Green's line integral. In addition, we present an accurate analytical estimate of the residual errors, which allows the implementation of an optimal strategy for the contour sampling. Finally, we give a prescription for dealing with limb-darkened sources, reaching arbitrary accuracy. These optimiza…

Cited by 140 publications (107 citation statements)
References 36 publications (47 reference statements)
“…Dealing with some of these effects poses no significant difficulties, and requires only simple extensions to the basic SBLM; in particular, however, computing the magnification of an extended (as opposed to point‐like) source can take around two orders of magnitude longer than the corresponding calculations under the point‐source approximation (e.g. Vermaak; Gould; Bozza). Unfortunately, finite‐source effects generally cannot be ignored when dealing with planetary signals (Vermaak), so in such cases the claims about the algorithm's speed are seemingly invalidated.…”
Section: Discussion
confidence: 99%
“…This corresponds to where the projected position of the lens is slightly interior to the projected source surface. We calculated the magnification factor with finite source size using the RT-model developed by V. Bozza (Bozza et al. 2018; Bozza 2010; Skowron & Gould 2012). The magnification factor decreases for ρ/u > 1.1, and the calculation approaches that of the point lens with point source as the finite size of the star becomes diminishingly relevant.…”
Section: General Characterizations
confidence: 99%
“…To model such events, we must use a methodology that accounts for the finite source size. The two most common algorithms for this are contour integration via Stokes' theorem (Gould & Gaucherel 1997; Bozza 2010) and various refinements of ray-shooting maps (Kayser, Refsdal & Stabell 1986; Dong et al. 2006; Bennett & Rhie 1996). In Sections 4-5 below we describe our strategy, which is based on the magnification map approach.…”
Section: The Binary Microlensing Model
confidence: 99%