2019
DOI: 10.3847/1538-4357/ab459f

Multiscale Time- and Frequency-domain Likelihood Analysis with Photon Weights

Abstract: We present an unbinned likelihood analysis formalism employing photon weights: the probabilities that events are associated with a particular source. This approach is applicable to any photon-resolving instrument, and thus well suited to high-energy observations; we focus here on GeV γ-ray data from the Fermi Large Area Telescope. Weights connect individual photons to the outputs of a detailed, expensive likelihood analysis of a much larger data set. The weighted events can be aggregated into arbitrary time spa…
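For orientation, the weighted-photon likelihood is compact enough to sketch in a few lines. The snippet below is a minimal illustration assuming the single-source form log L(α) = Σ_i log(α·w_i + 1 - w_i) - α·Σ_i w_i, with α the source flux relative to the model and the background held at its nominal level; the helper functions and toy weights are hypothetical, not the paper's code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def log_likelihood(alpha, weights):
    """Photon-weighted log-likelihood, assuming the single-source form
    log L(alpha) = sum(log(alpha*w + 1 - w)) - alpha*sum(w),
    with the background held at its nominal level."""
    w = np.asarray(weights)
    return np.sum(np.log(alpha * w + 1.0 - w)) - alpha * np.sum(w)

def fit_flux_scale(weights):
    """Maximize log L over alpha (hypothetical helper)."""
    res = minimize_scalar(lambda a: -log_likelihood(a, weights),
                          bounds=(1e-6, 10.0), method="bounded")
    return res.x

# Toy weights: for data matching the model, d(log L)/d(alpha) vanishes
# at alpha = 1 by construction, so the fit should return ~1.
rng = np.random.default_rng(0)
w = rng.uniform(0.05, 0.95, size=1000)
print(f"best-fit alpha = {fit_flux_scale(w):.3f}")
```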

Cited by 18 publications (22 citation statements)
References 37 publications
“…To measure the source flux variability (instead of the observed count-rate variability), the background and instrument sensitivity at any time need to be incorporated. While the original count-event formulation of Bayesian blocks can incorporate varying sensitivity, contributions from a potentially varying background deduced from an 'off' region go beyond the current work (but see Kerr 2019 for extensions of Bayesian blocks). However, when the Gaussian approximation is used in a pre-binned light curve, we can infer the (background-corrected) source count rate at each time bin, (X_i, σ_i). The Bayesian block formulation using the Gaussian error bars can then attempt to detect changes.…”
Section: Bayesian Blocks
confidence: 99%
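The Gaussian point-measurement variant mentioned above is available off the shelf: astropy's bayesian_blocks accepts per-bin values and error bars via fitness='measures'. A minimal sketch with a fabricated step-function light curve:

```python
import numpy as np
from astropy.stats import bayesian_blocks

# Fabricated pre-binned light curve (t_i, X_i, sigma_i) with a step at t = 50.
rng = np.random.default_rng(1)
t = np.arange(100, dtype=float)
x = np.where(t < 50, 1.0, 3.0) + rng.normal(0.0, 0.5, size=t.size)
sigma = np.full(t.size, 0.5)

# fitness='measures' selects the Gaussian error-bar (point measurement)
# fitness function of Scargle et al. (2013); the result is the block edges.
edges = bayesian_blocks(t, x, sigma, fitness="measures")
print(edges)  # expect edges near t = 0, 50, and 99
```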
“…In order to draw firm conclusions in the future, magnetic reconnection simulations over longer time scales would be highly desirable. Additionally, a PSD analysis of unbinned Fermi data could help to probe frequencies below the chosen time binning (e.g., Kerr 2019). All models discussed so far were based on simulations of reconnection in plasmas with fixed magnetization σ = 10.…”
Section: Discussion
confidence: 99%
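As a rough illustration of probing frequencies directly from unbinned, weighted photons, the snippet below computes a Rayleigh-style power from arrival times and weights. This simplified estimator and its normalization are assumptions of the sketch (it ignores exposure and background corrections) and are not the estimator of Kerr (2019).

```python
import numpy as np

def weighted_power(times, weights, freqs):
    """Rayleigh-style power from weighted photon arrival times:
    P(f) = |sum_i w_i exp(2*pi*j*f*t_i)|**2 / sum_i w_i**2.
    Simplified for illustration; no exposure or background correction."""
    w = np.asarray(weights)[:, None]                  # (N, 1)
    phase = np.exp(2j * np.pi * np.asarray(freqs)[None, :]
                   * np.asarray(times)[:, None])      # (N, M)
    amp = np.sum(w * phase, axis=0)                   # (M,)
    return np.abs(amp) ** 2 / np.sum(np.asarray(weights) ** 2)

# Toy photons: arrival times modulated at f0 = 1e-3 Hz (rejection sampling).
rng = np.random.default_rng(2)
f0 = 1e-3
t_cand = rng.uniform(0.0, 1e5, size=20000)
t = np.sort(t_cand[rng.uniform(size=t_cand.size)
                   < 0.5 * (1 + np.cos(2 * np.pi * f0 * t_cand))])
w = np.full(t.size, 0.8)  # fabricated constant weights
freqs = np.linspace(5e-4, 2e-3, 200)
print(freqs[np.argmax(weighted_power(t, w, freqs))])  # ~1e-3 Hz
```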
“…Thus, when a source is brighter than the model predicts, the probability of a photon coming from the source is underestimated, and, when the source is fainter than the model prediction, the probability is overestimated (e.g. Kerr 2019). This results in a reduction of the apparent amplitude of any variability.…”
Section: Gamma-ray Observations and Analysis
confidence: 99%
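The damping described here follows from a simple Bayes-rule rescaling: if the true flux is α times the model value used to compute a weight w, the correct weight would be w' = αw / (αw + 1 - w), so weights computed with α = 1 are too small during flares and too large during dips. A small numerical sketch (values hypothetical):

```python
def corrected_weight(w, alpha):
    """Weight a photon would receive if the source were alpha times the
    model flux used to compute w (simple Bayes-rule rescaling)."""
    return alpha * w / (alpha * w + 1.0 - w)

w = 0.5  # hypothetical weight from the static model
for alpha in (0.25, 1.0, 4.0):
    print(f"alpha = {alpha:4}: static weight {w:.2f} vs "
          f"corrected {corrected_weight(w, alpha):.2f}")
# Flare (alpha = 4): static 0.50 understates the true 0.80.
# Dip (alpha = 0.25): static 0.50 overstates the true 0.20.
# Either way, the apparent variability amplitude is compressed.
```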
“…The light curves were obtained using a variant of aperture photometry in which we estimate the probability "p" that each photon comes from a source of interest and sum these probabilities (e.g. Kerr 2011; Fermi LAT Collaboration et al. 2012; Kerr 2019). To facilitate this, model files were created for each source using make4FGLxml, including sources from the 4FGL DR2 catalog within a 10 degree radius of the source.…”
Section: Gamma-ray Observations and Analysis
confidence: 99%
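The procedure amounts to summing photon weights within each time bin. Below is a minimal sketch; the function name and the sqrt(Σ w²) error estimate are assumptions of this illustration, not prescriptions from the cited papers.

```python
import numpy as np

def weighted_light_curve(times, weights, bin_edges):
    """Weighted aperture photometry: per-bin 'counts' are the summed photon
    probabilities; errors use a simple sqrt(sum(w**2)) estimate (assumed)."""
    counts = np.histogram(times, bins=bin_edges, weights=weights)[0]
    var = np.histogram(times, bins=bin_edges,
                       weights=np.square(weights))[0]
    return counts, np.sqrt(var)

# Fabricated photon list: three days of arrival times (s) and weights.
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 3 * 86400.0, size=2000))
w = rng.uniform(0.1, 0.9, size=t.size)
edges = np.arange(0.0, 3 * 86400.0 + 1.0, 86400.0)  # daily bins
for day, (c, e) in enumerate(zip(*weighted_light_curve(t, w, edges))):
    print(f"day {day}: {c:.1f} +/- {e:.1f} weighted counts")
```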