A commonly used measure to summarize the nature of a photon spectrum is the so-called hardness ratio, which compares the numbers of counts observed in different passbands. The hardness ratio is especially useful for distinguishing between and categorizing weak sources, as a proxy for detailed spectral fitting. However, in this regime classical methods of error propagation fail, and estimates of spectral hardness become unreliable. Here we develop a rigorous statistical treatment of hardness ratios that treats detected photons as independent Poisson random variables and correctly handles the non-Gaussian nature of the error propagation. The method is Bayesian in nature and thus can be generalized to carry out a multitude of source-population–based analyses. We verify our method with simulation studies and compare it with the classical method. We apply this method to real-world examples, such as the identification of candidate quiescent low-mass X-ray binaries in globular clusters and the tracking of the time evolution of a flare on a low-mass star.
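The contrast between the two approaches can be illustrated numerically. The sketch below is not the authors' implementation; it uses hypothetical counts (`S`, `H`) and a simple Gamma posterior for each Poisson rate under a Jeffreys-like prior, which is one standard Bayesian treatment of low-count data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observed counts in the soft and hard passbands
S, H = 3, 8

# Classical point estimate, HR = (H - S) / (H + S); Gaussian error
# propagation on this quantity breaks down for so few counts.
hr_classical = (H - S) / (H + S)

# Bayesian sketch: independent Poisson counts with Gamma(1/2) priors
# yield Gamma posteriors for the soft and hard rates.
lam_s = rng.gamma(S + 0.5, 1.0, size=100_000)
lam_h = rng.gamma(H + 0.5, 1.0, size=100_000)
hr_samples = (lam_h - lam_s) / (lam_h + lam_s)

# Posterior summary: median and a ~68% credible interval,
# correctly bounded within [-1, 1] unlike a Gaussian error bar.
lo, hi = np.percentile(hr_samples, [16, 84])
print(f"classical HR = {hr_classical:.2f}")
print(f"posterior HR = {np.median(hr_samples):.2f} [{lo:.2f}, {hi:.2f}]")
```

Note that the posterior interval respects the physical bounds of the hardness ratio, whereas a propagated Gaussian error bar on a weak source can spill outside [-1, 1].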
EVAR can be performed in patients unfit for open surgical repair with excellent early survival and long-term durability. Outcomes in the high-risk (HR) group compare favorably with the EVAR-2 trial data. However, not all patients at high risk for open surgical repair derive benefit from EVAR. The decision to proceed with EVAR in HR patients should be individualized, depending on the number and severity of risk factors.
Most open conversions (OCs) after EVAR are associated with significant morbidity and mortality, except when electively treating an isolated type II endoleak with ligation of branches and preservation of the endograft.
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here we present general statistical methods that incorporate calibration uncertainties into the spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo–based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
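The principal-component summary described above can be sketched in a few lines. The following is a toy illustration, not the authors' code: the ensemble of effective-area curves is synthetic, and the component count `k = 5` is an arbitrary choice for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble of plausible effective-area curves
# (rows: calibration samples, columns: energy bins).
n_samples, n_bins = 200, 50
energy = np.linspace(0.3, 7.0, n_bins)
nominal = 600 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)
curves = nominal * (1 + 0.05 * rng.standard_normal((n_samples, 1))
                      + 0.02 * rng.standard_normal((n_samples, n_bins)))

# PCA summary: the mean curve plus a few leading components of the
# residual matrix capture most of the calibration variability.
mean_curve = curves.mean(axis=0)
resid = curves - mean_curve
U, s, Vt = np.linalg.svd(resid, full_matrices=False)

k = 5  # number of components retained (illustrative choice)
frac = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"first {k} components explain {frac:.1%} of the variance")

# Draw a new plausible curve: mean plus a random combination of the
# retained components, scaled by their sample standard deviations.
coeff = rng.standard_normal(k)
new_curve = mean_curve + (coeff * s[:k] / np.sqrt(n_samples - 1)) @ Vt[:k]
```

The payoff is compression: instead of storing hundreds of full calibration files, an analysis can regenerate plausible curves on the fly from the mean and a handful of components.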
Extended hours of sale and consumption of alcohol were associated with an increased risk of homicide. Strong restrictions on alcohol availability could reduce the incidence of interpersonal violence in communities where homicide rates are high.
Binary expression systems like the LexA-LexAop system provide a powerful experimental tool kit to study gene and tissue function in developmental biology, neurobiology, and physiology. However, the number of well-defined LexA enhancer trap insertions remains limited. In this study, we present the molecular characterization and initial tissue expression analysis of nearly 100 novel StanEx LexA enhancer traps, derived from the StanEx1 index line. This includes 76 insertions into novel, distinct gene loci not previously associated with enhancer traps or targeted LexA constructs. Additionally, our studies revealed evidence for selective transposase-dependent replacement of a previously undetected KP element on chromosome III within the StanEx1 genetic background during hybrid dysgenesis, suggesting a molecular basis for the over-representation of LexA insertions at the NK7.1 locus in our screen. Production and characterization of novel fly lines were performed by students and teachers in experiment-based genetics classes within a geographically diverse network of public and independent high schools. Thus, unique partnerships between secondary schools and university-based programs have produced and characterized novel genetic and molecular resources in Drosophila for open-source distribution, and provide paradigms for the development of science education through experience-based pedagogy.
Among the computationally intensive methods for fitting complex multilevel models, the Gibbs sampler is especially popular owing to its simplicity and power to effectively generate samples from a high-dimensional probability distribution. The Gibbs sampler, however, is often justifiably criticized for its sometimes slow convergence, especially when it is used to fit highly structured complex models. The recently proposed Partially Collapsed Gibbs (PCG) sampler offers a new strategy for improving the convergence characteristics of a Gibbs sampler. A PCG sampler achieves faster convergence by reducing the conditioning in some or all of the component draws of its parent Gibbs sampler. Although this strategy can significantly improve convergence, it must be implemented with care to ensure that the desired stationary distribution is preserved. In some cases the set of conditional distributions sampled in a PCG sampler may be functionally incompatible, and permuting the order of draws can change the stationary distribution of the chain. In this article, we draw an analogy between the PCG sampler and certain efficient EM-type algorithms that helps to explain the computational advantage of PCG samplers and to suggest when they might be used in practice. We go on to illustrate PCG samplers in three substantial examples drawn from our applied work: a multilevel spectral model commonly used in high-energy astrophysics, a piecewise-constant multivariate time series model, and a joint imputation model for nonnested data. All are useful, highly structured models that pose computational challenges which can be solved using PCG samplers. The examples illustrate not only the computational advantage of PCG samplers but also how they should be constructed to maintain the desired stationary distribution. Supplemental materials for the examples given in this article are available online.
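The effect of reduced conditioning can be seen in a toy problem. The sketch below is not drawn from the article's examples: it targets a bivariate normal with strong correlation, where the parent Gibbs sampler mixes slowly, and replaces the x-draw with a draw from its marginal (the fully collapsed limit of the PCG idea). Note the order of draws: the reduced-conditioning draw comes first, so the y-draw conditions on the current x and the stationary distribution is preserved.

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.99  # strong correlation makes the parent Gibbs sampler slow
n = 5000

# Parent Gibbs sampler: alternate the full conditionals of a
# bivariate normal with unit variances and correlation rho.
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = rng.normal(rho * y[t - 1], np.sqrt(1 - rho ** 2))
    y[t] = rng.normal(rho * x[t], np.sqrt(1 - rho ** 2))

# PCG-style update: reduce the conditioning in the x-draw by sampling
# x from its marginal N(0, 1), then y from its full conditional.
xp = rng.normal(0.0, 1.0, size=n)                  # x ~ marginal
yp = rng.normal(rho * xp, np.sqrt(1 - rho ** 2))   # y | x

def acf1(z):
    """Lag-1 sample autocorrelation."""
    z = z - z.mean()
    return float((z[:-1] * z[1:]).sum() / (z * z).sum())

# Near 1 for the parent Gibbs chain, near 0 for the collapsed draws.
print(f"Gibbs lag-1 ACF: {acf1(x):.3f}")
print(f"PCG   lag-1 ACF: {acf1(xp):.3f}")
```

In this toy case the lag-1 autocorrelation of the parent chain is roughly rho² ≈ 0.98, while the collapsed draws are essentially independent; real PCG samplers reduce conditioning in only some component draws, but the mechanism for the speedup is the same.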