We present an up-to-date, comprehensive summary of the rates for all types of compact binary coalescence sources detectable by the initial and advanced versions of the ground-based gravitational-wave detectors LIGO and Virgo. Astrophysical estimates for compact-binary coalescence rates depend on a number of assumptions and unknown model parameters and are still uncertain. The most confident among these estimates are the rate predictions for coalescing binary neutron stars which are based on extrapolations from observed binary pulsars in our galaxy. These yield a likely coalescence rate of 100 Myr−1 per Milky Way Equivalent Galaxy (MWEG), although the rate could plausibly range from 1 Myr−1 MWEG−1 to 1000 Myr−1 MWEG−1 (Kalogera et al 2004 Astrophys. J. 601 L179; Kalogera et al 2004 Astrophys. J. 614 L137 (erratum)). We convert coalescence rates into detection rates based on data from the LIGO S5 and Virgo VSR2 science runs and projected sensitivities for our advanced detectors. Using the detector sensitivities derived from these data, we find a likely detection rate of 0.02 per year for Initial LIGO–Virgo interferometers, with a plausible range between 2 × 10−4 and 0.2 per year. The likely binary neutron–star detection rate for the Advanced LIGO–Virgo network increases to 40 events per year, with a range between 0.4 and 400 per year.
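The conversion from an astrophysical coalescence rate to a detection rate can be sketched numerically: the rate per Milky Way Equivalent Galaxy (MWEG) is scaled by the number of MWEGs inside the detection volume. The sketch below assumes a spherical detection volume, an illustrative BNS range of roughly 200 Mpc for the advanced detectors, and an MWEG density of about 0.0116 Mpc⁻³ (a value used in this literature for large distances); all specific numbers are illustrative, not the paper's exact calculation.

```python
import math

def detection_rate(rate_per_mweg_per_myr, range_mpc, mweg_density_per_mpc3=0.0116):
    """Convert a coalescence rate per MWEG into an approximate detection
    rate, assuming sources are detectable out to `range_mpc` and galaxies
    are distributed uniformly (illustrative assumptions)."""
    n_mweg = (4.0 / 3.0) * math.pi * range_mpc**3 * mweg_density_per_mpc3
    # The rate is quoted per Myr per MWEG; convert to per year.
    return rate_per_mweg_per_myr * 1e-6 * n_mweg

# Illustrative: the likely BNS rate of 100 Myr^-1 MWEG^-1 from the abstract
# with an assumed ~200 Mpc Advanced LIGO range.
rate = detection_rate(100, 200)
print(rate)  # on the order of tens of detections per year
```

With these assumed inputs the result lands near the ~40 events per year quoted for the advanced network, which is the point of the exercise: the detection rate scales with the cube of the range distance.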
The Advanced LIGO and Advanced Virgo gravitational wave (GW) detectors will begin operation in the coming years, with compact binary coalescence events a likely source for the first detections. The gravitational waveforms emitted directly encode information about the sources, including the masses and spins of the compact objects. Recovering the physical parameters of the sources from the GW observations is a key analysis task. This work describes the LALInference software library for Bayesian parameter estimation of compact binary signals, which builds on several previous methods to provide a well-tested toolkit that has already been used for several studies. We show that our implementation is able to correctly recover the parameters of compact binary signals from simulated data from the advanced GW detectors. We demonstrate this with a detailed comparison on three compact binary systems: a binary neutron star (BNS), a neutron star–black hole binary (NSBH) and a binary black hole (BBH), where we show a cross-comparison of results obtained using three independent sampling algorithms. These systems were analysed with non-spinning, aligned-spin and generic spin configurations respectively, showing that consistent results can be obtained even with the full 15-dimensional parameter space of the generic spin configurations. We also demonstrate statistically that the Bayesian credible intervals we recover correspond to frequentist confidence intervals under correct prior assumptions by analysing a set of 100 signals drawn from the prior. We discuss the computational cost of these algorithms, and describe the general and problem-specific sampling techniques we have used to improve the efficiency of sampling the compact binary coalescence (CBC) parameter space.
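The credible-interval consistency check described above (analysing signals drawn from the prior and asking whether X% credible intervals contain the truth X% of the time) can be illustrated with a toy conjugate model. The Gaussian prior, noise level, and signal count below are assumptions chosen so the posterior is analytic; this is not the LALInference setup itself.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy model: one parameter with a standard-normal prior; the "detector"
# adds Gaussian noise of known standard deviation, so the posterior is
# the conjugate normal-normal result.
n_signals = 100
sigma_noise = 0.5

truths = rng.standard_normal(n_signals)                # injections drawn from the prior
data = truths + sigma_noise * rng.standard_normal(n_signals)

# Analytic posterior N(post_mean, post_sd) given prior N(0, 1)
post_var = 1.0 / (1.0 + 1.0 / sigma_noise**2)
post_mean = post_var * data / sigma_noise**2
post_sd = np.sqrt(post_var)

# Two-sided credible level at which each true value sits
cdf_vals = norm.cdf(truths, loc=post_mean, scale=post_sd)
levels = np.abs(2 * cdf_vals - 1)

# Self-consistency: `levels` should be uniform on [0, 1], i.e. an X%
# credible interval contains the truth X% of the time.
coverage_90 = np.mean(levels < 0.9)
print(coverage_90)  # should be close to 0.9
```

The same logic, applied to real posterior samples instead of an analytic posterior, underlies the P-P-style test with 100 prior-drawn signals mentioned in the abstract.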
On September 14, 2015, the Laser Interferometer Gravitational-Wave Observatory (LIGO) detected a gravitational-wave transient (GW150914); we characterize the properties of the source and its parameters. The data around the time of the event were analyzed coherently across the LIGO network using a suite of accurate waveform models that describe gravitational waves from a compact binary system in general relativity. GW150914 was produced by a nearly equal-mass binary black hole with masses of 36 (+5/−4) M⊙ and 29 (+4/−4) M⊙; for each parameter we report the median value and the range of the 90% credible interval. The dimensionless spin magnitude of the more massive black hole is bound to be < 0.7 (at 90% probability). The luminosity distance to the source is 410 (+160/−180) Mpc, corresponding to a redshift of 0.09 (+0.03/−0.04). This black hole is significantly more massive than any other inferred from electromagnetic observations in the stellar-mass regime.
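The reported quantities (median plus 90% credible interval) are summaries of posterior samples. A minimal sketch of that bookkeeping, using made-up Gaussian "posterior samples" for a mass-like parameter (the 36 M⊙ and 3 M⊙ values below are illustrative, not the GW150914 posterior):

```python
import numpy as np

def summarize(samples, cred=0.90):
    """Median and symmetric credible interval from posterior samples:
    for cred=0.90, the 5th and 95th percentiles bracket 90% of the mass."""
    lo, hi = np.percentile(samples, [50 * (1 - cred), 100 - 50 * (1 - cred)])
    return np.median(samples), lo, hi

# Hypothetical posterior samples (illustration only)
rng = np.random.default_rng(1)
samples = rng.normal(36.0, 3.0, size=100_000)
med, lo, hi = summarize(samples)
print(f"{med:.1f} (+{hi - med:.1f} / -{med - lo:.1f})")
```

For a Gaussian posterior the interval is symmetric; real CBC posteriors are typically skewed, which is why the plus and minus errors quoted in the abstract differ.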
Nearly a century after Einstein first predicted the existence of gravitational waves, a global network of Earth-based gravitational wave observatories [1, 2, 3, 4] is seeking to directly detect this faint radiation using precision laser interferometry. Photon shot noise, due to the quantum nature of light, imposes a fundamental limit on the attometre-level sensitivity of the kilometre-scale Michelson interferometers deployed for this task. Here, we inject squeezed states to improve the performance of one of the detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) beyond the quantum noise limit, most notably in the frequency region down to 150 Hz, critically important for several astrophysical sources, with no deterioration of performance observed at any frequency. With the injection of squeezed states, this LIGO detector demonstrated the best broadband sensitivity to gravitational waves ever achieved, with important implications for observing the gravitational-wave Universe with unprecedented sensitivity.
Around the globe several observatories are seeking the first direct detection of gravitational waves (GWs). These waves are predicted by Einstein's general theory of relativity and are generated, for example, by black-hole binary systems. Present GW detectors are Michelson-type kilometre-scale laser interferometers measuring the distance changes between mirrors suspended in vacuum. The sensitivity of these detectors at frequencies above several hundred hertz is limited by the vacuum (zero-point) fluctuations of the electromagnetic field. A quantum technology, the injection of squeezed light, offers a solution to this problem. Here we demonstrate the squeezed-light enhancement of GEO600, which will be the GW observatory operated by the LIGO Scientific Collaboration in its search for GWs for the next 3-4 years. GEO600 now operates with its best ever sensitivity, which proves the usefulness of quantum entanglement and the qualification of squeezed light as a key technology for future GW astronomy.
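The benefit of squeezed-light injection in both abstracts above is usually quoted in decibels of observed noise-power reduction. A minimal sketch of the bookkeeping, assuming shot-noise-limited operation and an illustrative squeezing level of 3 dB (roughly the scale of these early demonstrations; the exact levels achieved are not stated here):

```python
def squeezing_gain(db):
    """For a given level of observed squeezing in dB (noise-power
    reduction), return the amplitude-noise factor, the corresponding
    range gain (reach scales inversely with strain noise), and the
    event-rate gain (rate scales with surveyed volume, i.e. range cubed)."""
    noise_factor = 10 ** (-db / 20)    # amplitude noise after squeezing
    range_gain = 1 / noise_factor      # improvement in detection distance
    event_rate_gain = range_gain ** 3  # improvement in accessible volume
    return noise_factor, range_gain, event_rate_gain

nf, rg, vg = squeezing_gain(3.0)
print(f"noise x{nf:.2f}, range x{rg:.2f}, rate x{vg:.2f}")
```

The cubic volume scaling is why even a few dB of squeezing matters: 3 dB of broadband noise reduction would nearly triple the shot-noise-limited event rate.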
To our knowledge, no previous meta-analysis has attempted to compare the efficacy of pharmacological, psychological and combined treatments for the three main anxiety disorders (panic disorder, generalized anxiety disorder and social phobia). Pre-post and treated versus control effect sizes (ES) were calculated for all evaluable randomized-controlled studies (n = 234), involving 37,333 patients. Medications were associated with a significantly higher average pre-post ES [Cohen's d = 2.02 (1.90-2.15); 28,051 patients] than psychotherapies [1.22 (1.14-1.30); 6992 patients; P < 0.0001]. ES were 2.25 for serotonin-noradrenaline reuptake inhibitors (n = 23 study arms), 2.15 for benzodiazepines (n = 42), 2.09 for selective serotonin reuptake inhibitors (n = 62) and 1.83 for tricyclic antidepressants (n = 15). ES for psychotherapies were mindfulness therapies, 1.56 (n = 4); relaxation, 1.36 (n = 17); individual cognitive behavioural/exposure therapy (CBT), 1.30 (n = 93); group CBT, 1.22 (n = 18); psychodynamic therapy, 1.17 (n = 5); therapies without face-to-face contact (e.g. Internet therapies), 1.11 (n = 34); eye movement desensitization and reprocessing, 1.03 (n = 3); and interpersonal therapy, 0.78 (n = 4). The ES was 2.12 (n = 16) for CBT/drug combinations. Exercise had an ES of 1.23 (n = 3). For control groups, ES were 1.29 for placebo pills (n = 111), 0.83 for psychological placebos (n = 16) and 0.20 for waitlists (n = 50). In direct comparisons with control groups, all investigated drugs, except for citalopram, opipramol and moclobemide, were significantly more effective than placebo. Individual CBT was more effective than waiting list, psychological placebo and pill placebo. When looking at the average pre-post ES, medications were more effective than psychotherapies. Pre-post ES for psychotherapies did not differ from pill placebos; this finding cannot be explained by heterogeneity, publication bias or allegiance effects.
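The pre-post effect sizes aggregated above are standardized mean changes. A minimal sketch of one common convention for Cohen's d in a pre-post design, dividing the mean change by the pre-treatment standard deviation (conventions vary, e.g. pooled or change-score SDs); the anxiety scores below are made up for illustration:

```python
import statistics

def cohens_d_prepost(pre_scores, post_scores):
    """Pre-post effect size: mean symptom reduction divided by the
    pre-treatment SD (one common convention; others use a pooled SD)."""
    mean_pre = statistics.mean(pre_scores)
    mean_post = statistics.mean(post_scores)
    sd_pre = statistics.stdev(pre_scores)
    return (mean_pre - mean_post) / sd_pre

# Hypothetical symptom scores before and after treatment (illustration only)
pre = [28, 31, 25, 30, 27, 33, 29, 26]
post = [15, 18, 14, 20, 13, 19, 16, 12]
d = cohens_d_prepost(pre, post)
print(round(d, 2))
```

Because pre-post effect sizes include spontaneous remission and placebo response, they run much larger than treated-versus-control effect sizes, which is why the abstract reports both and why the placebo-pill pre-post ES of 1.29 is itself substantial.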
However, the decision on whether to choose psychotherapy, medications or a combination of the two should be left to the patient as drugs may have side effects, interactions and contraindications.
The random-effects or normal-normal hierarchical model is commonly utilized in a wide range of meta-analysis applications. A Bayesian approach to inference is very attractive in this context, especially when a meta-analysis is based on only a few studies. The bayesmeta R package provides readily accessible tools to perform Bayesian meta-analyses and generate plots and summaries, without having to worry about computational details. It allows for flexible prior specification and provides instant access to the resulting posterior distributions, including prediction and shrinkage estimation, and facilitates, for example, quick sensitivity checks. The present paper introduces the underlying theory and showcases its usage.
Meta‐analyses in orphan diseases and small populations generally face particular problems, including small numbers of studies, small study sizes and heterogeneity of results. However, the heterogeneity is difficult to estimate if only very few studies are included. Motivated by a systematic review in immunosuppression following liver transplantation in children, we investigate the properties of a range of commonly used frequentist and Bayesian procedures in simulation studies. Furthermore, the consequences for interval estimation of the common treatment effect in random‐effects meta‐analysis are assessed. The Bayesian credibility intervals using weakly informative priors for the between‐trial heterogeneity exhibited coverage probabilities in excess of the nominal level for a range of scenarios considered. However, they tended to be shorter than those obtained by the Knapp–Hartung method, which were also conservative. In contrast, methods based on normal quantiles exhibited coverages well below the nominal levels in many scenarios. With very few studies, the performance of the Bayesian credibility intervals is of course sensitive to the specification of the prior for the between‐trial heterogeneity. In conclusion, the use of weakly informative priors as exemplified by half‐normal priors (with a scale of 0.5 or 1.0) for log odds ratios is recommended for applications in rare diseases.
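The normal-normal hierarchical model with a half-normal prior on the between-trial heterogeneity can be marginalized numerically on a one-dimensional grid, in the spirit of bayesmeta's semi-analytic approach. The sketch below assumes a flat prior on the overall effect mu and uses hypothetical log odds ratios from three small trials; it is a simplified illustration, not the package's implementation.

```python
import numpy as np
from scipy.stats import halfnorm
from scipy.integrate import trapezoid

def bayes_meta(y, sigma, tau_scale=0.5, tau_grid=None):
    """Random-effects meta-analysis: y_i ~ N(mu, sigma_i^2 + tau^2),
    half-normal prior HN(tau_scale) on tau, flat prior on mu, with mu
    integrated out analytically and tau marginalized on a grid."""
    y, sigma = np.asarray(y, float), np.asarray(sigma, float)
    if tau_grid is None:
        tau_grid = np.linspace(0.0, 5 * tau_scale, 500)

    post_tau = np.empty_like(tau_grid)
    mu_hat = np.empty_like(tau_grid)
    for i, tau in enumerate(tau_grid):
        w = 1.0 / (sigma**2 + tau**2)      # inverse total variances
        mu_var = 1.0 / w.sum()
        mu_hat[i] = mu_var * (w * y).sum() # conditional posterior mean of mu
        # Marginal likelihood of tau (mu integrated out under a flat prior)
        loglik = 0.5 * (np.log(w).sum() + np.log(mu_var)
                        - (w * (y - mu_hat[i])**2).sum())
        post_tau[i] = np.exp(loglik) * halfnorm.pdf(tau, scale=tau_scale)
    post_tau /= trapezoid(post_tau, tau_grid)

    # Posterior mean of mu, averaged over the tau posterior
    mu_mean = trapezoid(post_tau * mu_hat, tau_grid)
    return mu_mean, tau_grid, post_tau

# Hypothetical log odds ratios and standard errors from three small trials
mu, grid, post = bayes_meta(y=[-0.5, -0.2, -0.8], sigma=[0.3, 0.4, 0.5])
print(mu)
```

With only a handful of studies the data barely constrain tau, so the half-normal scale (0.5 or 1.0 for log odds ratios, as recommended above) visibly shapes the tau posterior and hence the width of the interval for mu.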