We introduce redMaGiC, an automated algorithm for selecting Luminous Red Galaxies (LRGs). The algorithm was specifically developed to minimize photometric redshift uncertainties in photometric large-scale structure studies. redMaGiC achieves this by self-training the color-cuts necessary to produce a luminosity-thresholded LRG sample of constant comoving density. We demonstrate that redMaGiC photo-zs are very nearly as accurate as the best machine-learning based methods, yet they require minimal spectroscopic training, do not suffer from extrapolation biases, and are very nearly Gaussian. We apply our algorithm to Dark Energy Survey (DES) Science Verification (SV) data to produce a redMaGiC catalog sampling the redshift range z ∈ [0.2, 0.8]. Our fiducial sample has a comoving space density of 10⁻³ (h⁻¹ Mpc)⁻³, and a median photo-z bias (z_spec − z_photo) and scatter (σ_z/(1 + z)) of 0.005 and 0.017, respectively. The corresponding 5σ outlier fraction is 1.4%. We also test our algorithm with Sloan Digital Sky Survey (SDSS) Data Release 8 (DR8) and Stripe 82 data, and discuss how spectroscopic training can be used to control photo-z biases at the 0.1% level.
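The three quality metrics quoted above (median bias z_spec − z_photo, scatter σ_z/(1 + z), and the 5σ outlier fraction) can be computed from any matched spectroscopic–photometric catalog. The sketch below is a minimal NumPy illustration, not DES pipeline code; the function name and the choice of a robust MAD-based scatter estimator are our own assumptions.

```python
import numpy as np

def photoz_metrics(z_spec, z_photo):
    """Photo-z quality metrics of the kind quoted in the redMaGiC abstract:
    median bias (z_spec - z_photo), scatter sigma_z/(1+z), and the
    5-sigma outlier fraction. Hypothetical helper, not DES code."""
    dz = z_spec - z_photo
    bias = np.median(dz)
    # residuals normalized by (1 + z)
    r = dz / (1.0 + z_spec)
    # robust scatter via the normalized median absolute deviation
    sigma = 1.4826 * np.median(np.abs(r - np.median(r)))
    # fraction of objects with normalized residuals beyond 5 sigma
    outlier_frac = np.mean(np.abs(r) > 5.0 * sigma)
    return bias, sigma, outlier_frac
```

The MAD-based estimator is one common choice for photo-z scatter because it is insensitive to the very outliers the last metric counts; a plain standard deviation would be inflated by them.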
Ongoing and near-future imaging-based dark energy experiments are critically dependent upon photometric redshifts (a.k.a. photo-z's): i.e., estimates of the redshifts of objects based only on flux information obtained through broad filters. Higher-quality, lower-scatter photo-z's will result in smaller random errors on cosmological parameters, while systematic errors in photometric redshift estimates, if not constrained, may dominate all other uncertainties from these experiments. The desired optimization and calibration depend upon spectroscopic measurements for secure redshift information; this is the key application of galaxy spectroscopy for imaging-based dark energy experiments. Hence, to achieve their full potential, imaging-based experiments will require large sets of objects with spectroscopically determined redshifts, for two purposes:

• Training: Objects with known redshift are needed to map out the relationship between object color and z (or, equivalently, to determine empirically calibrated templates describing the rest-frame spectra of the full range of galaxies, which may be used to predict the color–z relation). The ultimate goal of training is to minimize each moment of the distribution of differences between photometric redshift estimates and the true redshifts of objects, making the relationship between them as tight as possible. The larger and more complete our "training set" of spectroscopic redshifts is, the smaller the RMS photo-z errors should be, increasing the constraining power of imaging experiments.

Requirements: Spectroscopic redshift measurements for ∼30,000 objects over ≳15 widely separated regions, each at least ∼20 arcmin in diameter, and reaching the faintest objects used in a given experiment, will likely be necessary if photometric redshifts are to be trained and calibrated with conventional techniques.
Larger, more complete samples (i.e., with longer exposure times) can improve photo-z algorithms and further reduce scatter, greatly enhancing the science return from planned experiments (increasing the Dark Energy Task Force figure of merit by up to ∼50%).
This white paper describes the LSST Dark Energy Science Collaboration (DESC), whose goal is the study of dark energy and related topics in fundamental physics with data from the Large Synoptic Survey Telescope (LSST). It provides an overview of dark energy science and describes the current and anticipated state of the field. It makes the case for the DESC by laying out a robust analytical framework for dark energy science that has been defined by its members and the comprehensive three-year work plan they have developed for implementing that framework. The analysis working groups cover five key probes of dark energy: weak lensing, large-scale structure, galaxy clusters, Type Ia supernovae, and strong lensing. The computing working groups span cosmological simulations, galaxy catalogs, photon simulations, and a systematic software and computational framework for LSST dark energy data analysis. The technical working groups make the connection between dark energy science and the LSST system. The working groups have close linkages, especially through the use of the photon simulations to study the impact of instrument design and survey strategy on analysis methodology and cosmological parameter estimation. The white paper describes several high-priority tasks identified by each of the 16 working groups. Over the next three years these tasks will help prepare for LSST analysis, make synergistic connections with ongoing cosmological surveys, and provide the dark energy community with state-of-the-art analysis tools. Members of the community are invited to join the DESC, according to the membership policies described in the white paper. Applications for associate membership may be made by submitting the Web form at http://www.slac.stanford.edu/exp/lsst/desc/signup.html with a short statement of the work the applicant wishes to pursue that is relevant to the DESC.
We present weak lensing mass estimates of seven shear-selected galaxy cluster candidates from the Deep Lens Survey. The clusters were previously identified as mass peaks in convergence maps of 8.6 deg² of R-band imaging, and followed up with X-ray and spectroscopic confirmation, spanning a redshift range 0.19–0.68. Most clusters contained multiple X-ray peaks, yielding 17 total mass concentrations. In this paper, we constrain the masses of these X-ray sources with weak lensing, using photometric redshifts from the full set of BVRz′ imaging to properly weight background galaxies according to their lensing distance ratios. We fit both NFW and singular isothermal sphere profiles, and find that the results are insensitive to the assumed profile. We also show that the results do not depend significantly on the assumed prior on the position of the mass peak, but that this may become an issue in future larger samples. The inferred velocity dispersions for the extended X-ray sources range from 250–800 km s−1, with the exception of one source for which no lensing signal was found. This work further establishes shear selection as a viable technique for finding clusters, but also highlights some unresolved issues, such as determination of the mass profile center without biasing the mass estimate, and fully accounting for line-of-sight projections. A follow-up paper will examine the mass–X-ray scaling relations of these clusters.
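The lensing distance ratio used to weight background galaxies is β = D_ls/D_s, which vanishes for sources in front of the lens and approaches unity for distant sources. A minimal sketch of computing this weight in a flat ΛCDM cosmology is shown below; the function names, cosmological parameters, and simple trapezoid integration are our assumptions, not Deep Lens Survey code.

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def comoving_dist(z, om0=0.3, h=0.7, n=4096):
    """Flat-LCDM comoving distance in Mpc via trapezoid integration
    of dz / E(z).  Illustrative only; assumed parameters."""
    zs = np.linspace(0.0, z, n)
    ez = np.sqrt(om0 * (1 + zs) ** 3 + (1 - om0))
    integrand = 1.0 / ez
    dc = np.sum((integrand[1:] + integrand[:-1]) / 2) * (z / (n - 1))
    return C_KM_S / (100.0 * h) * dc

def lensing_weight(z_lens, z_src, **cosmo):
    """Lensing distance ratio beta = D_ls / D_s (flat universe),
    used to weight background galaxies; zero for z_src <= z_lens."""
    if z_src <= z_lens:
        return 0.0
    chi_l = comoving_dist(z_lens, **cosmo)
    chi_s = comoving_dist(z_src, **cosmo)
    # flat universe: the (1 + z_src) angular-diameter factors cancel
    return (chi_s - chi_l) / chi_s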
In this paper, we search for a signature of a large-scale bulk flow by looking for fluctuations in the magnitudes of distant luminous red galaxies (LRGs). We take a sample of LRGs from the Sloan Digital Sky Survey with redshifts of z > 0.08 over a contiguous area of the sky. Neighbouring LRG magnitudes are averaged together to find the fluctuation in magnitudes as a function of right ascension. The result is a fluctuation of a few per cent in flux across roughly 100°. The source of this fluctuation could be a large dipole motion with respect to the LRG sample, a systematic in our treatment of the data set, or the data set itself. A dipole model is fitted to the observed fluctuation, and the three flow parameters, its direction (αb, δb) and magnitude (vb), are constrained. We find that the flow direction is consistent with the direction found by other authors, with αb ∼ 180° and δb ∼ −50°. The flow magnitude, however, was found to be anomalously large, with vb > 4000 km s−1. The LRG angular selection function cannot be sufficiently taken into account in our analysis with the available data, and may be the source of either the anomalous magnitude of the flow signal or possibly the entire fluctuation. However, the fluctuation indicates a flow direction very close to those found using other data sets and analyses. Further investigation with upcoming data is required to confirm this detection.
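Fitting a dipole to a magnitude fluctuation as a function of right ascension reduces, in its simplest form, to a linear least-squares fit of a cosine modulation. The sketch below illustrates that one-dimensional version; it is our own toy reduction, not the paper's full three-parameter (αb, δb, vb) fit, and the function name is hypothetical.

```python
import numpy as np

def fit_ra_dipole(ra_deg, dm):
    """Least-squares fit of a dipole-like modulation
    dm(ra) = a*cos(ra) + b*sin(ra) + c to binned magnitude
    fluctuations.  Returns amplitude A, phase ra0 (deg), offset c.
    Illustrative 1D toy; the paper fits a full 3D flow."""
    ra = np.radians(np.asarray(ra_deg))
    # linear design matrix: the model is linear in (a, b, c)
    X = np.column_stack([np.cos(ra), np.sin(ra), np.ones_like(ra)])
    (a, b, c), *_ = np.linalg.lstsq(X, np.asarray(dm), rcond=None)
    A = np.hypot(a, b)
    ra0 = np.degrees(np.arctan2(b, a)) % 360.0
    return A, ra0, c
```

Writing the model as a·cos α + b·sin α rather than A·cos(α − α0) keeps the fit linear, so no iterative minimization is needed; amplitude and phase are recovered afterwards from (a, b).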
Peculiar velocities of galaxies hosting Type Ia supernovae (SNe Ia) generate a significant systematic effect in deriving the dark energy equation of state w, at the level of a few per cent. Here, we illustrate how the peculiar velocity effect in SNe Ia data can be turned from a 'systematic' into a probe of cosmological parameters. We assume a flat Λ cold dark matter model (w = −1) and use low- and high-redshift SNe Ia data to derive simultaneously three distinct estimates of the matter density Ωm which appear in the problem: from the geometry, from the dynamics, and from the shape of the matter power spectrum. We find that each of the three Ωm's agrees with the canonical value Ωm = 0.25 to within 1σ, for reasonably assumed fluctuation amplitude and Hubble parameter. This is consistent with the standard cosmological scenario for both the geometry and the growth of structure. Fixing Ωm = 0.25 for all three Ωm's, we constrain γ = 0.72 ± 0.21 in the growth factor Ωm(z)^γ, so we cannot currently distinguish between standard Einstein gravity and predictions from some modified gravity models. Future surveys of thousands of SNe Ia, or inclusion of peculiar velocity data, could significantly improve the above tests.
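The γ-parametrization constrained here writes the growth rate as f(z) = Ωm(z)^γ, with γ ≈ 0.55 for standard gravity; the growth factor D(z) then follows from integrating dlnD/dlna = f. A minimal numerical sketch, under assumed values of Ωm and a D(z = 0) = 1 normalization that is our own choice:

```python
import numpy as np

def omega_m_z(z, om0=0.25):
    """Matter density parameter Omega_m(z) in flat LCDM."""
    e2 = om0 * (1 + z) ** 3 + (1 - om0)
    return om0 * (1 + z) ** 3 / e2

def growth_factor(z, gamma=0.55, om0=0.25, zmax=50.0, n=20000):
    """Growth factor D(z) from the parametrized growth rate
    f(z) = Omega_m(z)**gamma, integrating dlnD/dlna = f from
    high redshift.  Normalized so that D(z=0) = 1 (our choice)."""
    def ln_growth(ztarget):
        a = np.linspace(1.0 / (1.0 + zmax), 1.0 / (1.0 + ztarget), n)
        zs = 1.0 / a - 1.0
        f = omega_m_z(zs, om0) ** gamma
        y = f / a  # integrand of  f dln(a) = (f / a) da
        return np.sum((y[1:] + y[:-1]) / 2 * np.diff(a))
    return np.exp(ln_growth(z) - ln_growth(0.0))
```

Raising γ from 0.55 toward the 0.72 central value quoted above suppresses growth at late times, which is how this parametrization separates standard Einstein gravity from some modified gravity models.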
Context. In the next decade, the Large Synoptic Survey Telescope (LSST) will become a major facility for the astronomical community. However, accurately determining the redshifts of the observed galaxies without using spectroscopy is a major challenge. Aims. Reconstruction of the redshifts with high resolution and well-understood uncertainties is mandatory for many science goals, including the study of baryonic acoustic oscillations (BAO). We investigate different approaches to establish the accuracy that can be reached by the LSST six-band photometry. Methods. We construct a realistic mock galaxy catalog, based on the Great Observatories Origins Deep Survey (GOODS) luminosity function, by simulating the expected apparent magnitude distribution for the LSST. To reconstruct the photometric redshifts (photo-z's), we consider a template-fitting method and a neural network method. The photo-z reconstruction from both of these techniques is tested on real Canada-France-Hawaii Telescope Legacy Survey (CFHTLS) data and also on simulated catalogs. We describe a new method to improve photometric redshift reconstruction that efficiently removes catastrophic outliers via a likelihood ratio statistical test. This test uses the posterior probability functions of the fit parameters and the colors. Results. We show that the photometric redshift accuracy will meet the stringent LSST requirements up to redshift ∼2.5 after a selection that is based on the likelihood ratio test or on the apparent magnitude for galaxies with signal-to-noise ratio S/N > 5 in at least 5 bands. The former selection has the advantage of retaining roughly 35% more galaxies for a similar photo-z performance compared to the latter. Photo-z reconstruction using a neural network algorithm is also described. In addition, we utilize the CFHTLS spectro-photometric catalog to outline the possibility of combining the neural network and template-fitting methods. Conclusions. We demonstrate that the photometric redshifts will be accurately estimated with the LSST if a Bayesian prior probability and a calibration sample are used.
We present cosmological parameter constraints from the SFI++ galaxy peculiar velocity survey, the largest galaxy peculiar velocity sample to date. The analysis is performed using the gridding method developed in Abate et al. We concentrate on constraining parameters which are affected by the clustering of matter: σ8 and the growth index γ. Assuming a concordance Λ cold dark matter (ΛCDM) model, we find σ8 = 0.91 (+0.22, −0.18) and γ = 0.55 (+0.13, −0.14) after marginalizing over Ωm. These constraints are consistent with, and have constraining power similar to, the same constraints from other current data sets which use different methods. Recently, there have been several claims that the peculiar velocity measurements do not agree with ΛCDM. Instead, we find that, although a higher value of σ8 and a lower value of Ωm are preferred, the values are still consistent when compared with Wilkinson Microwave Anisotropy Probe 5 results. We note that although our analysis probes a variety of scales, the constraints will be dominated by the smaller scales, which have the smallest uncertainties. These results show that peculiar velocity analysis is a vital probe of cosmology, providing competitive constraints on parameters such as σ8. Its sensitivity to the derivative of the growth function, particularly down to redshift zero, means that it can provide a vital low-redshift anchor on the evolution of structure formation. Utilizing different probes with varying systematics is also essential for providing a consistency check on the best-fitting cosmological model.