We use the Fisher-matrix formalism to investigate whether the galaxy bispectrum in redshift space, B, contains additional cosmological information with respect to the power spectrum, P. We focus on a Euclid-like survey and consider cosmological models dominated by dark energy and cold dark matter with Gaussian primordial perturbations. After discussing the phenomenology of redshift-space distortions for the bispectrum, we derive an expression for the cross-covariance between B and P at leading order in perturbation theory. Our equation generalizes previous results that did not consider binning in the orientation of wavevector triangles with respect to the line of sight. By considering Fourier modes with wavenumber k < 0.15 h Mpc⁻¹, we find that B and P set similar constraints on the cosmological parameters. Generally, error bars moderately improve when the two probes are combined. For instance, the joint 68.3 per cent credible region for the parameters that describe a dynamical dark-energy equation of state shrinks by a factor of 2.6 with respect to only using the power spectrum. Regrettably, this improvement is cancelled out when the clustering analysis is combined with priors based on current studies of the cosmic microwave background. In this case, combining B and P does not give any appreciable benefit other than allowing a precise determination of galaxy bias. Finally, we discuss how results depend on the binning strategy for the clustering statistics as well as on the maximum wavenumber. We also show that only considering the bispectrum monopole leads to a significant loss of information.

The main science drivers of the planned next generation of surveys are (i) the nature of dark energy and dark matter, (ii) the neutrino masses, and (iii) the statistical properties of primordial density fluctuations. These surveys will be conducted, for instance, with the Dark Energy Spectroscopic Instrument (DESI, DESI Collaboration et al.
2016a,b), the Euclid satellite (Laureijs et al. 2011), and the Square Kilometre Array (SKA, Maartens et al. 2015).

It is customary to extract cosmological information from galaxy catalogues using the two-point correlation function or its Fourier transform, the power spectrum. Either of these functions fully characterizes a zero-mean Gaussian random field. However, the galaxy distribution displays complex patterns characterized by elongated filaments, compact clusters, and volume-filling underdense regions. These features are not captured by two-point statistics, which do not retain information on the phases of the Fourier modes of the galaxy distribution. Therefore, if measured with sufficient accuracy and precision, higher-order statistics such as the n-point correlation functions (with n > 2) and their Fourier transforms, the polyspectra, should contain additional information.

Until recently, galaxy redshift surveys could only pro-
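The Fisher-matrix formalism invoked throughout these forecasts can be illustrated with a minimal numerical sketch. For a Gaussian likelihood whose data covariance C does not depend on the parameters, the Fisher matrix is F_ij = (∂μ/∂θ_i)ᵀ C⁻¹ (∂μ/∂θ_j), and the marginalized 1σ error on θ_i is √((F⁻¹)_ii). The data vector, derivatives, and covariance below are invented toy numbers, not those of any survey:

```python
import numpy as np

# Toy Fisher forecast: F_ij = (dmu/dtheta_i)^T C^{-1} (dmu/dtheta_j)
# for a Gaussian likelihood whose covariance C is parameter-independent.
# The 3-point "data vector" and its derivatives are purely illustrative.

def fisher_matrix(derivs, cov):
    """derivs: (n_params, n_data) array of model derivatives;
    cov: (n_data, n_data) data covariance."""
    cinv = np.linalg.inv(cov)
    return derivs @ cinv @ derivs.T

derivs = np.array([[1.0, 0.8, 0.5],   # d(model)/d(theta_1), hypothetical
                   [0.2, 0.6, 1.0]])  # d(model)/d(theta_2), hypothetical
cov = np.diag([0.1, 0.1, 0.2])

F = fisher_matrix(derivs, cov)
errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
print(errors)
```

Combining independent probes amounts to summing their Fisher matrices before inverting; marginalized errors are never smaller than the conditional errors 1/√(F_ii).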
Aims. The Euclid space telescope will measure the shapes and redshifts of galaxies to reconstruct the expansion history of the Universe and the growth of cosmic structures. The estimation of the expected performance of the experiment, in terms of predicted constraints on cosmological parameters, has so far relied on various individual methodologies and numerical implementations, which were developed for different observational probes and for the combination thereof. In this paper we present validated forecasts, which combine both theoretical and observational ingredients for different cosmological probes. This work is presented to provide the community with reliable numerical codes and methods for Euclid cosmological forecasts. Methods. We describe in detail the methods adopted for Fisher matrix forecasts, which were applied to galaxy clustering, weak lensing, and the combination thereof. We estimate the required accuracy for Euclid forecasts and outline a methodology for their development. We then compare and improve different numerical implementations, reaching uncertainties on the errors of cosmological parameters that are less than the required precision in all cases. Furthermore, we provide details on the validated implementations, some of which are made publicly available, in different programming languages, together with a reference training set of input and output matrices for a set of specific models. These can be used by the reader to validate their own implementations if required. Results. We present new cosmological forecasts for Euclid. We find that results depend on the specific cosmological model and on the remaining freedom in each setting, for example flat or non-flat spatial cosmologies, or different cuts at non-linear scales. The numerical implementations are now reliable for these settings. We present results for both an optimistic and a pessimistic choice of these settings.
We demonstrate that the impact of cross-correlations is particularly relevant for models beyond a cosmological constant and may allow us to increase the dark energy figure of merit by at least a factor of three.
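The dark energy figure of merit quoted above is conventionally defined from the marginalized covariance of the equation-of-state parameters (w0, wa) as FoM = 1/√det Cov(w0, wa), i.e. the inverse area of their error ellipse. A minimal sketch with an invented three-parameter Fisher matrix:

```python
import numpy as np

# DETF-style dark-energy figure of merit: the inverse area of the
# marginalized (w0, wa) error ellipse, FoM = 1 / sqrt(det Cov(w0, wa)).
# The 3x3 Fisher matrix below (w0, wa, plus one extra parameter) is invented.

F = np.array([[ 40.0, -12.0,  5.0],
              [-12.0,   6.0, -2.0],
              [  5.0,  -2.0,  8.0]])

cov = np.linalg.inv(F)    # full parameter covariance
cov_de = cov[:2, :2]      # marginalized (w0, wa) block
fom = 1.0 / np.sqrt(np.linalg.det(cov_de))
print(fom)
```

Marginalizing is simply selecting the (w0, wa) block of the full covariance; conditioning (fixing the other parameters) would instead invert the corresponding block of F and always yields a larger FoM.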
Context. In metric theories of gravity with photon number conservation, the luminosity and angular diameter distances are related via the Etherington relation, also known as the distance duality relation (DDR). A violation of this relation would rule out the standard cosmological paradigm and point to the presence of new physics. Aims. We quantify the ability of Euclid, in combination with contemporary surveys, to improve the current constraints on deviations from the DDR in the redshift range 0 < z < 1.6. Methods. We start with an analysis of the latest available data, improving previously reported constraints by a factor of 2.5. We then present a detailed analysis of simulated Euclid and external data products, using both standard parametric methods (relying on phenomenological descriptions of possible DDR violations) and a machine learning reconstruction using genetic algorithms. Results. We find that for parametric methods Euclid can (in combination with external probes) improve current constraints by approximately a factor of six, while for non-parametric methods Euclid can improve current constraints by a factor of three. Conclusions. Our results highlight the importance of surveys like Euclid in accurately testing the pillars of the current cosmological paradigm and constraining physics beyond the standard cosmological model.
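The Etherington relation at the heart of this test reads d_L(z) = (1+z)² d_A(z); deviations are commonly folded into a function η(z) = d_L/[(1+z)² d_A], which equals one when the DDR holds. The power-law parametrization below is one common phenomenological choice, used here purely for illustration and not necessarily the paper's exact model:

```python
import numpy as np

# Distance duality: in metric gravity with photon-number conservation,
# d_L(z) = (1 + z)**2 * d_A(z). Violations are folded into
# eta(z) = d_L / ((1 + z)**2 * d_A); eta = 1 when the DDR holds.
# The form d_L = (1 + z)**(2 + eps0) * d_A is a common phenomenological
# choice (illustrative; toy distances, not real data).

def eta(d_L, d_A, z):
    return d_L / ((1.0 + z) ** 2 * d_A)

z = np.linspace(0.1, 1.6, 4)
d_A = 1000.0 * z / (1.0 + z)            # toy angular diameter distance [Mpc]
eps0 = 0.0                              # no violation
d_L = (1.0 + z) ** (2.0 + eps0) * d_A
print(eta(d_L, d_A, z))                 # -> ones when the DDR holds
```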
Context. The data from the Euclid mission will enable the measurement of the angular positions and weak lensing shapes of over a billion galaxies, with their photometric redshifts obtained in combination with ground-based observations. This large dataset, with well-controlled systematic effects, will allow for cosmological analyses using the angular clustering of galaxies (GCph) and cosmic shear (WL). For Euclid, these two cosmological probes will not be independent because they will probe the same volume of the Universe. The cross-correlation (XC) between these probes can tighten constraints, and it is therefore important to quantify its impact for Euclid. Aims. In this study, we therefore extend the recently published Euclid forecasts by carefully quantifying the impact of XC not only on the final parameter constraints for different cosmological models, but also on the nuisance parameters. In particular, we aim to decipher the amount of additional information that XC can provide for parameters encoding systematic effects, such as galaxy bias, intrinsic alignments (IAs), and knowledge of the redshift distributions. Methods. We follow the Fisher matrix formalism and make use of previously validated codes. We also investigate a different galaxy bias model, obtained from the Flagship simulation, as well as additional photometric-redshift uncertainties, and we elucidate the impact of including the XC terms on constraining the latter. Results. Starting with a baseline model, we show that the XC terms reduce the uncertainties on galaxy bias by ∼17% and the uncertainties on IA by a factor of about four. The XC terms also help in constraining the γ parameter for minimal modified gravity models. Concerning galaxy bias, we observe that the role of the XC terms on the final parameter constraints is qualitatively the same irrespective of the specific galaxy-bias model used.
For IA, we show that the XC terms can help in distinguishing between different models, and that neglecting the IA terms can lead to significant biases on the cosmological parameters. Finally, we show that the XC terms can lead to a better determination of the mean of the photometric galaxy distributions. Conclusions. We find that the XC between GCph and WL within the Euclid survey is necessary to extract the full information content from the data in future analyses. These terms help in better constraining the cosmological model and also lead to a better understanding of the systematic effects that contaminate these probes. Furthermore, we find that XC significantly helps in constraining the mean of the photometric-redshift distributions but, at the same time, requires more precise knowledge of this mean with respect to single probes in order not to degrade the final “figure of merit”.
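The benefit of extra XC information for a bias-like nuisance parameter can be caricatured with a toy Fisher calculation: independent probes add at the Fisher level, and here the information carried by the XC terms is mimicked by a third positive-definite block. (In a real analysis the probes are correlated and XC enters through the full joint data covariance instead.) All numbers are invented:

```python
import numpy as np

# Toy probe combination over (cosmological parameter, bias-like nuisance).
# Independent probes add at the Fisher level; the extra information from
# the XC terms is mimicked here by a third positive-definite block.
# All numbers are made up for illustration.

F_gc = np.array([[20.0, 8.0],   # galaxy clustering: degenerate with bias
                 [ 8.0, 5.0]])
F_wl = np.array([[15.0, 0.0],   # weak lensing: insensitive to galaxy bias
                 [ 0.0, 0.1]])
F_xc = np.array([[ 6.0, 3.0],   # stand-in for the XC information
                 [ 3.0, 4.0]])

def marg_error(F, i=0):
    """Marginalized 1-sigma error on parameter i."""
    return float(np.sqrt(np.linalg.inv(F)[i, i]))

no_xc = marg_error(F_gc + F_wl)
with_xc = marg_error(F_gc + F_wl + F_xc)
print(no_xc, with_xc)  # adding (positive-definite) information tightens errors
```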
We provide a comparison between the matter bispectrum derived with different flavours of perturbation theory at next-to-leading order and measurements from an unprecedentedly large suite of N-body simulations. We use the χ² goodness-of-fit test to determine the range of accuracy of the models as a function of the volume covered by subsets of the simulations. We find that models based on the effective-field-theory (EFT) approach have the largest reach, standard perturbation theory has the shortest, and ‘classical’ resummed schemes lie in between. The gain from EFT, however, is less than in previous studies. We show that the estimated range of accuracy of the EFT predictions is heavily influenced by the procedure adopted to fit the amplitude of the counterterms. For the volumes probed by galaxy redshift surveys, our results indicate that it is advantageous to set three counterterms of the EFT bispectrum to zero and measure the fourth from the power spectrum. We also find that large fluctuations in the estimated reach occur between different realisations. We conclude that it is difficult to unequivocally define a range of accuracy for the models containing free parameters. Finally, we approximately account for systematic effects introduced by the N-body technique either in terms of a scale- and shape-dependent bias or by boosting the statistical error bars of the measurements (as routinely done in the literature). We find that the latter approach artificially inflates the reach of EFT models due to the presence of tunable parameters.
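The reach determination described above can be sketched as follows: accumulate the χ² between model and measurements over data points up to some k_max, and report the largest k_max for which the reduced χ² remains below a chosen threshold. The data, model, and threshold below are synthetic and purely illustrative:

```python
import numpy as np

# Sketch of a "reach" determination via a chi^2 goodness-of-fit test:
# include data points up to k_max and return the largest k_max for which
# the cumulative reduced chi^2 stays below a threshold. Everything below
# (data, model, threshold) is synthetic.

rng = np.random.default_rng(0)
k = np.linspace(0.02, 0.30, 15)        # wavenumbers [h/Mpc]
sigma = 0.05 * np.ones_like(k)         # measurement errors
truth = 1.0 + 2.0 * k**2               # "measured" statistic (noise added below)
model = 1.0 + 2.0 * k**2 - 50.0 * k**4 # model that drifts away at high k
data = truth + rng.normal(0.0, sigma)

def reach(k, data, model, sigma, threshold=1.5):
    """Largest k_max whose cumulative reduced chi^2 is below threshold
    (no fitted parameters here, so dof = number of points)."""
    k_max = k[0]
    for n in range(2, len(k) + 1):
        chi2 = np.sum(((data[:n] - model[:n]) / sigma[:n]) ** 2)
        if chi2 / n <= threshold:
            k_max = k[n - 1]
        else:
            break
    return k_max

print(reach(k, data, model, sigma))
```

For models with free parameters (such as the EFT counterterms above), the fitted amplitudes change with k_max, which is one reason a reach defined this way is hard to pin down unequivocally.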