Discovering the mass of neutrinos is a principal goal in high energy physics and cosmology. In addition to cosmological measurements based on two-point statistics, the neutrino mass can also be estimated by observations of neutrino wakes resulting from the relative motion between dark matter and neutrinos. Such a detection relies on an accurate reconstruction of the dark matter-neutrino relative velocity, which is affected by non-linear structure growth and galaxy bias. We investigate our ability to reconstruct this relative velocity using large N-body simulations in which we evolve neutrinos as distinct particles alongside the dark matter. We find that the dark matter velocity power spectrum is overpredicted by linear theory, whereas the neutrino velocity power spectrum is underpredicted. The magnitude of the relative velocity observed in the simulations is found to be lower than what is predicted in linear theory. Since neither the dark matter nor the neutrino velocity fields are directly observable from galaxy or 21 cm surveys, we test the accuracy of a reconstruction algorithm based on halo density fields and linear theory. Assuming prior knowledge of the halo bias, we find that the reconstructed relative velocities are highly correlated with the simulated ones, with correlation coefficients of 0.94, 0.93, 0.91 and 0.88 for neutrinos of mass 0.05, 0.1, 0.2 and 0.4 eV, respectively. We confirm that the relative velocity field reconstructed from large scale structure observations such as galaxy or 21 cm surveys can be accurate in direction and, with appropriate scaling, magnitude.
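The reconstruction described above maps an observed (biased) density field to a velocity field via linear theory. A minimal sketch of the core step is shown below, using the linearized continuity equation, v(k) = i f a H (k/k²) δ(k); the function name, grid conventions, and the single `faH` prefactor are illustrative assumptions, not the paper's actual pipeline (which additionally handles halo bias and survey geometry).

```python
import numpy as np

def reconstruct_velocity(delta, box_size, faH):
    """Sketch: linear-theory velocity field from a 3D density contrast grid.

    Uses the linearized continuity equation in Fourier space,
        v(k) = i * faH * k / k^2 * delta(k),
    where faH collects the growth-rate and expansion prefactors.
    Returns an array of shape (3, n, n, n): the three velocity components.
    """
    n = delta.shape[0]
    delta_k = np.fft.rfftn(delta)
    k = 2 * np.pi * np.fft.fftfreq(n, d=box_size / n)     # full frequencies
    kz = 2 * np.pi * np.fft.rfftfreq(n, d=box_size / n)   # half (real FFT) axis
    kx, ky, kz = np.meshgrid(k, k, kz, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0  # avoid division by zero; the zero mode carries no velocity
    vel = []
    for ki in (kx, ky, kz):
        vk = 1j * faH * ki / k2 * delta_k
        vk[0, 0, 0] = 0.0
        vel.append(np.fft.irfftn(vk, s=delta.shape))
    return np.stack(vel)
```

Applying this separately to (bias-corrected) halo and neutrino density fields and differencing the results gives a linear-theory estimate of the relative velocity field.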
Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world's largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
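To make the 24-byte-to-9-byte figure concrete: a particle in single precision carries three 4-byte position floats and three 4-byte velocity floats. One plausible way to reach roughly 9 bytes, sketched below, is to store each position coordinate as a 16-bit fixed-point offset within its host mesh cell (3 x 2 bytes) and each velocity component as an 8-bit quantized value (3 x 1 byte). This is an illustrative scheme, not the actual CUBEP3M/TianNu compression format; the cell index is assumed to be recoverable from the storage layout rather than stored per particle.

```python
import numpy as np

def compress(pos, vel, cell_size, vmax):
    """Lossy-compress particle phase space: 16-bit cell offsets + 8-bit velocities.

    pos, vel: float arrays of shape (N, 3). The returned cell index would, in a
    real code, be implied by where the particle is stored rather than kept here.
    """
    cell = np.floor(pos / cell_size).astype(np.int64)
    frac = pos / cell_size - cell                        # offset in [0, 1)
    pos16 = np.round(frac * 65535).astype(np.uint16)     # 2 bytes per coordinate
    vel8 = np.round((np.clip(vel, -vmax, vmax) / vmax + 1) * 127.5).astype(np.uint8)
    return cell, pos16, vel8                             # 9 bytes/particle payload

def decompress(cell, pos16, vel8, cell_size, vmax):
    """Invert compress() up to quantization error."""
    pos = (cell + pos16 / 65535.0) * cell_size
    vel = (vel8 / 127.5 - 1.0) * vmax
    return pos, vel
```

The trade-off is a bounded quantization error: position error is at most about half a cell width divided by 2^16, and velocity error at most vmax/255 per component, which must be kept below the scales that matter for the clustering statistics of interest.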