High Performance Computing in Science and Engineering, Garching/Munich 2007
DOI: 10.1007/978-3-540-69182-2_3

The ART of Cosmological Simulations

Abstract: We describe the basic ideas of the MPI parallelization of the N-body Adaptive Refinement Tree (ART) code. The code uses self-adaptive domain decomposition in which the boundaries of the domains (parallelepipeds) constantly move, with many degrees of freedom, in search of the minimum CPU time. The actual CPU time spent by each MPI task on the previous time-step is used to adjust the boundaries for the next time-step. For a typical decomposition into 5³ domains, the number of possible changes in boundaries is 3⁸⁴ ≈ 10⁴⁰ …
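The load balancing described in the abstract is greedy: each boundary degree of freedom has three options per step (move one way, stay, move the other way), which for the 5³ decomposition gives the quoted 3⁸⁴ ≈ 10⁴⁰ possible adjustments. The sketch below illustrates this per-step adjustment along a single axis only. It is a minimal illustration under assumed conventions (cell-indexed boundaries, a one-cell shift granularity, the hypothetical function adjust_boundaries), not the actual ART implementation.

```c
/* Minimal sketch of per-step boundary adjustment for load balancing
 * along ONE axis of a domain decomposition.  NOT the ART code itself:
 * the cost model, data layout, and names are illustrative assumptions. */
#include <stdio.h>

#define NCELLS 100   /* grid cells along this axis                  */
#define NDOM   5     /* domains along this axis (5^3 decomposition) */

/* Shift each interior boundary by at most one cell toward the side
 * that was slower on the previous step: three choices per boundary
 * (left, stay, right), which in the full 5^3 decomposition multiplies
 * up to the 3^84 configurations quoted in the abstract. */
static void adjust_boundaries(int bnd[NDOM + 1], const double cpu[NDOM])
{
    for (int b = 1; b < NDOM; ++b) {
        if (cpu[b - 1] > cpu[b] && bnd[b] - bnd[b - 1] > 1)
            bnd[b] -= 1;              /* left domain slower: shrink it  */
        else if (cpu[b] > cpu[b - 1] && bnd[b + 1] - bnd[b] > 1)
            bnd[b] += 1;              /* right domain slower: shrink it */
    }
}

int main(void)
{
    int bnd[NDOM + 1] = {0, 20, 40, 60, 80, NCELLS};  /* uniform start */
    /* hypothetical per-task CPU times measured on the previous step   */
    double cpu[NDOM] = {1.8, 1.1, 0.9, 1.0, 1.2};

    adjust_boundaries(bnd, cpu);

    for (int d = 0; d < NDOM; ++d)
        printf("domain %d: cells [%d, %d)\n", d, bnd[d], bnd[d + 1]);
    return 0;
}
```

Repeating such small shifts every time-step lets the decomposition track the evolving density field without ever searching the full configuration space.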

Cited by 4 publications (4 citation statements)
References 18 publications (21 reference statements)

Citation statements
“…The Bolshoi simulation has been performed using the Adaptive Refinement Tree (ART) code (Kravtsov et al. 1997). The code was parallelized using MPI libraries and OpenMP directives (Gottlöber & Klypin 2009). The simulation is described in detail in (…).…”
Section: Simulation Data (mentioning, confidence: 99%)
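The excerpt above refers to a hybrid parallelization: MPI between domains, OpenMP threads within each domain. For readers unfamiliar with that pattern, the skeleton below shows its generic structure. It is a schematic sketch, not code from ART; the loop body and array are placeholders.

```c
/* Generic hybrid MPI + OpenMP skeleton: MPI ranks own domains,
 * OpenMP threads share the work inside each domain.  Illustrative only. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

#define NLOCAL 1000000   /* particles owned by this rank (placeholder) */

static double pos[NLOCAL];                /* placeholder particle data */

int main(int argc, char **argv)
{
    int rank, nranks, provided;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    double local_work = 0.0;

    /* threads inside one MPI task share the per-domain loop */
    #pragma omp parallel for reduction(+:local_work)
    for (int i = 0; i < NLOCAL; ++i) {
        pos[i] += 1e-3;                   /* stand-in for a force/drift */
        local_work += pos[i];
    }

    double total = 0.0;                   /* tasks combine their results */
    MPI_Allreduce(&local_work, &total, 1, MPI_DOUBLE, MPI_SUM,
                  MPI_COMM_WORLD);

    if (rank == 0)
        printf("%d ranks x %d threads, total = %g\n",
               nranks, omp_get_max_threads(), total);

    MPI_Finalize();
    return 0;
}
```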
“…The simulation we use is the "L1000W" presented in Tinker et al. (2008, 2010), where the halo mass function and bias function are calibrated. It evolves 1024³ particles with m_p = 6.98 × 10¹⁰ h⁻¹ M⊙ in a periodic box of co-moving length 1000 h⁻¹ Mpc using the Adaptive Refinement Tree (ART; Kravtsov et al. 1997; Gottlöber & Klypin 2008) code. For the mass scales we consider here, M > 5 × 10¹³ h⁻¹ M⊙, it has at least ∼1000 particles for each halo and is thus well-suited for the study of ξ_hm and ΔΣ.…”
Section: Halo-mass Correlation Function (mentioning, confidence: 99%)
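As a sanity check on the numbers quoted above, the particle mass follows from the box size, the particle count, and the mean matter density: m_p = Ω_m ρ_crit (L/N)³. The snippet below reproduces the quoted 6.98 × 10¹⁰ h⁻¹ M⊙ under the assumption Ω_m ≈ 0.27, which is not stated in the excerpt.

```c
/* Back-of-the-envelope check of the quoted L1000W particle mass.
 * Omega_m ~ 0.27 is an assumed value, not stated in the excerpt;
 * rho_crit is the standard critical density in h^2 Msun / Mpc^3. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double rho_crit = 2.775e11;   /* h^2 Msun / Mpc^3            */
    const double omega_m  = 0.27;       /* assumed matter density      */
    const double box      = 1000.0;     /* comoving box size, h^-1 Mpc */
    const double npart    = 1024.0;     /* particles per dimension     */

    double cell = box / npart;                        /* h^-1 Mpc  */
    double mp   = omega_m * rho_crit * pow(cell, 3);  /* h^-1 Msun */

    printf("m_p = %.3g h^-1 Msun\n", mp);  /* ~6.98e10, as quoted */
    return 0;
}
```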
“…As we want to study γ-rays from large structures in the Local Universe such as nearby galaxy clusters, we choose the Box160CR simulation. This is a constrained realization with 1024³ particles in a cube of 160 h⁻¹ Mpc on a side which was run using the MPI-ART cosmological code (Kravtsov et al. 1997; Gottlöber & Klypin 2008). The initial conditions are set assuming WMAP3 cosmology (with Ω_m = 0.24, Ω_Λ = 0.76, Ω_b = 0.042, h = 0.73, σ_8 = 0.75, and n = 0.95) and implement the constraints from the observed density field so that it reproduces the observed matter distribution in the local universe on large scales at redshift z = 0 (Hoffman & Ribak 1991; Klypin et al. 2003).…”
Section: Constrained Simulations of the Local Universe (mentioning, confidence: 99%)