2012
DOI: 10.1016/j.cpc.2011.12.013
A massively parallel, multi-disciplinary Barnes–Hut tree code for extreme-scale N-body simulations

Cited by 66 publications (54 citation statements)
References 19 publications
“…One possibility is to control the quality of the approximation of f using multipole methods instead of direct summation [26]. Thus, the use of fast summation algorithms not only allows extreme-scale simulations as demonstrated in [27], but also introduces a promising way of particle-based spatial "coarsening".…”
Section: Discussion
confidence: 99%
“…The unfavorable quadratic complexity can be overcome by computing approximate interactions using e.g. Barnes–Hut tree codes [1] or the Fast Multipole Method [11]. Results on the strong scaling of PFASST on extreme scales, simulating merely 4 million particles on up to 262,144 cores, are reported in [26], where the massively parallel Barnes–Hut tree code PEPC [9,10,23,24,27] is applied. There, however, only a very brief discussion of accuracy is given, aiming solely at identifying parameters that generate time-parallel and time-serial solutions of comparable quality that allow for a meaningful comparison in terms of runtimes.…”
Section: Introduction
confidence: 99%
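The statement above contrasts O(N²) direct summation with tree-based approximation. As a rough illustration of the Barnes–Hut idea (not PEPC's actual implementation; the 1D setting, function names, and the softening parameter `eps` are simplifications chosen for brevity), cells far from an evaluation point are replaced by their total mass at the centre of mass, accepted whenever the cell size over the distance falls below an opening angle θ:

```python
import numpy as np

def direct_acc(pos, mass, eps=1e-3):
    """O(N^2) pairwise gravitational accelerations (1D, G = 1, softened)."""
    acc = np.zeros(len(pos))
    for i, x in enumerate(pos):
        for j, y in enumerate(pos):
            if i != j:
                d = y - x
                acc[i] += mass[j] * d / (abs(d) ** 3 + eps)
    return acc

class Node:
    """Binary (1D) tree cell storing total mass and centre of mass."""
    def __init__(self, lo, hi, idx, pos, mass):
        self.lo, self.hi = lo, hi
        self.mass = mass[idx].sum()
        self.com = (pos[idx] * mass[idx]).sum() / self.mass
        self.children = []
        if len(idx) > 1:  # split until each leaf holds one particle
            mid = 0.5 * (lo + hi)
            for sub, box in ((idx[pos[idx] < mid], (lo, mid)),
                             (idx[pos[idx] >= mid], (mid, hi))):
                if len(sub):
                    self.children.append(Node(*box, sub, pos, mass))

def tree_acc(node, x, theta=0.5, eps=1e-3):
    """Barnes-Hut traversal: accept a cell when size / distance < theta."""
    d = node.com - x
    if not node.children or node.hi - node.lo < theta * abs(d):
        return node.mass * d / (abs(d) ** 3 + eps)  # monopole approximation
    return sum(tree_acc(c, x, theta, eps) for c in node.children)
```

Setting θ = 0 forces the traversal down to the leaves and reproduces direct summation exactly, while larger θ trades accuracy for the O(N log N) cost that makes extreme-scale runs feasible; this is the "quality control" knob the quoted passage refers to.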
“…PEPC has been successfully used for simulations of a variety of systems, such as laser-plasma interactions or gravitational problems [7,8,9], and can efficiently make use of modern many-processor supercomputers [7,10].…”
Section: Introduction
confidence: 99%
“…On one hand, current HPC systems lack the computational power, network bandwidth and data storage needed for solving tomorrow's real-world engineering challenges. On the other hand, while emerging peta-scale computing is already a strategic enabler of large-scale simulations in many scientific areas (such as astronomy, biology and chemistry), even the most powerful hardware will fail to deliver on its full potential unless matched with simulation software designed specifically for such environments.

Several papers describe the effort of performing large-scale simulations on supercomputers, covering key areas: molecular dynamics [26], mantle convection in solid earth dynamics [3], massive N-body simulations [36], seismic wave propagation [25], weather prediction [1] or fundamentals of turbulence on channels using the vortex method [37]. A similar list can be obtained from the 2014 ACM Gordon Bell Prize in…”
confidence: 99%
“…Several papers describe the effort of performing large-scale simulations on supercomputers, covering key areas: molecular dynamics [26], mantle convection in solid earth dynamics [3], massive N-body simulations [36], seismic wave propagation [25], weather prediction [1] or fundamentals of turbulence on channels using the vortex method [37]. A similar list can be obtained from the 2014 ACM Gordon Bell Prize in High Performance Computing finalists.…”
confidence: 99%