2013 International Conference on High Performance Computing & Simulation (HPCS)
DOI: 10.1109/hpcsim.2013.6641463

Proper parallel Monte Carlo for computed tomography of volcanoes

Abstract: Most Monte Carlo simulations can be parallelized, or at least easily distributed. However, the parallelization of such programs is not mainstream. In this paper, we propose good practices for distributing a Monte Carlo simulation for computed tomography. We apply this to large edifices (such as volcanoes or pyramids), with a particular focus on performance. The work comprised, first, the optimization of an existing sequential prototype and, second, the use of reliable parallel random streams. The optimized paralle…

Cited by 3 publications (2 citation statements)
References 20 publications
“…Researchers have highlighted many problems in various application frameworks, particularly in nuclear medicine simulations, which often require more than 10^20 pseudorandom numbers deployed on thousands of processors. [2][3][4][5][6][7] When considering stochastic simulations, pseudorandom numbers must be generated in parallel, so that each computing element can autonomously obtain its own stream of random numbers independently of the other computing elements. If such independence isn't guaranteed, parallelism is affected and the independence of parallel streams is questioned, with potentially serious consequences because the quality of the final simulation results could be heavily damaged.…”

Section: Parallel Stochastic Simulations and Reproducibility (mentioning, confidence: 99%)
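The citing passage turns on one practical requirement: each computing element must draw from its own independent, reproducible random stream. As a minimal sketch of that idea (an illustration of ours using NumPy's SeedSequence.spawn, not the PRNG toolkit the cited works actually evaluate), each worker below receives a spawned child seed, so no two workers share or overlap a stream:

```python
# Sketch: one independent pseudorandom stream per computing element.
# Assumptions: NumPy >= 1.17 (SeedSequence API) and a toy pi estimate
# standing in for the real Monte Carlo tomography kernel.
import numpy as np
from multiprocessing import Pool

def monte_carlo_worker(seed_seq):
    """Build a private generator from a spawned child seed, then run
    the stochastic kernel with it. Spawned children are designed to be
    statistically independent of each other and of the root sequence."""
    rng = np.random.default_rng(seed_seq)
    n = 1_000_000
    pts = rng.random((n, 2))  # uniform points in the unit square
    # Fraction landing inside the quarter circle, scaled to estimate pi.
    return 4.0 * np.mean(pts[:, 0] ** 2 + pts[:, 1] ** 2 <= 1.0)

if __name__ == "__main__":
    root = np.random.SeedSequence(20130710)  # single master seed
    children = root.spawn(8)                 # 8 independent child streams
    with Pool(8) as pool:
        estimates = pool.map(monte_carlo_worker, children)
    print(sum(estimates) / len(estimates))
```

Spawning every stream from a single master seed also keeps the whole run reproducible: rerunning with the same root seed regenerates each worker's exact stream, which is precisely the reproducibility concern the cited section raises.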