2012
DOI: 10.1007/978-3-642-27440-4_27

Parallel Quasi-Monte Carlo Integration by Partitioning Low Discrepancy Sequences

Abstract: A general concept for parallelizing quasi-Monte Carlo methods is introduced. By considering the distribution of computing jobs across a multiprocessor as an additional problem dimension, the straightforward application of quasi-Monte Carlo methods implies parallelization. The approach in fact partitions a single low-discrepancy sequence into multiple low-discrepancy sequences. This allows for adaptive parallel processing without synchronization, i.e. communication is required only once for the final reduction o…

Cited by 9 publications (3 citation statements)
References 20 publications (43 reference statements)
“…A second option is to dedicate one dimension of the Sobol' sequence to determine whether the weights of a path shall be non-negative or non-positive just by checking whether the component is smaller than 1/2 or not. More details on partitioning one low discrepancy sequence into many are found in [KG12]. If the number of paths is a power of 2, partitioning a network generated by the Sobol' sequence into supporting and inhibiting network as described will result in a zero sum of weights per neuron if neurons in a layer have constant valence.…”
Section: Sampling Quasi-random Paths
confidence: 99%
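The sign-selection trick quoted above can be sketched with the base-2 radical inverse (the van der Corput sequence, which is the first dimension of the Sobol' sequence). This is a minimal illustration of the idea, not code from the cited work; the function name `van_der_corput` is our own.

```python
def van_der_corput(i: int) -> float:
    """Base-2 radical inverse of index i (mirror its bits across
    the binary point); also the first dimension of the Sobol' sequence."""
    result, f = 0.0, 0.5
    while i:
        result += f * (i & 1)
        i >>= 1
        f *= 0.5
    return result

# Dedicate this component to the sign of a path's weights:
# non-negative if it falls in [0, 1/2), non-positive otherwise.
n = 8  # a power of 2
signs = [+1 if van_der_corput(i) < 0.5 else -1 for i in range(n)]

# For n a power of 2, exactly half of the first n radical inverses
# lie in [0, 1/2), so the signs cancel as the quote describes.
assert sum(signs) == 0
```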
“…One approach to parallelization is to partition one low discrepancy sequence into multiple low discrepancy sequences and to assign each one subsequence to a processing element [25]. To do so, a low discrepancy sequence is extended by one dimension.…”
Section: Partitioning Low Discrepancy Sequences
confidence: 99%
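The extension by one dimension can be sketched as follows (our own illustration under the assumption that the extra dimension is the base-2 radical inverse, not the authors' code): split the unit interval into P equal intervals, one per processing element, and assign each point according to where its extra component falls.

```python
from collections import Counter

def van_der_corput(i: int) -> float:
    """Base-2 radical inverse; serves here as the extra dimension."""
    result, f = 0.0, 0.5
    while i:
        result += f * (i & 1)
        i >>= 1
        f *= 0.5
    return result

P = 4   # number of processing elements
n = 16  # points to distribute

# Point i goes to the processing element whose interval
# [p/P, (p+1)/P) contains the extra component.
assignment = [int(van_der_corput(i) * P) for i in range(n)]

# The extra dimension is itself low discrepancy, so the points
# spread evenly: each processing element receives n/P of them.
assert all(c == n // P for c in Counter(assignment).values())
```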
“…All points of the low discrepancy sequence whose additional dimension is element of the p-th interval are assigned to the p-th processing element. For low discrepancy sequences based on radical inversion, the subsequences can be efficiently enumerated per processing element [25].…”
Section: Partitioning Low Discrepancy Sequences
confidence: 99%
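For the base-2 radical inverse this efficient enumeration can be made concrete: selecting the p-th interval fixes the first log2(P) digits of the radical inverse, i.e. the lowest log2(P) bits of the index, so each processing element's indices form a stride-P arithmetic progression. A minimal sketch of this idea, assuming P is a power of 2 (function names are ours):

```python
def van_der_corput(i: int) -> float:
    """Base-2 radical inverse of index i."""
    result, f = 0.0, 0.5
    while i:
        result += f * (i & 1)
        i >>= 1
        f *= 0.5
    return result

def bit_reverse(x: int, bits: int) -> int:
    """Reverse the lowest `bits` bits of x."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (x & 1)
        x >>= 1
    return r

def subsequence(p: int, P: int, count: int) -> list[int]:
    """Indices assigned to processing element p, enumerated directly
    instead of scanning the whole sequence: the interval [p/P, (p+1)/P)
    fixes the lowest log2(P) bits of the index."""
    m = P.bit_length() - 1        # P = 2**m
    offset = bit_reverse(p, m)    # bit reversal is its own inverse
    return [offset + k * P for k in range(count)]

# Every enumerated index really lands in processor p's interval.
P = 4
for p in range(P):
    for i in subsequence(p, P, 8):
        assert int(van_der_corput(i) * P) == p
```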