Cited by 14 publications (7 citation statements)
References 1 publication
“…It has successfully applied generic programming techniques through abstract data structures in order to deploy parallel algorithms [171, 172]. Subsequently, template extensions to the MPI [173] and OpenMP [174] standards have been proposed in order to use built-in lexical abstractions for generic parallel programming, and diverse template libraries now support multicore environments [175] and self-scheduling task parallelism [176].…”
Section: Related Approaches (mentioning)
confidence: 99%
“…Boost.MPI supports automatic serialization of objects, and has served as the model for our work on MPI.NET. In particular, the high-level interfaces in MPI.NET and their resulting use of the low-level MPI library were ported directly from Boost.MPI [7] and its predecessor [10]. However, the generic programming techniques used in Boost.MPI could not be duplicated within C#, forcing MPI.NET to use completely different implementation techniques to achieve suitable performance.…”
Section: Related Work (mentioning)
confidence: 99%
“…More recently, Boost.MPI [7,10] has been developed, providing direct support for the generic programming paradigm. Boost.MPI supports automatic serialization of objects, and has served as the model for our work on MPI.NET.…”
Section: Related Work (mentioning)
confidence: 99%
“…The automatic tools that we consider closely related work include AutoMap/AutoLink [11], C++2MPI [14], and MPI Pre-Processor [28]. In addition, Boost.MPI and Boost.Serialization [17] provide library support for seamless marshaling and unmarshaling. Next we describe and compare the features of each of these approaches to motivate our approach and demonstrate how it improves on existing state-of-the-art.…”
Section: Directly Related Work (mentioning)
confidence: 99%
“…An MPI Datatype is a list of memory offsets describing the data to be marshaled, given the base offset of a structure. However, the approach makes an implicit assumption that all memory in a structure has been allocated. […] Boost.MPI and Boost.Serialization [17] aim at modernizing the C++ interface to MPI by utilizing advanced generic programming techniques [8]. These libraries provide support for automatic marshaling of primitive data types, user-defined classes, and STL container classes.…”
Section: Directly Related Work (mentioning)
confidence: 99%