2002
DOI: 10.1017/s0956796802004343
Parallel and Distributed Haskells

Abstract: Parallel and distributed languages specify computations on multiple processors and have a computation language to describe the algorithm, i.e. what to compute, and a coordination language to describe how to organise the computations across the processors. Haskell has been used as the computation language for a wide variety of parallel and distributed languages, and this paper is a comprehensive survey of implemented languages. We outline parallel and distributed language concepts and classify Haskell exte…

Cited by 49 publications (24 citation statements)
References 81 publications
“…We focus mostly on those with a functional flavour. For surveys, see [28,41]. Broadly speaking, we can divide them into those that use some form of explicit message passing, and those that have more implicit mechanisms for distribution and communication.…”
Section: Related Work
confidence: 99%
“…These two languages show major differences in their coordination concept and, in consequence, in their implementation, while on the other hand, they both capture the main idea of evaluating independent subexpressions in parallel. Another common point is that, in contrast to other parallel Haskells [2], both GpH and Eden are designed as general-purpose coordination languages. Neither of them is dedicated to a certain paradigm such as task-or data-parallelism or pure skeleton-based programming, though the respective coordination schemes can be expressed by both.…”
Section: General-purpose Parallelism In Haskell
confidence: 99%
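The "evaluating independent subexpressions in parallel" idea that the excerpt attributes to both GpH and Eden can be sketched with GHC's built-in `par` and `pseq` combinators (exported from `GHC.Conc` in base). This is a minimal sketch of the coordination style, not actual GpH or Eden code; both languages layer richer coordination constructs on top:

```haskell
import GHC.Conc (par, pseq)

-- Parallel Fibonacci: the two recursive calls are independent, so one
-- is sparked with `par` while `pseq` forces the other in the current
-- thread before the two results are combined.
pfib :: Int -> Integer
pfib n
  | n < 2     = fromIntegral n
  | otherwise = x `par` (y `pseq` (x + y))
  where
    x = pfib (n - 1)
    y = pfib (n - 2)
```

Compiled with `-threaded` and run with `+RTS -N`, the sparks may be picked up by other capabilities; without them, `par` degrades to purely sequential evaluation with the same result, which is exactly the separation of computation from coordination the survey describes.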
“…Other dialects are more explicit in their handling of parallelism and allow what we call general-purpose parallelism, able to capture schemes of parallelism which are not data-oriented. Whereas machine-specific optimisation is easier with specialised structures and operations, these more general approaches present a considerable advantage in language design: It is generally accepted [1,2] that functional languages allow a clean distinction between a computation (or "base") language and independent coordination constructs for parallelism control. The more special-purpose data structures enter the language, the more vague this important distinction will become.…”
Section: Introduction
confidence: 99%
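The clean distinction between a computation language and separate coordination constructs, which this excerpt highlights, can be made concrete with a stripped-down version of GpH-style evaluation strategies. The names `Strategy`, `using`, `rseq`, and `parList` mirror the vocabulary of the `parallel` library, but this is a hand-rolled sketch, not that library's actual API:

```haskell
import GHC.Conc (par, pseq)

-- Coordination: a strategy says HOW to evaluate a value, not what it is.
type Strategy a = a -> ()

using :: a -> Strategy a -> a
using x strat = strat x `pseq` x

-- Evaluate an element to weak head normal form in the current thread.
rseq :: Strategy a
rseq x = x `pseq` ()

-- Spark the evaluation of every list element in parallel.
parList :: Strategy a -> Strategy [a]
parList _     []     = ()
parList strat (x:xs) = strat x `par` parList strat xs

-- Computation: the algorithm itself, free of any parallelism.
squares :: [Int] -> [Int]
squares = map (^ 2)

-- Coordination applied separately, after the fact.
parSquares :: [Int] -> [Int]
parSquares xs = squares xs `using` parList rseq
```

Because `parSquares` is `squares` plus an independently written strategy, the algorithm can be tuned or made sequential again without touching the computation — the design advantage the citing paper is pointing at.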
“…Typically, declarative implementations of skeletons are based on functional languages (like Eden [9], GpH [16], or PMLS [11]) that naturally represent skeletons as higher-order functions. These languages also make it possible to prove the correctness of skeletons [13].…”
confidence: 99%
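The excerpt's point that functional languages naturally represent skeletons as higher-order functions can be illustrated with a map skeleton. `parMapSketch` is an illustrative name for this sketch, not a function taken from Eden, GpH, or PMLS; it uses GHC's base-library `par`/`pseq`:

```haskell
import GHC.Conc (par, pseq)

-- A parallel-map skeleton: an ordinary higher-order function whose
-- implementation sparks the application of `f` to each list element
-- while the rest of the list is processed.
parMapSketch :: (a -> b) -> [a] -> [b]
parMapSketch _ []     = []
parMapSketch f (x:xs) = y `par` (ys `pseq` (y : ys))
  where
    y  = f x
    ys = parMapSketch f xs
```

Because the skeleton is just a function, it is extensionally equal to `map`, so the equational reasoning available for `map` carries over to the skeleton — the basis for the kind of correctness proofs the excerpt cites as [13].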