2009
DOI: 10.1177/1094342009106195

Hierarchical Task-Based Programming With StarSs

Abstract: Programming models for multicore and many-core systems are listed as one of the main challenges for computing research in the near future. These programming models should be able to exploit the underlying platform, but should also offer good programmability to enable programmer productivity. They should take the heterogeneity and hierarchy of the underlying platforms into account, while allowing the programmer to remain unaware of the hardware's complexity. I…

Cited by 142 publications (110 citation statements)
References 18 publications (22 reference statements)
“…Recently, in order to cope with resource heterogeneity and enable performance portability, the use of dynamic runtime schedulers has been proposed, such as StarPU [1], StarSs [2], QUARK [17] or PaRSEC [3]. Applications are described as a set of tasks, whose dependencies can be automatically deduced from accesses to shared data with the STF model [1], or explicitly specified [3].…”
Section: Related Work
confidence: 99%
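The automatic deduction of dependencies from data accesses that this citation describes (the STF model) can be sketched as follows. This is an illustrative reconstruction, not the StarPU or StarSs API: the task tuples and function name are hypothetical, and each task simply declares which data items it reads and writes.

```python
# Hypothetical sketch of STF-style dependency inference: edges are derived
# from the order of declared reads and writes to shared data items.

def infer_dependencies(tasks):
    """tasks: list of (name, reads, writes) tuples in submission order.
    Returns a set of (predecessor, successor) dependency edges."""
    edges = set()
    last_writer = {}   # data item -> task that last wrote it
    readers = {}       # data item -> tasks that read it since the last write
    for name, reads, writes in tasks:
        for d in reads:                       # read-after-write dependency
            if d in last_writer:
                edges.add((last_writer[d], name))
        for d in writes:
            if d in last_writer:              # write-after-write dependency
                edges.add((last_writer[d], name))
            for r in readers.get(d, []):      # write-after-read dependency
                if r != name:
                    edges.add((r, name))
        for d in reads:
            readers.setdefault(d, []).append(name)
        for d in writes:
            last_writer[d] = name
            readers[d] = []
    return edges

tasks = [
    ("T1", [], ["A"]),        # T1 writes A
    ("T2", ["A"], ["B"]),     # T2 reads A, writes B -> depends on T1
    ("T3", ["A"], ["C"]),     # T3 reads A, writes C -> depends on T1
    ("T4", ["B", "C"], []),   # T4 reads B and C     -> depends on T2 and T3
]
print(sorted(infer_dependencies(tasks)))
# -> [('T1', 'T2'), ('T1', 'T3'), ('T2', 'T4'), ('T3', 'T4')]
```

Note that T2 and T3 carry no edge between them: both only read A, so the runtime is free to execute them in parallel.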
“…Adapting these distributions to heterogeneous settings is actually a hard algorithmic and technical challenge. However, to cope with the increasing heterogeneity of the architectures, task-based runtime systems (such as StarPU [1], StarSs [2], PaRSEC [3], and others) are currently being proposed, with the goal of enabling better performance portability for applications across different architectures. To this end, such runtime systems provide an efficient separation of the numerical computation from the associated scheduling and resource allocation decisions.…”
Section: Introduction
confidence: 99%
“…StarSs (Star-superscalar) [PBAL09] is a family of languages and runtime systems implemented for different kinds of parallel target platforms, such as CellSs, GPUSs, OmpSs, ClusterSs. Similarly to StarPU, the StarSs model extends sequential computing by discovering data-ready sequential tasks, which are defined by invocations of specific user functions, and scheduling them at run time onto some available execution unit, such as an idle CPU core, a GPU or a Cell SPU.…”
Section: Related Work
confidence: 99%
“…The runtime system then organizes the execution of all tasks respecting the constraints, avoiding the need for the programmer to reason (and potentially make difficult-to-find mistakes) about when the dependencies are fulfilled. StarSs [4] and OmpSs [5] are good examples of such a paradigm. The runtime system of those programming models builds, at runtime, a task graph based on the sequence of function calls and their input/output requirements, and determines which tasks are ready to run.…”
Section: Introduction
confidence: 99%
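The "ready to run" determination this citation mentions can be sketched as a worklist over the task graph: a task becomes ready once all of its predecessors have completed. The function below is a minimal single-threaded illustration, assuming the hypothetical edge representation from above; a real runtime such as StarSs would dispatch each ready task to a worker (CPU core, GPU, etc.) instead of running it inline.

```python
from collections import deque

def run_in_dependency_order(task_names, edges):
    """task_names: list of task names; edges: set of (pred, succ) pairs.
    Returns one valid execution order respecting all dependencies."""
    indegree = {t: 0 for t in task_names}   # unmet predecessors per task
    succs = {t: [] for t in task_names}
    for pred, succ in edges:
        indegree[succ] += 1
        succs[pred].append(succ)
    # Tasks with no unmet dependencies are ready immediately.
    ready = deque(t for t in task_names if indegree[t] == 0)
    order = []
    while ready:
        t = ready.popleft()    # a real runtime would hand t to an idle worker
        order.append(t)
        for s in succs[t]:     # completing t may make successors ready
            indegree[s] -= 1
            if indegree[s] == 0:
                ready.append(s)
    return order

order = run_in_dependency_order(
    ["T1", "T2", "T3", "T4"],
    {("T1", "T2"), ("T1", "T3"), ("T2", "T4"), ("T3", "T4")},
)
print(order)  # T1 first, T4 last; T2 and T3 may run in either order
```

Because T2 and T3 become ready at the same time, a parallel runtime could execute them concurrently; the sequential loop here just picks one ordering.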
“…To obtain speedups from these new architectures, applications need to use parallel algorithms, which are more challenging to develop than their sequential alternatives. Many programming models have been proposed to ease parallel programming, such as Google's MapReduce [1], Intel's TBB [2], OpenMP [3], StarSs [4] and OmpSs [5]. All parallel programming models share the goal of decoupling the programmer from the underlying multicore machine, but they differ from one another in the degree of abstraction.…”
Section: Introduction
confidence: 99%