2011 Fourth International Symposium on Parallel Architectures, Algorithms and Programming
DOI: 10.1109/paap.2011.33

Job Scheduling Optimization for Multi-user MapReduce Clusters

Cited by 15 publications (5 citation statements)
References 4 publications
“…Sarood [34] proposed a model to distribute available nodes and power amongst the queued jobs such that the throughput of HPC data centers is maximized under a given power budget. Numerous job scheduling algorithms [3, 39, 38, 36] have been proposed with various optimization objectives, including system efficiency and throughput. These approaches assume that once scheduled, all jobs run at 100% efficiency.…”
Section: Discussion and Related Work
Mentioning confidence: 99%
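
The allocation idea in this excerpt can be illustrated with a short greedy sketch: queued jobs are admitted in order of throughput per watt until the node pool or the power budget is exhausted. This is only a minimal illustration, not Sarood's model [34]; the job attributes (nodes, power, throughput) and the throughput-per-watt heuristic are assumptions made for the example.

def allocate(jobs, total_nodes, power_budget):
    """jobs: list of dicts with 'nodes', 'power' and 'throughput' keys."""
    scheduled = []
    # Prefer jobs that deliver the most throughput per watt consumed.
    for job in sorted(jobs, key=lambda j: j["throughput"] / j["power"],
                      reverse=True):
        if job["nodes"] <= total_nodes and job["power"] <= power_budget:
            scheduled.append(job)
            total_nodes -= job["nodes"]
            power_budget -= job["power"]
    return scheduled

# Example: with 10 nodes and a 600 W budget, the two most power-efficient
# jobs are admitted and the third stays queued.
jobs = [{"nodes": 4, "power": 200, "throughput": 30},
        {"nodes": 8, "power": 500, "throughput": 60},
        {"nodes": 2, "power": 100, "throughput": 20}]
print(allocate(jobs, total_nodes=10, power_budget=600))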
“…The first MapReduce schedulers operated in First-In-First-Out (FIFO) order. Later, Fair Scheduling [24] was introduced, using max-min fairness to share resources between pools of jobs or users.…”
Section: MapReduce and Portfolio Scheduling
Mentioning confidence: 99%
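
The max-min fairness that Fair Scheduling applies to pools can be sketched as a simple water-filling computation: every pool receives an equal share of the capacity unless it demands less, in which case its surplus is redistributed among the remaining pools. The sketch below is a minimal illustration of that rule, not the Hadoop Fair Scheduler's implementation.

def max_min_fair_shares(capacity, demands):
    """Return the max-min fair allocation for a list of pool demands."""
    remaining = capacity
    allocation = [0.0] * len(demands)
    # Visit pools in order of increasing demand; a pool needing less than
    # the current equal share is fully satisfied, and its surplus is
    # redistributed among the pools that still remain.
    order = sorted(range(len(demands)), key=lambda i: demands[i])
    for rank, i in enumerate(order):
        equal_share = remaining / (len(demands) - rank)
        allocation[i] = min(demands[i], equal_share)
        remaining -= allocation[i]
    return allocation

# Example: 100 slots shared by pools demanding 10, 40 and 80 slots
# yields 10, 40 and 50 -- the surplus flows to the largest pool.
print(max_min_fair_shares(100, [10, 40, 80]))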
“…It manages all issues related to partitioning the input data, scheduling the program's execution and data transfers. Several research papers focus on the MapReduce model, applying it to business domains [10], [11], resolving algorithmic issues [12], [13], or searching for optimization leads [14], [15] (Fig. 4).…”
Section: B. MapReduce
Mentioning confidence: 99%
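
The partition / map / shuffle / reduce flow this excerpt refers to can be sketched in a single process as follows. The function names are illustrative for a word-count job and do not reflect the Hadoop API; a real framework distributes the map and reduce calls across the cluster and handles the data transfers between them.

from collections import defaultdict

def map_fn(line):
    # Emit (word, 1) pairs for a word-count job.
    for word in line.split():
        yield word.lower(), 1

def reduce_fn(key, values):
    # Sum the counts collected for one word.
    return key, sum(values)

def run_mapreduce(records, mapper, reducer):
    # Map phase: apply the mapper to every input record.
    intermediate = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            intermediate[key].append(value)  # shuffle: group values by key
    # Reduce phase: one reducer call per distinct key.
    return [reducer(k, vs) for k, vs in intermediate.items()]

print(run_mapreduce(["the cat sat", "the dog"], map_fn, reduce_fn))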