2008
DOI: 10.1080/00207540600993360
Learning effective dispatching rules for batch processor scheduling

Abstract: Batch processor scheduling, where machines can process multiple jobs simultaneously, is frequently harder than its unit-capacity counterpart because an effective scheduling procedure must not only decide how to group the individual jobs into batches, but also determine the sequence in which the batches are to be processed. We extend a previously developed genetic learning approach to automatically discover effective dispatching policies for several batch scheduling environments, and show that these rules yield…
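To make the two-step dispatching idea concrete, the sketch below shows a batch processor driven by a simple priority rule: waiting jobs are ranked, the highest-priority jobs up to the machine capacity are grouped into a batch, and batches are dispatched in the order they are formed. The job attributes, the capacity model, and the slack-based priority expression are assumptions for illustration; they stand in for the rules the paper learns with genetic programming rather than reproducing them.

```python
# Minimal sketch of batch-processor dispatching (illustrative only).
# Assumptions not taken from the paper: batch capacity is a job count,
# a batch takes as long as its longest job, and the priority function is
# a hand-written slack expression standing in for an evolved (GP) rule.
from dataclasses import dataclass
from typing import List

@dataclass
class Job:
    id: int
    ready_time: float   # arrival time of the job
    proc_time: float    # processing time of the job
    due_date: float

def priority(job: Job, now: float) -> float:
    """Illustrative priority: favour jobs with the least slack."""
    slack = job.due_date - now - job.proc_time
    return -slack  # tighter slack -> higher priority

def form_batch(available: List[Job], now: float, capacity: int) -> List[Job]:
    """Group waiting jobs into one batch: take the `capacity` highest-priority jobs."""
    ranked = sorted(available, key=lambda j: priority(j, now), reverse=True)
    return ranked[:capacity]

def schedule(jobs: List[Job], capacity: int) -> List[List[Job]]:
    """Repeatedly form and dispatch batches on a single batch processor."""
    now, queue, batches = 0.0, list(jobs), []
    while queue:
        available = [j for j in queue if j.ready_time <= now]
        if not available:                        # idle until the next arrival
            now = min(j.ready_time for j in queue)
            continue
        batch = form_batch(available, now, capacity)
        batches.append(batch)
        now += max(j.proc_time for j in batch)   # batch lasts as long as its longest job
        for j in batch:
            queue.remove(j)
    return batches
```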

Cited by 68 publications (33 citation statements)
References 33 publications
“…utilisation level). Although some special cases are considered in the literature such as batching [37,118], machine breakdowns [142], and unrelated parallel machines [31], these are very limited. In addition, most scheduling problems handled by GP are dynamic problems where jobs will arrive randomly over time and their information is only available upon their arrivals.…”
Section: Production Scheduling Problems
Citation type: mentioning (confidence: 99%)
“…Yin et al. [142], Ho and Tay [50], Geiger et al. [38], Jakobovic and Budin [59], Jakobovic et al. [60], Tay and Ho [134], Beham et al. [14], Geiger and Uzsoy [37], Baykasoglu [9], Li et al. [73], Tay and Ho [135], Yang et al. [141], Mucientes et al. [83], Baykasoğlu and Göçken [12], Kofler et al. [66], Furuholmen et al. [36], Hildebrandt et al. [46], Kuczapski et al.…”
Citation type: mentioning (confidence: 99%)
“…Geiger, Uzsoy, and Aytuǧ (2006) compare a GP hyper-heuristic approach to well-established dispatching rules for a variety of single-machine problems. In a later paper, Geiger and Uzsoy (2008) apply a GP hyper-heuristic to a single-machine batch processor model, in which the scheduling procedure must perform two steps: grouping the waiting jobs into batches and ordering the batches for processing. Yin, Liu, and Wu (2003) also apply GP to a single-machine scheduling problem, but include stochastic machine breakdowns.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
“…Parametric representation: [21], [38], [42], [46]; [35], [36], [41], [45], [47], [52], [57], [58], [59], [67]. Grammar-based representation: [24], [30], [39]; [22], [23], [25], [26], [27], [28], [29], [31], [32], [33], [34], [37], [40], [43], [44], [48], [49], [50], [51], [52], [53], [54], [55], [56], [60], [61], [62], [63], [64], [65], [66]. The next section (III-A) discusses the two types of learning methods used within hyper-heuristics, followed by a discussion of the selection of attributes to be provided to the hyper-heuristic in Section III-B. The different representations of priority functions are presented in Section III-C together with suitable optimisation algorithms, as they are closely tied to the chosen representation.…”
Section: Supervised Learning
Citation type: mentioning (confidence: 99%)
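As a concrete illustration of the two representations contrasted in the excerpt above, the snippet below sketches a parametric priority function (a fixed expression whose numeric weights are the tuned decision variables) next to a grammar/tree-based one (the expression structure itself is evolved, e.g. by genetic programming). The attribute names, operators, and example values are invented for illustration and are not taken from any of the cited works.

```python
# Hypothetical illustration of two priority-function representations.
# Attribute names, weights, and the example tree are invented for this sketch.

# Parametric representation: fixed functional form, optimiser tunes the weights.
def parametric_priority(weights, job):
    w1, w2, w3 = weights
    return w1 * job["proc_time"] + w2 * job["slack"] + w3 * job["queue_time"]

# Grammar/tree-based representation: the expression structure is evolved;
# here a tree is a nested tuple (operator, left, right) with attribute-name leaves.
def eval_tree(tree, job):
    if isinstance(tree, str):            # terminal node: a job attribute
        return job[tree]
    op, left, right = tree               # internal node: an operator
    a, b = eval_tree(left, job), eval_tree(right, job)
    return {"add": a + b, "sub": a - b, "mul": a * b}[op]

job = {"proc_time": 4.0, "slack": 2.5, "queue_time": 1.0}
print(parametric_priority((0.2, -1.0, 0.5), job))                         # -1.2
print(eval_tree(("sub", "queue_time", ("mul", "proc_time", "slack")), job))  # -9.0
```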