Work centre-based decomposition approaches, especially variants of the Shifting Bottleneck algorithm, have been very successful in solving job-shop-scheduling problems. These methods decompose the problem into subproblems involving a single work centre (usually a single machine), which they solve sequentially. We propose new measures of subproblem criticality and show via computational experiments that several of these provide solutions comparable in quality to those obtained in previous work, in substantially less central processing unit (CPU) time.
Introduction

Effective factory scheduling, which involves the allocation of machines to competing jobs over time, is critical to success in today's highly competitive global industries. While many industrial problems are significantly more complex, the job-shop-scheduling problem has served as a vehicle for many researchers to develop and test new scheduling procedures. In this problem, there are n jobs to be processed on m machines. The routings of the jobs are deterministic and known a priori, as are the processing times of each job on each machine. Each machine can process only one job at a time. We consider here the problems of minimizing makespan (C_max) and maximum lateness (L_max) in this environment. Both of these problems, denoted by J//C_max and J//L_max, respectively, in the notation of Lawler et al. (1993), have been shown to be NP-hard in the strong sense (Garey and Johnson 1979). Hence, some researchers have focused on developing exact procedures based on branch-and-bound algorithms (e.g. Carlier and Pinson 1989, Brucker et al. 1994). These methods produce exact solutions, but their worst-case computational burden increases exponentially with the size of the problem instance.

To apply optimization-based algorithms to industrial problems, an algorithm must generate its schedule quickly enough for the resulting schedule to be usable. In semiconductor manufacturing environments, where several hundred jobs must be scheduled on several hundred pieces of equipment, algorithms giving high solution quality usually require quite high central processing unit (CPU) times, and hence any reduction in CPU time represents a significant advantage in terms of the practical applicability of the algorithm. A reduction of the order of 10-15% may well represent an absolute
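To make the problem setting concrete, the following minimal sketch evaluates C_max and L_max for a small, hypothetical job-shop instance (the instance, due dates, and function names are illustrative assumptions, not data from this paper). Each job is an ordered list of (machine, processing-time) operations; given a feasible sequencing of all operations, each operation starts as early as its machine and its job predecessor allow.

```python
def simulate(jobs, order):
    """Return each job's completion time C_j for a feasible operation order.

    jobs  : list of routings; jobs[j] is a list of (machine, proc_time) pairs.
    order : list of (job, op_index) pairs; must respect each job's routing.
    """
    machine_free = {}  # machine -> time the machine next becomes available
    job_free = {}      # job -> time the job's previous operation finishes
    for job, op in order:
        machine, p = jobs[job][op]
        start = max(machine_free.get(machine, 0), job_free.get(job, 0))
        machine_free[machine] = job_free[job] = start + p
    return job_free

# Hypothetical 3-job, 3-machine instance.
jobs = [
    [(0, 3), (1, 2), (2, 2)],  # job 0: machine 0 for 3, then 1 for 2, then 2 for 2
    [(0, 2), (2, 1), (1, 4)],  # job 1
    [(1, 4), (2, 3)],          # job 2 visits only two machines
]

# One feasible interleaving of the operations (routings respected).
order = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1), (0, 2), (1, 2)]

completion = simulate(jobs, order)
c_max = max(completion.values())                     # makespan
due = {0: 10, 1: 12, 2: 8}                           # hypothetical due dates
l_max = max(completion[j] - due[j] for j in due)     # maximum lateness
print(c_max, l_max)  # -> 11 1
```

Different operation orders yield different values of C_max and L_max; the scheduling problem is to find an order minimizing the chosen objective.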