Abstract. We consider a stochastic variant of the single machine total weighted tardiness problem in which job parameters are independent random variables with normal or Erlang distributions. Since even the deterministic problem is NP-hard, it is difficult to find the global optimum for large instances in a reasonable run time. Therefore, we propose a tabu search metaheuristic in this work. Computational experiments show that solutions obtained by the stochastic version of the metaheuristic are more stable (i.e. resistant to data disturbance) than solutions generated by the classic, deterministic version of the algorithm.

Key words: scheduling, uncertain parameters, tabu search, stability.

Jobs from the set J = {1, 2, ..., n} have to be processed without interruption on a single machine that can handle only one job at a time. All jobs become available for processing at the beginning (time zero). Each job i has an integer processing time p_i, a due date d_i and a positive weight w_i. For a given sequence of jobs one can compute the completion time C_i, the tardiness T_i = max{0, C_i − d_i} and the cost w_i · T_i of each job i ∈ J. The objective is to find a job sequence which minimizes the sum of the costs ∑_{i=1}^{n} w_i · T_i. This is a classical problem of scheduling theory.

Literature review

The total weighted tardiness problem is NP-hard [15]. Enumerative algorithms (which use dynamic programming and branch-and-bound approaches) for the problem are described in [22, 29]. These algorithms are a significant improvement over exhaustive search, but they remain laborious and are applicable only to relatively small problems, with the number of jobs not exceeding 50 (80 on a multi-processor computer [29]). The enumerative algorithms mentioned above may require considerable computer resources, both in terms of computation time and core storage.
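The objective above can be evaluated directly: on a single machine, completion times are cumulative sums of processing times along the sequence. The following sketch is illustrative only (the names `Job` and `total_weighted_tardiness` are our own, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class Job:
    p: int  # processing time p_i
    d: int  # due date d_i
    w: int  # positive weight w_i

def total_weighted_tardiness(sequence):
    """Sum of w_i * T_i over the given job sequence, T_i = max{0, C_i - d_i}."""
    t = 0     # running completion time C_i
    cost = 0
    for job in sequence:
        t += job.p
        cost += job.w * max(0, t - job.d)
    return cost

# Small example: job 2 completes at time 5 > d = 3, so it contributes w * T = 1 * 2.
jobs = [Job(p=3, d=4, w=2), Job(p=2, d=3, w=1), Job(p=4, d=10, w=3)]
print(total_weighted_tardiness(jobs))  # -> 2
```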
Therefore, many algorithms have been proposed to find near-optimal schedules in reasonable time. Local search methods start from an initial solution and repeatedly try to improve the current solution by local changes. The interchanges are continued until a solution that cannot be further improved, i.e. a local minimum, is obtained. To increase the performance of local search algorithms, metaheuristics are used, such as tabu search [2, 7], simulated annealing [23], path relinking [4], genetic algorithms [7], and ant colony optimization [9]. A very effective iterated local search method has been proposed by Kirlik and Oguz [13]. The key aspect of that method is its ability to explore an exponential-size neighborhood in polynomial time by a dynamic programming technique.
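The local search scheme described above, with a tabu mechanism, can be sketched as follows. This is a generic swap-neighborhood tabu search under our own assumptions (fixed tenure, best-improvement selection, aspiration on the global best); the algorithm studied in the paper may differ in its neighborhood, tenure policy and stopping rule:

```python
import itertools

def twt(seq):
    """Total weighted tardiness of a sequence of (p, d, w) tuples."""
    t = cost = 0
    for p, d, w in seq:
        t += p
        cost += w * max(0, t - d)
    return cost

def tabu_search(jobs, objective, iters=100, tenure=7):
    seq = list(jobs)
    best, best_cost = list(seq), objective(seq)
    tabu = {}  # swap move (i, j) -> iteration until which it is forbidden
    for it in range(iters):
        best_move, best_move_cost = None, float("inf")
        for i, j in itertools.combinations(range(len(seq)), 2):
            seq[i], seq[j] = seq[j], seq[i]   # try the swap
            c = objective(seq)
            seq[i], seq[j] = seq[j], seq[i]   # undo it
            # A tabu move is still allowed if it beats the global best (aspiration).
            forbidden = tabu.get((i, j), -1) >= it and c >= best_cost
            if not forbidden and c < best_move_cost:
                best_move, best_move_cost = (i, j), c
        if best_move is None:   # every move is tabu and none aspirates
            break
        i, j = best_move
        seq[i], seq[j] = seq[j], seq[i]       # apply the best admissible move
        tabu[(i, j)] = it + tenure
        if best_move_cost < best_cost:
            best, best_cost = list(seq), best_move_cost
    return best, best_cost

start = [(4, 10, 3), (3, 4, 2), (2, 3, 1)]
schedule, cost = tabu_search(start, twt)
print(cost)  # -> 2 (vs. 12 for the starting order)
```

Note that, unlike plain descent, the tabu list lets the search accept non-improving moves once the swap neighborhood contains no improvement, which is what allows it to escape local minima.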
We consider the strongly NP-hard single-machine scheduling problem with deadlines, in which the total weight of late jobs is minimized (1||∑ w_i U_i). Processing times are deterministic values or random variables having Erlang distributions. For this problem we study the tolerance to random parameter changes of solutions constructed by the tabu search metaheuristic. We also present a measure (called stability) that allows an evaluation of the algorithm based on its resistance to random parameter changes. Our experiments show that solutions of the random model are more stable than those of the deterministic model.
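The idea of probing resistance to random parameter changes can be illustrated as follows: fix a schedule, repeatedly redraw each processing time from an Erlang distribution with the same mean, and compare the disturbed objective to the nominal one. This is only an illustrative proxy under our own assumptions (shape k = 3, a simple average ratio), not the paper's exact stability measure:

```python
import random

def twt(seq):
    """Total weighted tardiness of a sequence of (p, d, w) tuples."""
    t = cost = 0
    for p, d, w in seq:
        t += p
        cost += w * max(0, t - d)
    return cost

def erlang(k, mean):
    """Erlang(k, rate) draw as a sum of k exponentials, with mean = k / rate."""
    rate = k / mean
    return sum(random.expovariate(rate) for _ in range(k))

def disturbance_ratio(schedule, trials=200, k=3):
    """Average disturbed cost relative to the nominal cost of a fixed schedule."""
    base = twt(schedule)
    total = 0.0
    for _ in range(trials):
        noisy = [(erlang(k, p), d, w) for p, d, w in schedule]
        total += twt(noisy)
    return total / trials / max(base, 1)

random.seed(0)
schedule = [(3, 4, 2), (2, 3, 1), (4, 10, 3)]
print(disturbance_ratio(schedule))  # ratio near 1 means the cost barely degrades
```

A schedule whose ratio stays close to 1 under disturbance would be called stable in this informal sense; comparing such ratios for schedules built from deterministic versus stochastic data mirrors the comparison carried out in the experiments.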