2016
DOI: 10.1007/s10479-016-2251-z
Minimizing value-at-risk in single-machine scheduling

Abstract: The vast majority of the machine scheduling literature focuses on deterministic problems in which all data is known with certainty a priori. In practice, this assumption implies that the random parameters in the problem are represented by their point estimates in the scheduling model. The resulting schedules may perform well if the variability in the problem parameters is low. However, as variability increases, accounting for this randomness explicitly in the model becomes crucial in order to counteract the ill…

Cited by 29 publications (26 citation statements)
References 70 publications
“…The authors propose a Lagrangian relaxation-based scenario decomposition method considering random processing times and deterministic due dates. The approach in Atakan et al (2016) does not take release dates into consideration and adopts a scenario-based formulation to deal with the stochastic variables, unlike the present work, which explicitly considers the stochastic distributions of processing times and release dates.…”
Section: State of the Art
confidence: 99%
“…Atakan et al (2016) address a similar problem minimizing the VaR of a tardiness-related objective function, namely the Total Tardiness and the Total Weighted Tardiness, for a single machine. The authors propose a Lagrangian relaxation-based scenario decomposition method considering random processing times and deterministic due dates.…”
Section: State of the Art
confidence: 99%
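The Lagrangian relaxation-based scenario decomposition mentioned in these excerpts can be pictured in generic terms: the sequencing variables are copied once per scenario, the copies are tied together by non-anticipativity constraints, and those coupling constraints are dualized so that the problem separates by scenario. The LaTeX sketch below illustrates this standard construction with scenario copies x^s and probabilities p_s; it is a generic illustration, not necessarily the exact dualization used by Atakan et al. (2016).

```latex
% Generic sketch: scenario copies of the sequence, coupled by non-anticipativity
% x^s = sequencing variables under scenario s,  p_s = scenario probability
\begin{align*}
  \min_{x^1,\dots,x^S}\quad & \sum_{s=1}^{S} p_s\, f_s(x^s)
      & \text{(scenario-wise cost)}\\
  \text{s.t.}\quad & x^{s} = x^{s+1},\quad s = 1,\dots,S-1
      & \text{(non-anticipativity)}
\end{align*}
% Dualizing the coupling constraints with multipliers lambda_s (lambda_0 = lambda_S = 0)
% yields a Lagrangian that separates into one deterministic subproblem per scenario:
\begin{equation*}
  L(\lambda) \;=\; \sum_{s=1}^{S}\; \min_{x^s \in X}
      \Bigl( p_s\, f_s(x^s) + (\lambda_s - \lambda_{s-1})^{\top} x^s \Bigr),
  \qquad \max_{\lambda}\, L(\lambda) \;\le\; \text{optimal value.}
\end{equation*}
```

Each subproblem is a deterministic single-machine instance, and the dual bound is typically tightened by subgradient updates of the multipliers.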
“…A sample problem 1||∑C_j with 4 jobs and 5 processing-time scenarios is shown in Figure 2. Let π = (1,2,3,4). It is easily seen that E[F(π)] = 26, VaR_0.5[F(π)] = 29, CVaR_0.5[F(π)] = 34 and Max[F(π)] = 36.…”
Section: Problem Formulations
confidence: 99%
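As a concrete illustration of how the four measures quoted above are evaluated for a fixed sequence, the Python sketch below computes the total completion time F(π) under each processing-time scenario and then the mean, VaR, CVaR, and maximum over equiprobable scenarios. The processing-time matrix is invented for illustration (it is not the instance of Figure 2, which is not reproduced here), and the discrete-scenario VaR/CVaR conventions used may differ in tie-breaking details from the cited paper.

```python
import numpy as np

# Hypothetical data: rows = equiprobable scenarios, columns = jobs.
# NOT the instance of Figure 2 -- an invented matrix for illustration only.
proc_times = np.array([
    [2, 3, 1, 4],
    [3, 2, 2, 5],
    [1, 4, 3, 3],
    [2, 2, 4, 4],
    [3, 3, 2, 2],
])

def total_completion_time(p_row, seq):
    """F(pi): sum of job completion times for sequence `seq` (0-indexed jobs)."""
    completion, total = 0, 0
    for j in seq:
        completion += p_row[j]
        total += completion
    return total

def risk_profile(proc, seq, alpha=0.5):
    # Evaluate F(pi) under every scenario.
    F = np.array([total_completion_time(row, seq) for row in proc])
    n = len(F)
    # VaR_alpha: smallest realized value whose cumulative probability reaches alpha.
    var = np.sort(F)[int(np.ceil(alpha * n)) - 1]
    # CVaR_alpha via the Rockafellar-Uryasev formula, plugging in eta = VaR_alpha.
    cvar = var + np.mean(np.maximum(F - var, 0)) / (1 - alpha)
    return F.mean(), var, cvar, F.max()

pi = [0, 1, 2, 3]  # the sequence (1, 2, 3, 4) in 0-indexed form
print(risk_profile(proc_times, pi))
```

Because the α-quantile minimizes the Rockafellar–Uryasev expression in the discrete case, plugging VaR in directly gives CVaR without an explicit optimization over η.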
“…Robust decision-making formulations were presented by [9][10], and [11] focused on scheduling issues in single-stage production environments and defined a general E-robust scheduling objective when job processing times are independent random variables. Reference [12] imposed a probabilistic constraint on the random TWT (total weighted tardiness) and introduced a risk-averse stochastic programming model. In particular, the objective of the proposed model is to find a non-preemptive static job processing sequence that minimizes the value-at-risk (VaR) measure of the random TWT at a specified confidence level.…”
Section: Introduction
confidence: 99%
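Based on the description quoted above, the risk-averse model of [12] can be sketched as a chance-constrained program. This is a hedged reconstruction from the excerpt; the symbols w_j, T_j(π, p̃), and α are generic notation rather than symbols taken from the paper.

```latex
% Sketch: minimize the value-at-risk of total weighted tardiness (TWT)
\begin{align*}
  \min_{\pi \in \Pi,\; \theta}\quad & \theta\\
  \text{s.t.}\quad & \mathbb{P}\Bigl( \textstyle\sum_{j} w_j\, T_j(\pi, \tilde{p}) \le \theta \Bigr) \ge \alpha
\end{align*}
% At optimality, theta equals VaR_alpha of the random TWT induced by the chosen
% non-preemptive static sequence pi, so minimizing theta minimizes that VaR.
```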
“…Both VaR and CVaR seek to guard against unfavorable realizations of uncertain parameters by going beyond expected values when expressing the uncertainty of system parameters. Reference [12] considered the single-machine scheduling problem of minimizing the VaR measure in the presence of uncertain problem parameters; the authors imposed a probabilistic constraint on the random TWT and introduced a risk-averse stochastic programming model. However, primarily because VaR does not properly account for risk diversification and says nothing about the magnitude of losses beyond the quantile level in question, the class of coherent risk measures was axiomatized and popularized by [15].…”
Section: Introduction
confidence: 99%
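For reference, the standard definitions that separate the two measures discussed in this excerpt are sketched below; this is textbook material in the spirit of the coherent-risk-measure literature cited as [15], not a formulation extracted from the quoted papers.

```latex
% VaR and CVaR of a random cost F at confidence level alpha in (0,1)
\begin{align*}
  \mathrm{VaR}_{\alpha}(F)  &= \min\bigl\{\theta : \mathbb{P}(F \le \theta) \ge \alpha\bigr\},\\
  \mathrm{CVaR}_{\alpha}(F) &= \min_{\eta \in \mathbb{R}}
      \Bigl\{ \eta + \tfrac{1}{1-\alpha}\,\mathbb{E}\bigl[(F-\eta)^{+}\bigr] \Bigr\}
      \;\ge\; \mathrm{VaR}_{\alpha}(F).
\end{align*}
% Unlike VaR, CVaR averages the losses beyond the alpha-quantile -- the very
% shortcoming of VaR highlighted in the excerpt -- and it is a coherent risk measure.
```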