Published: 2009
DOI: 10.1016/j.epsr.2009.03.008
Optimum maintenance policy using semi-Markov decision processes

Cited by 37 publications (16 citation statements) · References 4 publications
“…B above. The detail is similar with calculation of two and one maintenance models discussed in [12].…”
Section: Optimal Policy Determination (citation type: mentioning)
Confidence: 91%
“…After inspection, engineers need to determine whether minor or major maintenance should be performed or not. The procedure of modeling is similar with the modeling in paper [12] [17], with policy improvement method.…”
Section: B. Including Inspection Into SMDP Modeling (citation type: mentioning)
Confidence: 99%
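The policy improvement method referred to above can be illustrated with a small, self-contained sketch. The Python example below runs policy iteration on a hypothetical semi-Markov maintenance model with exponential sojourn times, three maintenance actions (none, minor, major) and invented costs and transition probabilities; it is not the formulation used in [12], [17] or the cited paper, only an illustration of the evaluation/improvement loop.

```python
import numpy as np

# Minimal sketch of policy iteration for a small SMDP maintenance model.
# States are hypothetical deterioration levels; actions are 0 = do nothing,
# 1 = minor maintenance, 2 = major maintenance. All numbers are illustrative.

states = ["D1", "D2", "D3", "F"]          # D1 new .. D3 badly worn, F failed
alpha = 0.1                                # continuous-time discount rate

# cost[s][a]: lump-sum cost of taking action a in state s (illustrative)
cost = np.array([
    [0.0,   50.0, 200.0],
    [0.0,   50.0, 200.0],
    [0.0,   80.0, 200.0],
    [500.0, np.inf, 600.0],                # failed: outage penalty or major repair
])

# rate[s][a]: exponential sojourn rate after (s, a); higher = leaves state sooner
rate = np.array([
    [0.2, 0.2, 0.2],
    [0.4, 0.3, 0.2],
    [0.8, 0.4, 0.2],
    [1.0, 1.0, 1.0],
])

# P[a][s][s']: transition probabilities at the next jump (illustrative)
P = np.zeros((3, 4, 4))
P[0] = [[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [0, 0, 0, 1]]  # none: drift toward failure
P[1] = [[1, 0, 0, 0], [1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]]  # minor: one level better
P[2] = [[1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0], [1, 0, 0, 0]]  # major: back to new

def beta(s, a):
    # expected discount factor over an exponential sojourn: E[exp(-alpha * T)]
    return rate[s, a] / (rate[s, a] + alpha)

def evaluate(policy):
    # policy evaluation: solve V = c_pi + B_pi P_pi V as a linear system
    n = len(states)
    A, b = np.eye(n), np.zeros(n)
    for s in range(n):
        a = policy[s]
        A[s] -= beta(s, a) * P[a, s]
        b[s] = cost[s, a]
    return np.linalg.solve(A, b)

def improve(V):
    # greedy one-step lookahead: pick the cheapest action in every state
    new_policy = np.zeros(len(states), dtype=int)
    for s in range(len(states)):
        q = [cost[s, a] + beta(s, a) * P[a, s] @ V for a in range(3)]
        new_policy[s] = int(np.argmin(q))
    return new_policy

policy = np.zeros(len(states), dtype=int)   # start with "do nothing" everywhere
while True:
    V = evaluate(policy)
    new_policy = improve(V)
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy

for s, name in enumerate(states):
    print(name, ["none", "minor", "major"][policy[s]], round(V[s], 1))
```

Policy evaluation solves a small linear system for the current rule, and the improvement step replaces the action in any state where another action has a lower one-step look-ahead cost; the loop stops when no state changes, which is the standard policy improvement scheme the citation refers to.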
“…An example is presented to illustrate the implementation of proposed method. A semi-Markov decision process (SMDP) is utilized in [37] to determine whether maintenance should be performed in each deterioration state and, if so, what type of maintenance to perform for repairable power equipment. In [38], road deterioration is modeled as a semi-Markov process in which the state transition has the Markov property and the holding time in each state is assumed to follow a discrete Weibull distribution.…”
Section: Literature Review (citation type: mentioning)
Confidence: 99%
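As a rough illustration of the semi-Markov deterioration model described for [38], the sketch below simulates a small deterioration chain whose holding times follow a discrete Weibull distribution, assuming the Nakagawa–Osaki parameterization with survival function S(t) = q^(t^β) and inverse-transform sampling. The states, transition probabilities and (q, β) values are invented for the example and are not taken from [37] or [38].

```python
import math
import random

def discrete_weibull(q, beta):
    """Sample T in {1, 2, ...} with P(T > t) = q**(t**beta) (inverse transform)."""
    u = random.random()
    t = (math.log(u) / math.log(q)) ** (1.0 / beta)
    return max(1, math.ceil(t))

states = ["good", "fair", "poor", "failed"]

# holding-time parameters per state: (q, beta); beta > 1 gives an increasing hazard
holding = {"good": (0.97, 1.5), "fair": (0.93, 1.5), "poor": (0.85, 1.5)}

# embedded Markov chain: where the process jumps once the holding time elapses
transition = {
    "good": [("fair", 0.9), ("poor", 0.1)],
    "fair": [("poor", 0.85), ("failed", 0.15)],
    "poor": [("failed", 1.0)],
}

def next_state(s):
    r, acc = random.random(), 0.0
    for target, p in transition[s]:
        acc += p
        if r <= acc:
            return target
    return transition[s][-1][0]

def simulate(start="good"):
    """Return the (state, holding time) trajectory until failure."""
    s, path = start, []
    while s != "failed":
        stay = discrete_weibull(*holding[s])   # periods spent in the current state
        path.append((s, stay))
        s = next_state(s)                      # Markovian jump to the next state
    return path

random.seed(1)
print(simulate())   # trajectory of (state, holding time) pairs until failure
```

The jump chain alone is Markovian, while the state-dependent discrete Weibull holding times give the process its semi-Markov character, which matches the description of the deterioration model in the quoted passage.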