2014
DOI: 10.1016/j.ress.2014.04.006

Planning structural inspection and maintenance policies via dynamic programming and Markov processes. Part II: POMDP implementation

Cited by 106 publications (47 citation statements). References 29 publications.
“…By following policy π*, the agent optimally explores and exploits in the POMDP, without the need of explicitly computing any VoI. Component-level inspection scheduling can be incorporated in the POMDP framework by including exploratory actions [2][3][4][5][6][7]. In the example reported in [6], action "visual inspection" is listed in the available options, together with action "repair".…”
Section: Information Gathering in POMDPs (mentioning)
confidence: 99%
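As a concrete illustration of the exploratory actions described in the excerpt above, the short Python sketch below shows how a purely information-gathering action ("visual inspection") refines the belief by Bayes' rule, while a "repair" action changes the underlying condition, so an agent following the POMDP policy trades off exploration and exploitation without an explicit VoI computation. The two-state deterioration model, the action names, and all probability values are hypothetical assumptions for illustration, not quantities from the cited paper.

```python
import numpy as np

# Hypothetical two-state condition model: 0 = "intact", 1 = "damaged".
# Transition matrices T[a][s, s'] for each action (rows sum to 1).
T = {
    "do-nothing":        np.array([[0.9, 0.1],    # intact component may degrade
                                    [0.0, 1.0]]),  # damage persists
    "visual-inspection": np.array([[0.9, 0.1],    # inspection does not alter the state
                                    [0.0, 1.0]]),
    "repair":            np.array([[1.0, 0.0],    # repair restores the component
                                    [1.0, 0.0]]),
}

# Observation matrices O[a][s', o]: probability of observing o = "looks-ok" (0)
# or "looks-damaged" (1) after action a leads to state s'.
O = {
    "do-nothing":        np.array([[0.5, 0.5], [0.5, 0.5]]),  # uninformative
    "visual-inspection": np.array([[0.9, 0.1], [0.2, 0.8]]),  # informative but imperfect
    "repair":            np.array([[0.5, 0.5], [0.5, 0.5]]),  # uninformative
}

def belief_update(b, a, o):
    """Bayes filter: b'(s') is proportional to O[a][s', o] * sum_s T[a][s, s'] * b(s)."""
    predicted = b @ T[a]                   # prediction step (Markov transition)
    unnormalized = O[a][:, o] * predicted  # correction step (observation likelihood)
    return unnormalized / unnormalized.sum()

# Example: an uncertain prior becomes much sharper after a visual inspection
# that returns "looks-damaged", with no explicit VoI calculation involved.
b0 = np.array([0.7, 0.3])
b1 = belief_update(b0, "visual-inspection", 1)
print(b1)  # belief mass shifts toward the "damaged" state
```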
“…POMDPs include probabilistic models of degradation, cost and observation depending on the management decisions, and are rooted in Bayesian analysis. Both Markov Decision Processes (MDPs) and POMDPs have been extensively investigated for IM applications [2][3][4][5][6][7][8][9][10][11][12][13], due to the computational efficiency of dynamic programming. A seminal introduction on VoI analysis is provided by the book of Raiffa and Schlaifer [14], while applications to the management of structural and infrastructure systems are provided by Pozzi and Der Kiureghian [15], Straub [16], Zonta et al. [17], and Malings and Pozzi [18]; most applications refer to a single decision-making problem, however assessment of the VoI in sequential decision making is also presented in [16,19,20].…”
Section: Introduction (mentioning)
confidence: 99%
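The "rooted in Bayesian analysis" and "computational efficiency of dynamic programming" points in the excerpt above correspond to two standard POMDP relations, written here in generic notation rather than the specific symbols of the cited paper: the Bayes filter that updates the belief after each action-observation pair, and the Bellman equation over beliefs that dynamic programming solves.

```latex
% Bayes filter: belief after taking action a and observing o
b'(s') \;=\; \frac{P(o \mid s', a)\,\sum_{s} P(s' \mid s, a)\, b(s)}
                  {\sum_{s''} P(o \mid s'', a)\,\sum_{s} P(s'' \mid s, a)\, b(s)}

% Bellman equation over beliefs (cost-minimization form, discount factor \gamma),
% with expected immediate cost c(b,a) = \sum_{s} b(s)\, c(s,a)
V^{*}(b) \;=\; \min_{a}\Big[\, c(b,a) \;+\; \gamma \sum_{o} P(o \mid b, a)\, V^{*}\!\big(b'_{a,o}\big) \Big]
```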
“…Long-term O&M under uncertainty can be modeled as a POMDP [16,37]. In this framework, the condition of a component evolves stochastically in time, following a Markov process, depending on the maintenance actions, and costs depend on the current condition and maintenance action. The condition state is not necessarily completely observable.…”
Section: Background and Notation for POMDP Framework (mentioning)
confidence: 99%
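To make the model structure described in the excerpt above concrete, the sketch below continues the hypothetical two-state example: action-dependent Markov transitions for the component condition, a cost that depends on the current condition and the chosen maintenance action, and a noisy observation of the hidden condition, simulated over a few decision epochs. The state names, cost values, and the simple observation-triggered repair rule are assumptions for illustration only, not the model of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

STATES  = ["intact", "damaged"]
ACTIONS = ["do-nothing", "repair"]

# Action-dependent Markov degradation: T[a][s, s'].
T = {
    "do-nothing": np.array([[0.9, 0.1], [0.0, 1.0]]),
    "repair":     np.array([[1.0, 0.0], [1.0, 0.0]]),
}
# Cost of being in condition s and taking action a: C[a][s].
C = {
    "do-nothing": np.array([0.0, 10.0]),   # a damaged component incurs a loss
    "repair":     np.array([5.0, 15.0]),   # repair is costly, more so when damaged
}
# Noisy observation of the hidden condition: P(o | s), o in {looks-ok, looks-damaged}.
OBS = np.array([[0.9, 0.1], [0.2, 0.8]])

def step(s, a):
    """One discrete decision epoch: incur cost, transition, emit an observation."""
    cost = C[a][s]
    s_next = rng.choice(2, p=T[a][s])
    o = rng.choice(2, p=OBS[s_next])
    return s_next, o, cost

# Simulate a naive rule: repair whenever the last observation "looks damaged".
s, o, total = 0, 0, 0.0
for t in range(10):
    a = "repair" if o == 1 else "do-nothing"
    s, o, cost = step(s, a)
    total += cost
print("total cost over 10 epochs:", total)
```

Note that this naive rule acts on the raw observation only; a POMDP policy would act on the filtered belief instead, which is what makes the partial observability of the condition state explicit in the decision.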
“…POMDPs provide a rich framework for planning under both state transition uncertainty and observation uncertainty. The POMDP model has been widely used for asset management under uncertainty; see [14] and the references therein. Note that POMDPs are not well suited for machine maintenance, as they are based on a discrete time step: the unitary action taken at time t affects the state and reward at time t + 1.…”
Section: Abbreviations and Acronyms (mentioning)
confidence: 99%