We present a Markov Decision Process (MDP) framework for computing post-fault reconfiguration policies that are optimal with respect to a discounted cost. Our cost function penalizes states that are unsuitable for achieving the remaining mission objectives, as well as states in which the necessary goal-achievement actions cannot be executed. We incorporate the probabilities of missed detections and false alarms for a given fault condition into the cost to encourage the selection of policies that minimize the likelihood of incorrect reconfiguration. To illustrate the proposed framework, we present an example inspired by the Far Ultraviolet Spectroscopic Explorer (FUSE) spacecraft, with a mission to collect scientific data from five targets. Using this example, we demonstrate a design tradeoff between safe operation and mission completion, and present simulation results showing how this tradeoff can be managed through the selection of optimization parameters.
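The core computation described above, solving a discounted-cost MDP for an optimal reconfiguration policy, can be sketched with standard value iteration. The state space, transition probabilities, and stage costs below are hypothetical placeholders (not taken from the paper's FUSE example); the sketch only shows the generic Bellman minimization over a discounted cost.

```python
import numpy as np

# Minimal sketch of discounted-cost value iteration for a reconfiguration MDP.
# All numbers here are illustrative assumptions, not values from the paper.

n_states, n_actions = 4, 2  # hypothetical: 4 spacecraft modes, 2 reconfiguration actions
gamma = 0.95                # discount factor

rng = np.random.default_rng(0)
# P[a, s, s']: probability of moving from state s to s' under action a
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
# c[s, a]: stage cost; in the paper's setting this would penalize states
# unsuitable for the remaining objectives (and could fold in missed-detection
# and false-alarm probabilities for the fault condition)
c = rng.uniform(0.0, 1.0, size=(n_states, n_actions))

V = np.zeros(n_states)
for _ in range(1000):
    # Bellman backup: Q[s, a] = c[s, a] + gamma * sum_s' P[a, s, s'] V[s']
    Q = c + gamma * np.einsum('asx,x->sa', P, V)
    V_new = Q.min(axis=1)  # minimize expected discounted cost
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmin(axis=1)  # cost-minimizing reconfiguration action per state
```

With gamma < 1 the backup is a contraction, so the iteration converges to the unique optimal value function, and `policy` greedily extracts the corresponding stationary policy.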