The use of coordinate processes to model impulse control for general Markov processes typically involves constructing a probability measure on a countable product of copies of the path space. Moreover, admissibility of an impulse control policy requires the random intervention times to be stopping times with respect to different filtrations arising from the different coordinate processes. When the underlying strong Markov process has continuous paths, however, a simpler model can be developed that takes the single path space as its probability space and uses its natural filtration, with respect to which the intervention times must be stopping times. This construction also accommodates uncertain impulse control, whereby the decision maker selects an impulse but the intervention may result in a different impulse occurring. This paper gives the construction of the probability measure on the path space for an admissible intervention policy subject to an uncertain impulse mechanism. An added feature is that when the intervention policy results in a deterministic distribution for each impulse, the paths between interventions are independent and, moreover, if the same distribution is used for each impulse, then the cycles following the initial cycle are identically distributed. The paper also identifies a class of impulse policies under which the resulting controlled process is Markov. The use of an (s, S) ordering policy in inventory management provides an example of an impulse policy for which the controlled process is Markov and has i.i.d. cycles, so a benefit of the constructed model is that classical renewal arguments apply.
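
The i.i.d.-cycle structure behind the (s, S) example can be seen in a small discrete-time simulation. The sketch below is purely illustrative and is not the paper's path-space construction: it assumes a hypothetical inventory process driven by i.i.d. demands, with an intervention that reorders up to S whenever the level reaches s, so every post-intervention state is the same and the inter-intervention cycles are i.i.d.

```python
import random

def simulate_sS_cycles(s, S, n_cycles, seed=0):
    """Simulate cycle lengths of a discrete-time (s, S) inventory model.

    Illustrative assumption: inventory starts at S, and i.i.d. exponential
    demands deplete it each period.  When the level falls to or below s,
    an order instantly restores it to S.  Because each intervention resets
    the state to S, the cycles between interventions are i.i.d., which is
    what makes classical renewal arguments applicable.
    """
    rng = random.Random(seed)
    cycle_lengths = []
    for _ in range(n_cycles):
        level, t = S, 0
        while level > s:
            level -= rng.expovariate(1.0)  # i.i.d. unit-mean demand
            t += 1
        cycle_lengths.append(t)            # intervention: reorder up to S
    return cycle_lengths

lengths = simulate_sS_cycles(s=2.0, S=10.0, n_cycles=1000)
print(sum(lengths) / len(lengths))  # empirical mean cycle length
```

With unit-mean demand and S - s = 8, the mean cycle length is close to the renewal-theoretic prediction of roughly 8 to 9 periods, consistent with treating successive cycles as an i.i.d. renewal sequence.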