Uncertainty arises when people do not have enough information about a situation that requires them to act. The less correct information they have, the more their judgment rests on beliefs and levels of trust; conversely, the more correct information and knowledge they have, the more confident they can be of acting appropriately. Uncertainty, ignorance, possibility, chance, and necessity are intimately related.

Uncertainty is also related to situation awareness, which can be modeled as three levels: perception, comprehension, and projection. This is why people develop methods and tools to improve perception through various kinds of visualization techniques, comprehension through various kinds of reasoning techniques and tools (in the artificial intelligence sense), and projection through various kinds of abduction mechanisms (i.e., anticipating what will or could happen next). From a short-term perspective, accurate prediction can only draw on what happened before a situation is perceived (i.e., an event-driven or reactive causal approach). Longer-term anticipation, by contrast, allows possible futures to be guessed and tested (i.e., a goal-driven or intentional approach).

On the claim that uncertainty in systems engineering and complex operations is a matter of situation awareness, the proposed approach is based on a situational systemic framework, in which complexity and flexibility are the central factors to consider when managing uncertainty in life-critical systems.
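The three-level view of situation awareness described above can be sketched as a simple pipeline. The following is a minimal illustrative sketch, not an implementation from the source: all function names, the toy uncertainty rule, and the sensor readings are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class Situation:
    # Hypothetical raw observations; None marks a missing element.
    raw_readings: dict

def perceive(situation: Situation) -> dict:
    """Level 1 (perception): collect the known elements of the situation."""
    return {k: v for k, v in situation.raw_readings.items() if v is not None}

def comprehend(percepts: dict) -> dict:
    """Level 2 (comprehension): interpret percepts into an assessment.
    Toy rule (an assumption): the fewer known elements, the higher
    the uncertainty, relative to a hypothetical full picture of 4 elements."""
    expected = 4
    return {"uncertainty": max(0.0, 1.0 - len(percepts) / expected)}

def project(assessment: dict, horizon: int) -> dict:
    """Level 3 (projection): anticipate what could happen next.
    A short horizon maps to the event-driven (reactive) approach,
    a longer one to the goal-driven (intentional) approach."""
    mode = "reactive" if horizon <= 1 else "goal-driven"
    return {"mode": mode, "uncertainty": assessment["uncertainty"]}

# Hypothetical example: a partially observed flight situation.
s = Situation(raw_readings={"altitude": 10000, "speed": 450,
                            "fuel": None, "heading": 270})
print(project(comprehend(perceive(s)), horizon=5))
# → {'mode': 'goal-driven', 'uncertainty': 0.25}
```

The sketch only illustrates the structure of the model: missing perceptual elements raise the assessed uncertainty, and the projection horizon selects between the reactive and intentional modes discussed in the text.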