Affective computing has gained significance in both academic and industry attention and investment. In parallel, immersive digital environments have matured into a reliable domain, supported by increasingly inexpensive hardware. With this in mind, the authors propose automatic real-time extraction of user emotion from biometric readings in an immersive digital environment. In the running example, the environment consisted of an aeronautical simulation, and biometric readings were based mainly on galvanic skin response, respiration rate and amplitude, and phalanx temperature. The assessed emotional states were also used to modify simulation context variables, such as flight path, weather conditions, and maneuver smoothness level. The results were consistent with the emotional states reported by the users, achieving a success rate of 77% for single emotions and 86% for a quadrant-based analysis.
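The quadrant-based analysis presumably groups emotions by the valence-arousal circumplex, so that nearby emotions (e.g. joy and excitement) count as a match when they fall in the same quadrant. A minimal sketch of such a mapping, assuming signed valence and arousal scores centered at zero (the function name, labels, and thresholds are illustrative assumptions, not the paper's method):

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair to one of the four circumplex
    quadrants. Scores are assumed normalized so that 0 is neutral."""
    if valence >= 0:
        # Positive valence: e.g. excitement (high arousal) vs. calm (low arousal)
        return "high-arousal positive" if arousal >= 0 else "low-arousal positive"
    # Negative valence: e.g. stress (high arousal) vs. boredom (low arousal)
    return "high-arousal negative" if arousal >= 0 else "low-arousal negative"
```

Under this scheme, two predictions disagreeing on the exact emotion but agreeing on the quadrant still count as correct, which explains why the quadrant-level success rate (86%) exceeds the single-emotion rate (77%).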