Helicopters experience a Degraded Visual Environment (DVE) when they enter Inadvertent Instrument Meteorological Conditions (I-IMC). DVE can arise naturally, as in fog or night flight, or when pilots attempt to land in unprepared (dusty or snowy) landing zones, where rotor downwash causes brownouts and whiteouts. The Degraded Visual Environment Navigation Support (DVENS) project aimed to use LiDAR to scan a specified Field of View (FOV) and range and, in a simulated capacity, classify a landing zone as safe or unsafe. A touch-capable Head Down Display (HDD) was used to provide Virtual Visual Meteorological Conditions (V-VMC), in which 3D conformal and 2D orthographic symbology is displayed. For the iterative design of this symbology, the Taxonomic Framework for Aircrew Error evaluation was used to diagnose and minimize pilot error in DVE. The framework allowed a more direct design approach, with clear objectives based on operational requirements and a workload appropriate for pilots. Presenting the required information while minimizing clutter is imperative, as too much information can increase workload. The aircrew error taxonomic framework thus helped identify the design goals needed to neutralize pilot error, leading to an efficient design. The Ryerson Mixed Immersive Motion Simulation (MIMS) lab's Fixed Base Simulator (FBS) and CAE Presagis's HELI CRAFT were used as the simulation testing tools. Non-intrusive questionnaires, such as the NASA Task Load Index (TLX), Bedford, and Cooper-Harper display rating scales, were used to collect feedback and evaluate the display system. Multiple scales were used to cross-validate results and to measure workload across stress, physical, psychological, and temporal demands. This methodology and design were found to be highly effective in assisting pilots landing and taking off in DVE and I-IMC conditions.