“…Drawing on existing data about proprioceptive accuracy, which relies largely on vision, developers integrate cross-modal perception for orientation, object identification, localization, and body motion (Sherman and Craig, 2018; Valori et al., 2020). Furthermore, the relationship between visual experiences and the brain regions associated with other senses, such as hearing or touch, plays a crucial role in interaction with XR (Raybourn et al., 2019; Sherman and Craig, 2018). As a result, each sense plays a specific role in creating the illusion of presence in virtual and augmented environments.…”