Proprioceptive development relies on a variety of sensory inputs, among which vision is hugely dominant. Focusing on the developmental trajectory underpinning the integration of vision and proprioception, the present research explores how this integration is involved in interactions with Immersive Virtual Reality (IVR) by examining how proprioceptive accuracy is affected by Age, Perception, and Environment. Individuals from 4 to 43 years old completed a self-turning task which asked them to manually return to a previous location with different sensory modalities available in both IVR and reality. Results were interpreted from an exploratory perspective using Bayesian model comparison analysis, which allows the phenomena to be described using probabilistic statements rather than simplified reject/not-reject decisions. The most plausible model showed that 4-8-year-old children can generally be expected to make more proprioceptive errors than older children and adults. Across age groups, proprioceptive accuracy is higher when vision is available, and is disrupted in the visual environment provided by the IVR headset. We can conclude that proprioceptive accuracy mostly develops during the first eight years of life and that it relies largely on vision. Moreover, our findings indicate that this proprioceptive accuracy can be disrupted by the use of an IVR headset.
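The Bayesian model comparison mentioned above can be illustrated with a minimal sketch. The data, group split, and BIC-based approximation to the Bayes factor below are all illustrative assumptions, not the authors' actual models or data: we compare an intercept-only model of proprioceptive error against a model that allows the 4-8-year-old group its own mean.

```python
import numpy as np

def bic(log_likelihood, k, n):
    # Bayesian Information Criterion: rewards fit, penalizes parameters.
    return k * np.log(n) - 2 * log_likelihood

def gaussian_loglik(residuals):
    # Log-likelihood of residuals under a Gaussian with MLE variance.
    n = len(residuals)
    sigma2 = np.mean(residuals ** 2)
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

rng = np.random.default_rng(0)
age = rng.uniform(4, 43, size=120)
# Hypothetical data: errors shrink until ~8 years, then plateau.
error = np.where(age < 8, 20 - 1.5 * age, 8) + rng.normal(0, 2, size=120)

# Model 1: error is constant across age (intercept only).
resid1 = error - error.mean()
# Model 2: the 4-8 group has its own mean error.
young = age < 8
resid2 = error.copy()
resid2[young] -= error[young].mean()
resid2[~young] -= error[~young].mean()

n = len(error)
bic1 = bic(gaussian_loglik(resid1), k=2, n=n)  # mean + variance
bic2 = bic(gaussian_loglik(resid2), k=3, n=n)  # two means + variance
# BIC difference approximates twice the log Bayes factor.
log_bf21 = 0.5 * (bic1 - bic2)
print(f"log Bayes factor (age-group model vs constant): {log_bf21:.1f}")
```

A positive log Bayes factor expresses graded support for the age-group model, which is the kind of probabilistic statement the abstract contrasts with a binary reject/not-reject decision.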
When learning and interacting with the world, people with Autism Spectrum Disorders (ASD) show compromised use of vision and enhanced reliance on body-based information. As this atypical profile is associated with motor and social difficulties, interventions could aim to reduce the potentially isolating reliance on the body and foster the use of visual information. To this end, head-mounted displays (HMDs) have unique features that enable the design of Immersive Virtual Realities (IVR) for manipulating and training sensorimotor processing. The present study assesses feasibility and offers some early insights from a new paradigm for exploring how children and adults with ASD interact with Reality and IVR when vision and proprioception are manipulated. Seven participants (five adults, two children) performed a self-turn task in two environments (Reality and IVR) for each of three sensory conditions (Only Proprioception, Only Vision, Vision + Proprioception) in a purpose-designed testing room and an HMD-simulated environment. The pilot indicates good feasibility of the paradigm. Preliminary data visualisation suggests the importance of considering inter-individual variability. The participants in this study who performed worse with Only Vision and better with Only Proprioception seemed to benefit from the use of IVR. Those who performed better with Only Vision and worse with Only Proprioception seemed to benefit from Reality. Therefore, we invite researchers and clinicians to consider that IVR may facilitate or impair individuals depending on their profiles.
The present work explores the distinctive contribution of motor planning and control to human reaching movements. In particular, the movements were triggered by the selection of a prepotent response (Dominant) or, instead, by the inhibition of the prepotent response, which required the selection of an alternative one (Non-dominant). To this end, we adapted a Go/No-Go task to investigate both the dominant and non-dominant movements of a cohort of 19 adults, utilizing kinematic measures to discriminate between the planning and control components of the two actions. In this experiment, a low-cost, easy-to-use, 3-axis wrist-worn accelerometer was put to good use to obtain raw acceleration data and to compute and break down its velocity components. The values obtained with this task indicate that with the inhibition of a prepotent response, the selection and execution of the alternative one yields both a longer reaction time and movement duration. Moreover, the peak velocity occurred later in time in the non-dominant response with respect to the dominant response, revealing that participants tended to invest more in motor planning than in adjusting their movement along the way. Finally, comparing such results to the findings obtained by other means in the literature, we discuss the feasibility of an accelerometer-based analysis to disentangle distinctive cognitive mechanisms of human movements.
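The core of an accelerometer-based kinematic analysis is integrating raw acceleration into velocity and locating the velocity peak within the movement. The sketch below shows one minimal way to do this; the sampling rate, the synthetic acceleration profile, and the single-reach setup are assumptions for illustration, not details from the study.

```python
import numpy as np

fs = 100.0  # sampling rate in Hz (assumed; not stated in the abstract)
t = np.arange(0, 1.0, 1 / fs)

# Hypothetical 3-axis acceleration for one reach: accelerate, then brake,
# which produces a roughly triangular velocity profile along x.
ax = np.where(t < 0.4, 3.0, -2.0)  # m/s^2
ay = np.zeros_like(t)
az = np.zeros_like(t)
acc = np.stack([ax, ay, az], axis=1)

# Integrate each axis (cumulative trapezoid rule) to get velocity.
vel = np.concatenate([
    np.zeros((1, 3)),
    np.cumsum((acc[1:] + acc[:-1]) / 2, axis=0) / fs,
])
speed = np.linalg.norm(vel, axis=1)

# Kinematic markers: peak velocity and its relative timing.
peak_idx = int(np.argmax(speed))
# A later relative peak suggests more of the movement is spent in the
# planned (feedforward) phase, leaving less room for online adjustment.
relative_peak = t[peak_idx] / t[-1]
print(f"peak speed {speed[peak_idx]:.2f} m/s "
      f"at {relative_peak:.0%} of movement duration")
```

Comparing this relative time-to-peak between dominant and non-dominant trials is one way to quantify the planning-versus-control distinction the abstract describes.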
From intrauterine life, our physical, psychological, and social development progresses thanks to the interaction between our genetic profile and the environment. Sensory information from both the external world (exteroception) and the self (interoception) is detected by our emerging sensory functions. We talk about exteroception when the sensory information comes from the environment around us (e.g. sight, hearing, touch), while interoception is the perception of our body and includes "temperature, pain, itch, tickle, sensual touch, muscular and visceral sensations, vasomotor flush, hunger, thirst" (p. 655 [1]). This information, which comes from different complementary sensory modalities, has to be integrated so that we can interact with and learn from the environment. The multisensory integration that follows takes time to develop and emerges in a heterochronous pattern: we rely on the various sensory modalities to different degrees at different points in the human developmental trajectory, during which the sensory modalities interact in different ways [2]. In general, our sensory development is driven by crossmodal calibration: one accurate sensory modality can improve performance based on information delivered by another, less accurate, sensory modality [3-5].

Proprioception: an emergent perception arising from a multisensory process

Both exteroception and interoception drive our discovery of the external world and the self. One important physical dimension of the concept of self is proprioception, whose definition is particularly complex and debated in the extant literature. Propr...
Humans are by nature social beings, tuned to communicate and interact from the very beginning of their lives. The sense of touch represents the most direct and intimate channel of communication and a powerful means of connection between the self and others. In our digital age, the development and diffusion of internet-based technologies and virtual environments offer new opportunities for communication that overcome physical distance. However, social interactions are often mediated, and the tactile aspects of communication are overlooked, which diminishes the feeling of social presence and may contribute to an increased sense of social disconnection and loneliness. The current manuscript reviews the extant literature on the socio-affective dimension of touch and current advancements in interactive virtual environments in order to provide a new perspective on multisensory virtual communication. Specifically, we suggest that interpersonal affective touch might critically impact virtual social exchanges, promoting a sense of co-presence and social connection between individuals and possibly overcoming feelings of sensory loneliness. This topic of investigation is of crucial relevance from a theoretical perspective, as it aims to understand how we integrate multisensory signals in processing and making sense of interpersonal exchanges, in both typical and atypical populations. Moreover, it paves the way to promising applications by exploring the possibility of using technical innovations to communicate more interactively in the case of people who suffer from social isolation and disconnection from others.
Children's ability to inhibit prepotent responses, which allows them to flexibly regulate their behavior, arises from cognitive and motor mechanisms that have an intertwined developmental trajectory. Subtle differences in planning and control can contribute to impulsive behaviors, which are common in Attention Deficit and Hyperactivity Disorder (ADHD) and difficult to assess and train. We adapted a Go/No-Go task and employed a portable, low-cost kinematic sensor to explore the different strategies used by children with ADHD or typical development to provide a prepotent response (dominant condition) or inhibit the prepotent response and select an alternative one (non-dominant condition). Although no group difference emerged on accuracy levels, the kinematic analysis of correct responses revealed that, unlike neurotypical children, those with ADHD did not show increased motor planning in non-dominant compared to dominant trials. Future studies should investigate whether motor control could help children with ADHD compensate for planning difficulties. This strategy might make inhibition harder in naturalistic situations that involve complex actions. Combining cognitive and kinematic measures is a potentially innovative method for assessing and training subtle differences in executive processes such as inhibition, going deeper than is possible based on accuracy outcomes alone.
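The group-level contrast described above — whether planning increases from dominant to non-dominant trials — can be sketched as a simple per-condition aggregation of a planning index. The trial values, condition labels, and the choice of relative time-to-peak velocity as the index are illustrative assumptions, not data or definitions from the study.

```python
import numpy as np

# Hypothetical per-trial data: condition label and relative time-to-peak
# velocity (fraction of movement duration elapsed at peak speed).
conditions = np.array(["dominant"] * 4 + ["non-dominant"] * 4)
rel_peak = np.array([0.38, 0.41, 0.40, 0.39,   # prepotent responses
                     0.47, 0.50, 0.46, 0.49])  # inhibited/alternative

def planning_shift(conditions, rel_peak):
    # A later relative peak is read as more feedforward planning,
    # leaving less of the movement for online control.
    dom = rel_peak[conditions == "dominant"].mean()
    nondom = rel_peak[conditions == "non-dominant"].mean()
    return nondom - dom

shift = planning_shift(conditions, rel_peak)
print(f"planning shift (non-dominant - dominant): {shift:.3f}")
```

In this reading, a clearly positive shift would correspond to the pattern the abstract attributes to neurotypical children, while a shift near zero would correspond to the ADHD pattern.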