When interacting with the environment, humans exhibit robust biases toward information that pertains to themselves: Self-relevant information is processed faster and yields more accurate responses than information linked to others. Recent studies have shown that simple social associations can lead to the instant deployment of this benefit in the processing of abstract stimuli. However, how self-prioritization evolves across the processing hierarchy has been a subject of intense debate. Furthermore, there is little empirical evidence about the functional efficiency of social relevance in natural environments in which information is present across multiple senses. Across three experiments (each n = 40), the present study shows that self-prioritization effects (a) can arise in simple audio-visual numerosity judgements, (b) can be efficiently deployed across the senses by funneling perception toward self-relevant information in the more reliable sensory modality, and (c) modulate the integration of auditory and visual information into a multisensory representation. Taken together, the present findings suggest that social relevance can influence multisensory processing at both perceptual and postperceptual stages via early attentional modulations of sensory integration and later, task-dependent attentional control.

Public Significance Statement: This study provides evidence that self-relevance of abstract information (temporal event numerosity) leads to changes in multisensory perception. Establishing self-relevance via social associations of oneself or another person with two visual event numerosities leads to a relative performance facilitation for self-related information that transfers to auditory and multisensory contexts. Furthermore, in relation to other-associated information, self-association with event numerosities modulates audio-visual integration by increasing sensitivity and decreasing bias in the fusion illusion. Together, the present findings suggest that social salience can influence multisensory processing via attentional modulations.
Human adults can optimally combine vision with self-motion to facilitate navigation. In the absence of visual input (e.g., dark environments and visual impairments), sensory substitution devices (SSDs), such as The vOICe or BrainPort, which translate visual information into auditory or tactile information, could be used to increase navigation precision when integrated with each other or with self-motion. In Experiment 1, we compared and assessed The vOICe and BrainPort, alone and together, in an aerial-map task performed by a group of sighted participants. In Experiment 2, we examined whether sighted individuals and a group of visually impaired (VI) individuals could benefit from using The vOICe, with and without self-motion, to accurately navigate a three-dimensional (3D) environment. In both studies, 3D motion-tracking data were used to determine the precision with which participants performed two different tasks (an egocentric and an allocentric task) under three different conditions (two unisensory conditions and one multisensory condition). In Experiment 1, we found no benefit of using the devices together. In Experiment 2, sighted performance with The vOICe was almost as good as that for self-motion despite a short training period, although we found no benefit (reduction in variability) of using The vOICe and self-motion in combination compared with either in isolation. In contrast, the group of VI participants did benefit from combining The vOICe and self-motion despite the low number of trials. Finally, while both groups became more accurate in their use of The vOICe with increased trials, only the VI group showed an increased level of accuracy in the combined condition. Our findings highlight how exploiting non-visual multisensory integration to develop new assistive technologies could be key to helping blind and VI persons, especially given their difficulty in acquiring allocentric information.
A stable self-representation has an intrinsically beneficial connotation for information processing: it allows the individual to flexibly adapt to different contexts while prioritizing information that pertains to one's own immediate survival. Indeed, many studies have shown how linking arbitrary information to physical or psychological aspects of the self leads to pervasive effects on our decision-making and even our perception. However, the evidence gained so far stems from isolated aspects of the self, and varying measures across studies and different levels of processing make results difficult to compare. The present study demonstrates that associating arbitrary information with the self rapidly leads to faster and more efficient processing of that information, with stable performance benefits across different tasks (matching and categorization tasks) and stimulus domains. Focusing on specific processing levels, the findings first provide evidence regarding the involvement of self-relatedness in perception: here, contrast processing interacted with self-relatedness, but only when complex stimuli were used. Second, they show that self-prioritization is open to decisional modulation, with processing benefits being adjusted to different social contexts. Third, the present data provide evidence that performance benefits toward newly self-associated, abstract information are equivalent to those resulting from long-term established self-associations with personally owned objects. The results highlight mechanistic differences between the prioritization of information linked to the self and information linked to close others. Overall, the present findings suggest that the self acts as a stable anchor in information processing, allowing us to filter information by its immediate relevance to facilitate optimal behavior.
Interest in the influence of musical training on executive functions (EFs) has been growing in recent years. However, the relationship between musical training and EFs remains unclear. By dividing EFs into inhibitory control, working memory, and cognitive flexibility, this study systematically examined their association with musical training in children, and further tested whether there is a sensitive period for the influence of musical training on EFs. In Experiment 1, musically trained and untrained children were asked to complete the Go/No-go, Stroop, Continuous Performance, and Switching tasks. Results showed that musically trained children had an advantage in attention inhibition, response inhibition, and working memory, but not in cognitive flexibility. Moreover, the level of musical training was positively correlated with response-inhibition and working-memory abilities. In Experiment 2, results showed that early-trained musicians performed better on measures of attention inhibition, response inhibition, and working memory than did the age-matched control group, whereas late-trained musicians performed better only in attention inhibition. Thus, our findings suggest that musical training is associated with enhanced EF abilities and provide the first evidence that early childhood is a sensitive period during which musical training has a more powerful effect on the development of EFs.
Early sensory input is crucial for the development of perceptual processes. A key method to discover the importance of early sensory input for perceptual development is to compare those who have had a sense, such as vision, impaired at an early developmental age to those who acquire sensory deprivation later in life. For example, comparing humans who became blind early in life to
“What does it mean, to see?” was the question that David Marr used to motivate his computational approach to understanding vision (Marr, 1982). Marr's answer, building on Aristotle, was that “vision is the process of discovering from images what is present in the world, and where it is” (p. 3). Although we humans might have a preference for visual perception, we are endowed with other senses that provide us with a rich experience (Chapter 14, this volume). Therefore, the broader question might be: What does it mean, to perceive? Although this might be seen as a philosophical question of sorts, it gets to the important issue of how we define perceptual experience scientifically so that we may study it. Defining it is crucial for research applications: If we aim to restore a sense such as vision in blindness or hearing in deafness, what does it mean to see or to hear, such that we will know when restoration has been successful? This chapter reviews the interaction between multisensory perception and interactive technological approaches to sensory rehabilitation.