Sensory perception is a product of interactions between the internal state of an organism and the physical attributes of a stimulus. It has been shown across the animal kingdom that perception and sensory-evoked physiological responses are modulated depending on whether or not the stimulus is the consequence of voluntary actions. These phenomena are often attributed to motor signals sent to relevant sensory regions that convey information about upcoming sensory consequences. However, the neurophysiological signature of action-locked modulations in sensory cortex, and their relationship with perception, is still unclear. In the current study, we recorded neurophysiological (using Magnetoencephalography) and behavioral responses from 16 healthy subjects performing an auditory detection task of faint tones. Tones were either generated by subjects’ voluntary button presses or occurred predictably following a visual cue. By introducing a constant temporal delay between button press/cue and tone delivery, and applying source-level analysis, we decoupled action-locked and auditory-locked activity in auditory cortex. We show action-locked evoked-responses in auditory cortex following sound-triggering actions and preceding sound onset. Such evoked-responses were not found for button-presses that were not coupled with sounds, or sounds delivered following a predictive visual cue. Our results provide evidence for efferent signals in human auditory cortex that are locked to voluntary actions coupled with future auditory consequences.
Voluntary actions are shaped by desired goals and internal intentions. Multiple factors, including the planning of subsequent actions and the expectation of sensory outcome, have been shown to modulate kinetics and neural activity patterns associated with similar goal-directed actions. Notably, in many real-world tasks, actions can also vary in the semantic meaning they convey, although little is known about how semantic meaning modulates associated neurobehavioral measures. Here, we examined how behavioral and functional magnetic resonance imaging measures are modulated when subjects execute similar actions (button presses) for two different semantic meanings: to answer “yes” or “no” to a binary question. Our findings reveal that, when subjects answer using their right hand, the two semantic meanings are differentiated based on voxel patterns in the frontoparietal cortex and lateral-occipital complex bilaterally. When using their left hand, similar regions were found, albeit only at a more liberal threshold. Although subjects were faster to answer “yes” than “no” when using their right hand, the neural differences cannot be explained by these kinetic differences. To the best of our knowledge, this is the first evidence showing that semantic meaning is embedded in the neural representation of actions, independent of alternative modulating factors such as kinetic and sensory features.
Accurate control over everyday goal-directed actions is mediated by sensory-motor predictions of intended consequences and their comparison with actual outcomes. Such online comparisons of the expected and re-afferent, immediate sensory feedback are conceptualized as internal forward models. Current predictive coding theories describing such models typically address the processing of immediate sensory-motor goals, yet voluntary actions are also oriented towards long-term conceptual goals and intentions, for which the sensory consequence is sometimes absent or cannot be fully predicted. Thus, the neural mechanisms underlying actions with distal conceptual goals are far from clear. Specifically, it is still unknown whether sensory-motor circuits also encode information regarding the global meaning of the action, detached from the immediate, movement-related goal. Therefore, using fMRI and behavioral measures, we examined identical actions (right- or left-hand button presses) performed with two different semantic intentions (a 'yes'/'no' response to questions regarding visual stimuli). Importantly, actions were devoid of differences in the immediate sensory outcome. Our findings revealed voxel patterns differentiating the two semantic goals in the frontoparietal cortex and visual pathways, including the lateral-occipital complex, in both hemispheres. Behavioral results suggest that these effects cannot be explained by kinetic differences such as force. To the best of our knowledge, this is the first evidence showing that semantic meaning is embedded in the neural representation of actions, independent of immediate sensory outcome and kinetic differences.
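To make the voxel-pattern analysis described in the two abstracts above more concrete, here is a minimal multivariate decoding sketch in the spirit of that approach (an illustration, not the authors' actual pipeline): single-trial patterns from a region of interest are classified as 'yes' or 'no' button presses with a cross-validated linear classifier. The file name, array shapes, and label coding are hypothetical placeholders.

```python
# Minimal MVPA sketch (illustrative assumption, not the authors' exact analysis):
# classify single-trial voxel patterns from an ROI by the semantic meaning of
# the action ('yes' vs. 'no') using cross-validation.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

data = np.load("roi_patterns.npz")   # hypothetical file with precomputed trial patterns
X = data["patterns"]                 # shape (n_trials, n_voxels): per-trial beta estimates
y = data["labels"]                   # 1 = 'yes' press, 0 = 'no' press (assumed coding)

clf = make_pipeline(StandardScaler(), LinearSVC())
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")

print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

Above-chance accuracy in such a scheme would indicate that the ROI's voxel patterns carry information about the semantic meaning of otherwise identical actions.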
Sensory perception is a product of complex interactions between the internal state of an organism and the physical attributes of a stimulus. One factor that modulates the internal state of the perceiving agent is voluntary movement. It has been shown across the animal kingdom that perception and sensory-evoked physiological responses are modulated depending on whether or not the stimulus is the consequence of voluntary actions. These phenomena are often attributed to motor signals sent to relevant sensory regions (efference copies) that convey information about expected upcoming sensory consequences. However, to date, there is no direct evidence in humans for efferent signals underlying these motor-sensory interactions. In the current study, we recorded neurophysiological (using Magnetoencephalography) and behavioral responses from 16 healthy subjects performing an auditory detection task of faint tones. Tones were either generated by subjects' voluntary button presses or occurred predictably following a visual cue. By introducing a constant temporal delay between button press/cue and tone delivery, and applying source-level analysis, we decoupled motor-evoked and auditory-evoked activity in auditory cortex. We show motor-related evoked-responses in auditory cortex following sound-triggering actions and preceding sound onset. Such evoked-responses were not found for button-presses that were not coupled with expected sounds. Furthermore, the amplitude of these evoked-responses corresponded with subsequent sound detection, suggesting their functional relevance to auditory processing. Our results provide the first direct evidence for efferent signals in sensory cortex that are evoked by voluntary actions coupled with sensory consequences.
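As a rough illustration of the decoupling logic described in the two MEG abstracts above (a sketch under assumed trigger codes, delay, and file name, not the authors' code), the same recording can be epoched twice: once locked to the button press and once locked to the tone. With a constant press-to-tone delay, any evoked response in the press-locked window that occurs before the delay cannot be driven by the later sound.

```python
# Illustrative sketch using MNE-Python (not the authors' pipeline): separate
# action-locked and auditory-locked evoked responses by epoching around two
# event types, given an assumed constant 500 ms press-to-tone delay.
import mne

raw = mne.io.read_raw_fif("sub01_task-detection_meg.fif", preload=True)  # hypothetical file
events = mne.find_events(raw, stim_channel="STI 014")

event_id = {"button_press": 1, "tone_onset": 2}   # hypothetical trigger codes
delay_s = 0.5                                     # assumed constant press-to-tone delay

# Epochs locked to the voluntary button press: activity before `delay_s`
# precedes sound onset and therefore cannot be auditory-evoked.
press_epochs = mne.Epochs(raw, events, event_id["button_press"],
                          tmin=-0.2, tmax=delay_s, baseline=(-0.2, 0.0),
                          preload=True)

# Epochs locked to the tone itself capture the auditory-evoked response.
tone_epochs = mne.Epochs(raw, events, event_id["tone_onset"],
                         tmin=-0.2, tmax=0.5, baseline=(-0.2, 0.0),
                         preload=True)

press_evoked = press_epochs.average()
tone_evoked = tone_epochs.average()
press_evoked.plot_joint()   # inspect action-locked activity preceding sound onset
```

In the studies described above, this separation was additionally combined with source-level analysis to localize the action-locked responses to auditory cortex.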
Evoked neural activity in sensory regions, and perception of sensory stimuli, are modulated when the stimuli are the consequence of voluntary movement as opposed to an external source. It has been suggested that such modulations are due to efference copies of the motor command that are sent to relevant sensory regions during voluntary movement. Given the anatomical-functional laterality bias of the motor system, it is plausible that the pattern of such behavioral and neural sensory modulations will exhibit a similar bias, depending on the effector used to trigger the stimulus (e.g., right or left hand). Here we examined this issue in the visual domain using behavioral and neural measures (fMRI). Healthy participants judged the relative brightness of identical visual stimuli that were either self-triggered (using right- or left-hand button presses) or triggered by the computer. By presenting stimuli to either the right or left visual field, we biased visual-evoked responses to the left or right visual cortex, respectively. We found stronger perceptual modulations when the triggering hand was ipsilateral (rather than contralateral) to the stimulated visual field. At the neural level, we found that despite identical physical properties of the visual consequence, evoked fMRI responses in right and left visual cortices differentiate the identity of the triggering hand (left/right). Our findings support a model in which voluntary actions induce sensory modulations that follow the anatomical-functional bias of the motor system.
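In its simplest form, the ipsilateral-versus-contralateral comparison described above amounts to a paired test on per-subject modulation scores. The sketch below assumes a hypothetical data layout with placeholder values and is not the authors' analysis.

```python
# Illustrative paired comparison (placeholder data, not the authors' results):
# per-subject perceptual-modulation scores for stimuli triggered by the hand
# ipsilateral vs. contralateral to the stimulated visual field.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ipsi = rng.normal(0.6, 0.2, size=16)     # placeholder scores, one per subject
contra = rng.normal(0.4, 0.2, size=16)   # placeholder scores, one per subject

t, p = stats.ttest_rel(ipsi, contra)
print(f"ipsilateral vs. contralateral modulation: t = {t:.2f}, p = {p:.3f}")
```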