Individuals in pain are motivated to be cooperative in social interaction. Yet there has been little research on how pain dynamically affects cooperation at the neural level. The present study investigated cooperative behavior under acute physical pain by asking dyads to complete three blocks of a button-press cooperation task while neural activity was recorded simultaneously from each subject using fNIRS-based hyperscanning. Results showed that individuals in pain improved their cooperation rate across task blocks. Accordingly, compared with the first block, increased interpersonal neural synchronization (INS) was found at the left prefrontal cortex in the second block, whereas increased INS was found at the right prefrontal cortex and the right parietal cortex in the third block. Moreover, the change in INS in the right parietal cortex was positively correlated with subjective pain ratings in the pain treatment group. In addition, dynamic interpersonal neural networks were identified in the painful condition, with increasing frontoparietal connectivity across time. By uncovering the dissociable neural processes involved in how pain affects cooperation in social interaction, the present work provides the first interbrain evidence highlighting the sociality of pain from the perspective of its motivational aspect.

KEYWORDS: cooperation, functional near-infrared spectroscopy, hyperscanning, interpersonal neural synchronization, pain
When moving around in the world, the human visual system uses both motion and form information to estimate the direction of self-motion (i.e., heading). However, little is known about the cortical areas in charge of this task. This brain-imaging study addressed the question by using visual stimuli consisting of randomly distributed dot pairs oriented toward a locus on a screen (the form-defined focus of expansion [FoE]) but moving away from a different locus (the motion-defined FoE) to simulate observer translation. We first fixed the motion-defined FoE location and shifted the form-defined FoE location. We then made the locations of the motion- and the form-defined FoEs either congruent (at the same location in the display) or incongruent (on opposite sides of the display). The motion- or form-defined FoE shift was the same in the two types of stimuli, but the perceived heading direction shifted for the congruent, but not the incongruent, stimuli. Participants (both sexes) made a task-irrelevant (contrast discrimination) judgment during scanning. Searchlight and ROI-based multivoxel pattern analyses revealed that early visual areas V1, V2, and V3 responded to either the motion- or the form-defined FoE shift. After V3, only the dorsal areas V3A and V3B/KO responded to such shifts. Furthermore, area V3B/KO showed significantly higher decoding accuracy for the congruent than for the incongruent stimuli. Our results provide direct evidence that area V3B/KO does not simply respond to motion and form cues but integrates these two cues for the perception of heading.