Three macaque monkeys and 13 healthy human volunteers underwent diffusion tensor MRI with a 3 Tesla scanner for diffusion tract tracing (DTT) reconstruction of callosal bundles from different cortical areas. In six macaque monkeys and three human subjects, the length of fiber tracts was obtained from histological data and combined with information on the distribution of axon diameters to estimate callosal conduction delays from different areas. The results showed that in monkeys the spectrum of tract lengths obtained with DTT closely matches that estimated from histological reconstruction of axons labeled with an anterogradely transported tracer. For each sector of the callosum, we obtained very similar conduction delays regardless of whether conduction distance was derived from tractography or from histological analysis of labeled axons. This direct validation of DTT measurements by histological methods in monkeys was a prerequisite for computing the callosal conduction distances and delays in humans, which we had previously obtained by extrapolating the length of callosal axons from that of the monkey, proportionally to the brain volumes of the two species. For this analysis, we used the distribution of axon diameters from four different sectors of the corpus callosum. As in monkeys, in humans the shortest callosal conduction delays were those of motor, somatosensory, and premotor areas; the longest were those of temporal, parietal, and visual areas. These results provide the first histological validation of DTT-based anatomical data on connection length in the primate brain.
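The delay estimate described above reduces to delay = length / velocity, with conduction velocity scaled linearly from axon diameter. A minimal sketch, assuming Hursh's rule for myelinated axons (velocity in m/s ≈ 5.5 × diameter in µm); the tract length and diameter values below are illustrative, not the study's measurements:

```python
# Sketch: conduction delay from tract length and axon diameter.
# Assumes Hursh's rule for myelinated axons: velocity (m/s) ~ k * diameter (um),
# with k ~ 5.5. Inputs are hypothetical, not the study's data.

def conduction_delay_ms(length_mm: float, diameter_um: float, k: float = 5.5) -> float:
    """Return the conduction delay in milliseconds."""
    velocity_m_per_s = k * diameter_um             # Hursh's rule
    length_m = length_mm / 1000.0                  # mm -> m
    return (length_m / velocity_m_per_s) * 1000.0  # s -> ms

# Example: a 130 mm callosal tract with 1.5 um median axon diameter
delay = conduction_delay_ms(130.0, 1.5)  # ~15.76 ms
```

The linear velocity-diameter scaling is why thin-axon sectors (e.g. visual, temporal) can yield markedly longer delays even when their tract lengths are comparable.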
Background: The introduction of the diagnosis of complex posttraumatic stress disorder (CPTSD) in ICD-11 is a turning point in the field of traumatic stress studies. It is therefore important to examine the validity of CPTSD in refugee groups exposed to complex trauma (CT), defined as a repeated, prolonged, interpersonal traumatic event. Objective: The objective of this study was to compare DSM-5 and ICD-11 post-traumatic stress disorder diagnoses and to evaluate the discriminant validity of the ICD-11 PTSD and CPTSD constructs in a sample of treatment-seeking refugees living in Italy. Method: The study sample included 120 treatment-seeking African refugees living in Italy. All participants were survivors of at least one CT. PTSD and CPTSD diagnoses were assessed according to both DSM-5 and ICD-11 criteria. Results: Findings revealed that 79% of the participants met the DSM-5 criteria for PTSD, 38% the ICD-11 criteria for PTSD, and 30% the ICD-11 criteria for CPTSD. Overall, the ICD-11 CPTSD items showed strong sensitivity and negative predictive power, but low specificity and positive predictive power. Latent class analysis identified two distinct groups: (1) a PTSD class and (2) a CPTSD class. None of the demographic and trauma-related variables analysed was significantly associated with diagnostic group. On the other hand, the number of months spent in Italy was significantly associated with PCL-5 score. Conclusions: The findings extend the current evidence base supporting the discriminant validity of PTSD and CPTSD among refugees exposed to torture and other gross violations of human rights. The results also suggest that, in the post-traumatic phase, time spent in a 'safe place' contributes to reducing the severity of post-traumatic symptomatology, but neither this variable nor other socio-demographic factors appear to contribute to the emergence of complex PTSD.
Further investigations are needed to clarify which specific vulnerability factors influence the development of PTSD or CPTSD in refugees exposed to complex trauma.
Background and aim: Parkinson's disease (PD) patients show impaired facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups.
Methods: Twenty non-demented, non-depressed PD patients and twenty healthy controls (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of six primary emotions (happiness, sadness, surprise, disgust, fear, and anger). The most expressive picture for each emotion was derived from the videos. Ten healthy raters were asked to view the pictures, displayed on a computer screen in pseudo-random order, and to identify the emotional label in a six-alternative forced-choice format (emotion expressivity task). Reaction time (RT) and response accuracy were recorded. At the end of each trial, the rater was asked to rate his/her confidence in the perceived accuracy of the response.
Results: For emotion recognition, PD patients scored lower than HC on the Ekman total score (p < 0.001) and on the sub-scores for happiness, fear, anger, and sadness (p < 0.01) and for surprise (p = 0.02). In the facial emotion expressivity task, PD and HC differed significantly in the total score (p = 0.05) and in the sub-scores for happiness, sadness, and anger (all p < 0.001). RT and confidence level also differed significantly between PD and HC for the same emotions.
There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when emotions were ranked from best to worst recognized (R = 0.75, p = 0.004).
Conclusions: PD patients showed difficulties both in recognizing emotional facial expressions produced by others and in posing facial emotional expressions, compared with healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which may deteriorate in patients with PD. These results open new clinical and rehabilitation perspectives.
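The rank-based correlation reported above corresponds to a Spearman-type analysis: convert each score vector to ranks, then correlate the ranks. A minimal stdlib-only sketch; the per-emotion scores are hypothetical placeholders, not the study's data:

```python
# Sketch: rank correlation of per-emotion recognition vs. expressivity scores.
# Scores below are made-up illustrations, not the study's measurements.

def ranks(xs):
    """Rank values 1..n, smallest first (no tie handling in this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rho = Pearson correlation computed on the ranks."""
    return pearson(ranks(x), ranks(y))

# Hypothetical mean scores for six emotions, recognition vs. expressivity
recognition  = [9.1, 8.4, 7.2, 6.8, 6.1, 5.5]
expressivity = [8.8, 7.9, 7.5, 6.2, 6.4, 5.0]
rho = spearman(recognition, expressivity)
```

Ranking discards scale differences between the two tasks, which is why the ordering of emotions can correlate more strongly than the raw scores.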
Daily life often requires coordinating our actions with those of a partner. After 50 years (1968-2018) of behavioral neurophysiology of motor control, the neural mechanisms that allow such coordination in primates remain unknown. We studied this issue by recording cell activity simultaneously from the dorsal premotor cortex (PMd) of two male interacting monkeys trained to coordinate their hand forces to achieve a common goal. We found a population of "joint-action cells" that discharged preferentially when the monkeys cooperated in the task. This modulation was predictive in nature, because in most cells neural activity led in time the changes in both the "own" and the "other" behavior. These neurons encoded joint performance more accurately than "canonical action-related cells," which are activated by the action per se, regardless of the individual versus interactive context. A decoding of joint action was obtained by combining the two brains' activities, using cells with directional properties distinct from those associated with the "solo" behaviors. Action observation-related activity, studied when one monkey observed the consequences of the partner's behavior (i.e., the cursor's motion on the screen), did not sharpen the accuracy of the joint-action cells' representation, suggesting that it plays no major role in encoding joint action. When monkeys performed with a non-interactive partner, such as a computer, the joint-action cells' representation of the other (non-cooperative) behavior was significantly degraded. These findings provide evidence of how premotor neurons integrate the time-varying representation of the self-action with that of a co-actor, thus offering a neural substrate for successful visuomotor coordination between individuals.
The neural bases of intersubject motor coordination were studied by recording cell activity simultaneously from the frontal cortex of two interacting monkeys trained to coordinate their hand forces to achieve a common goal.
We found a new class of cells that were preferentially active when the monkeys cooperated rather than when the same action was performed individually. These "joint-action neurons" offered a neural representation of joint behaviors far more accurate than that provided by the "canonical action-related cells," which are modulated by the action per se regardless of the individual/interactive context. A neural representation of joint performance was obtained by combining the activity recorded from the two brains. Our findings offer the first evidence concerning the neural mechanisms underlying interactive visuomotor coordination between co-acting agents.