It has been widely reported that women are underrepresented in leadership positions within academic medicine. This study aimed to assess trends in women's representation and leadership as principal investigators (PIs) of oncology clinical trials conducted between 1999 and 2019. The gender of 39,240 PIs leading clinical trials was determined using the gender-prediction software Genderize.io. In total, 11,516 (27.7%) PIs were women. Over the past 20 years, an annual increase of 0.65% in women PIs was observed. Analysis by geographic distribution revealed higher representation of women among PIs in North America and Europe than in Asia. Industry-funded trials were associated with lower women PI representation than academically funded trials (18.8% vs. 31.4%, p<0.001). Women PIs were also underrepresented in late-phase compared with early-phase studies (27.9%, 25.7%, 21.6%, and 22.4% in phases I, II, III, and IV, respectively; Cochran-Armitage test for trend, p<0.001). Furthermore, the PI's gender was associated with enrolment of female subjects (50% vs. 43% female participants in trials led by women vs. men PIs, respectively, p<0.001). Taken together, while the gender gap in women's leadership of oncology trials has been steadily closing, prominent inequalities remain in non-Western countries, advanced study phases, and industry-funded trials, and appear to be linked to a gender gap in patient accrual. These observations can inform the development of strategies to increase women's representation and to monitor progress toward gender equality among PIs of cancer clinical trials.
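The phase-wise analysis above relies on the Cochran-Armitage test for a linear trend in proportions across ordered groups. As a minimal sketch of how such a test works, the snippet below implements the standard score-based form of the statistic; the counts passed in at the bottom are hypothetical (the study reports only percentages, not denominators), chosen merely to illustrate a decreasing trend like the one reported.

```python
from math import sqrt
from scipy.stats import norm

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in proportions.

    successes[i] / totals[i] is the proportion in ordered group i
    (e.g. women PIs among all PIs in trial phases I-IV).
    """
    k = len(successes)
    scores = scores or list(range(1, k + 1))
    N = sum(totals)
    p_bar = sum(successes) / N  # pooled proportion across all groups
    # Statistic: score-weighted deviation of each group from the pooled rate
    T = sum(s * (x - n * p_bar) for s, x, n in zip(scores, successes, totals))
    var_T = p_bar * (1 - p_bar) * (
        sum(n * s * s for s, n in zip(scores, totals))
        - sum(n * s for s, n in zip(scores, totals)) ** 2 / N
    )
    z = T / sqrt(var_T)
    p_value = 2 * norm.sf(abs(z))  # two-sided
    return z, p_value

# Hypothetical counts (women PIs / total PIs) for phases I-IV -- NOT the study's data
z, p = cochran_armitage_trend([279, 514, 432, 112], [1000, 2000, 2000, 500])
print(f"z = {z:.2f}, p = {p:.4g}")
```

A negative z indicates a decreasing trend in the proportion of women PIs with later phases; a small p-value rejects the null of no linear trend.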
Internal affective states produce external manifestations such as facial expressions. In humans, the Facial Action Coding System (FACS) is widely used to objectively quantify the elemental facial action-units (AUs) that build complex facial expressions. A similar system has been developed for macaque monkeys, the Macaque Facial Action Coding System (MaqFACS); yet unlike the human counterpart, which has already been partially replaced by automatic algorithms, this system still requires labor-intensive manual coding. Here, we developed and implemented the first prototype for automatic MaqFACS coding. We applied the approach to the analysis of behavioral and neural data recorded from freely interacting macaque monkeys. The method achieved high performance in recognition of six dominant AUs, generalizing between conspecific individuals (Macaca mulatta) and even between species (Macaca fascicularis). The study lays the foundation for fully automated detection of facial expressions in animals, which is crucial for investigating the neural substrates of social and affective states.
Significance Statement: MaqFACS is a comprehensive coding system designed to objectively classify facial expressions based on elemental facial movements designated as Action Units (AUs). It allows the comparison of facial expressions across individuals of the same or different species based on manual scoring of videos, a labor- and time-consuming process. We implemented the first automatic prototype for AU coding in macaques. Using machine learning, we trained the algorithm on video frames with AU labels and showed that, after parameter tuning, it classified six AUs in new individuals. Our method demonstrates concurrent validity with manual MaqFACS coding and supports the use of automated MaqFACS.
Such automatic coding is useful not only for social- and affective-neuroscience research but also for monitoring animal health and welfare. The archive "autoMaqFACS_code.zip" contains Matlab code for autoMaqFACS classification.
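The released autoMaqFACS code is in Matlab; as a hedged illustration of the general approach it describes (supervised classification of video frames into AU labels from pre-extracted facial features), here is a minimal Python/scikit-learn sketch. The AU names, feature dimensionality, classifier choice, and the synthetic stand-in data are all assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in data: per-frame feature vectors (e.g. facial-landmark
# displacements) with one of six AU labels. A real pipeline would extract
# these features from video frames of the monkeys' faces.
AUS = ["AU1", "AU2", "AU9", "AU12", "AU25", "AU26"]  # illustrative AU names
n_per_class, n_features = 60, 20
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(n_per_class, n_features))
               for i in range(len(AUS))])
y = np.repeat(AUS, n_per_class)

# Train on one set of frames, evaluate on held-out frames. The paper's
# cross-individual generalization would instead hold out whole individuals.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

Holding out entire individuals (or species) at evaluation time, rather than random frames, is what distinguishes the cross-individual and cross-species generalization tested in the study from ordinary within-individual classification.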
The eye-gaze of others is a prominent social cue in primates and crucial for communication 1-7, and atypical gaze processing occurs in several conditions such as autism-spectrum-disorder (ASD) 1,9-14. The neural mechanisms that underlie eye-gaze processing remain poorly understood, and it is still debated whether these computations are performed in dedicated neural circuits or shared with non-social ones. In many species, eye-gaze signals a threat and elicits anxiety, yet it can also serve as a predictor of the outcome of the encounter: negative or positive 2,4,8. Here, we hypothesized, and found, that neural codes overlap between eye-gaze and valence. Monkeys participated in a modified version of the human-intruder-test 8,15 that includes direct and averted eye-gaze, interleaved with blocks of aversive and appetitive conditioning 16,17. We found that single neurons in the amygdala encode gaze 18, whereas neurons in the anterior-cingulate-cortex encode the social context 19,20 but not gaze. We identified a shared amygdala circuitry in which neural responses to averted and direct gaze parallel the responses to appetitive and aversive value, respectively. Importantly, we distinguish two shared coding mechanisms: a shared-intensity scheme used for gaze and the unconditioned stimulus, and a shared-activity scheme used for gaze and the conditioned stimulus. Shared intensity points to an overlap in circuitry, whereas shared activity requires correlated activity as well. Our results demonstrate that eye-gaze is coded as a signal of valence, yet also as the expected value of the interaction. The findings may offer new insights into the mechanisms that underlie the malfunction of eye-gaze processing in ASD and its comorbidity with impaired social skills and anxiety.
… Haran and Fanny Attar for MRI procedures. This work was supported by ISF #2352/19 and ERC-2016-CoG #724910 grants to R. Paz.
Main text

Recognizing and learning about potentially harmful or beneficial stimuli is crucial for the survival of all organisms. In humans, and in primates in general, facial expressions, and in particular the eye gaze of others, are a prominent and instructive signal 2-7. Averted or direct eye-gaze is a social signal that can indicate submissive vs. aggressive interactions, respectively. In agreement with this, eye-gaze was shown to elicit anxiety in primates 4,8 and to evoke responses in the amygdala 18,21-28, a brain region that serves as a hub for emotional responses in general and anxiety in particular 25,29,30. Moreover, gaze processing is disrupted in several neurodevelopmental and social disorders 1,9-12, and mainly in autism-spectrum-disorder (ASD), where abnormal activity of the amygdala is linked to gaze avoidance 13,14. However, eye-gaze is not only a valence signal by itself, but can also serve as a predictor of future outcomes: aversive if an intruder makes direct eye-contact, or potentially rewarding if the intruder avoids eye-contact. This is in line with the amygdala not only playing a role in signaling outcom...