The objective measurement of subjective, multi-dimensionally experienced pain remains an unsolved problem. Although verbal methods (e.g., pain scales and questionnaires) are commonly used to measure clinical pain, they tend to lack objectivity, reliability, or validity when applied to mentally impaired individuals. Biopotential and behavioral parameters may offer a solution. Behavioral coding systems already exist, but they are either very costly, time-consuming, or insufficiently evaluated. In this context, we collected a database of biopotentials to advance an automated pain recognition system, determine its theoretical testing quality, and optimize its performance. For this purpose, participants were subjected to painful heat stimuli under controlled conditions. One hundred thirty-five features were extracted from the mathematical groupings of amplitude, frequency, stationarity, entropy, linearity, and variability. The following features proved the most selective: (1) electromyography (EMG) corrugator peak-to-peak amplitude, (2) corrugator Shannon entropy, and (3) heart rate variability slope RR. Individual-specific calibration allows feature patterns to be adjusted, resulting in significantly more accurate pain detection rates. The objective measurement of pain in patients will provide valuable information for the clinical team and may aid the objective assessment of treatment (e.g., effectiveness of drugs for pain reduction, information on surgical indication, and quality of care provided to patients).
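Two of the feature groups named above, amplitude (peak-to-peak) and entropy (Shannon entropy), can be sketched in a few lines. This is a minimal illustration: the function names, bin count, and synthetic signal are assumptions for demonstration, not the study's actual feature definitions.

```python
import numpy as np

def peak_to_peak(signal: np.ndarray) -> float:
    """Amplitude feature: difference between signal maximum and minimum."""
    return float(np.max(signal) - np.min(signal))

def shannon_entropy(signal: np.ndarray, n_bins: int = 16) -> float:
    """Entropy feature: Shannon entropy (in bits) of the amplitude histogram.
    The bin count is an illustrative choice, not the study's parameter."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-np.sum(p * np.log2(p)))

# Illustrative use on a synthetic "corrugator EMG" trace
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 1.0, size=2000)
features = {
    "corrugator_p2p": peak_to_peak(emg),
    "corrugator_entropy": shannon_entropy(emg),
}
```

A constant signal yields zero entropy, while a noisy signal approaches the maximum of log2(n_bins) bits, which is what makes entropy useful for separating quiescent from active muscle periods.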
Background: The working conditions at universities and hospitals are reported to be stressful. Several national and international studies have investigated occupational stress in hospitals. However, scientific studies at colleges and universities addressing psychosocial stress factors and their potential consequences are scarce. In this context, the consequences and correlates of work-family conflict, in particular, are currently uninvestigated. The aim of our study was to assess data on psychosocial stress in the context of the compatibility of work and family. Methods: Data were gathered through a cross-sectional study, N = 844 (55% female, 41% male), of university staff (42.3% scientists, 14.3% physicians, 19.4% employees in administration, and 19.3% employees in service). Participants filled out questionnaires on their personal data and their work and private life conditions. For this purpose, we used the Work-Family and Family-Work Conflict Scales, Effort-Reward Inventory and Overcommitment Scale (ERI, OC), Patient Health Questionnaire (PHQ-4), short-form Maslach Burnout Inventory (MBI), and questions on subjective health. Statistical analyses were performed using SPSS 22. Results: We found high levels of stress parameters in the total sample: extra work (83%), fixed-term work contracts (53%), overcommitment (OC, 26%), effort-reward imbalance (18%, ERI ratio > cut-off 0.715), work-family conflict (WFC, 35%), and family-work conflict (FWC, 39%). As hypothesized, we found significant correlations of both WFC and FWC with psychosocial work strain (ERI ratio) as well as overcommitment (OC). Mental and somatic health parameters also correlated significantly and positively with WFC and FWC. Using a regression analysis (N = 844), we identified WFC as a predictor of burnout, while emotional exhaustion, extra work, and overcommitment were identified as predictors of WFC and FWC.
Discussion: The results of our study point toward deficits in the compatibility of work life and private life in the work fields of science, colleges, and universities. Furthermore, we found indicators that work-family conflicts (interrole conflicts) have an impact on mental and somatic health. These work-family conflicts should be targets for prevention and intervention measures aimed at improving the work-life balance and the mental and somatic wellbeing of employees.
Background: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this end, humans have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one technique that enables machines to access human affective states. Numerous studies have investigated the effects of emotional valence on facial EMG activity captured over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle). Arousal, however, has received much less research attention. In the present study, we sought to identify intense valence and arousal affective states via facial EMG activity. Methods: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated at length. One hundred and thirteen participants were exposed to these stimuli while facial EMG was recorded. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM). Results: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% for the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) in all individuals. Classification accuracy differed significantly between senior and young adults, but not between female and male participants. Conclusion: Our research provides robust evidence for the recognition of intense valence and arousal affective states in young and senior adults.
These findings contribute to the successful future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).
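The classification step, feature vectors per trial fed to an SVM, can be sketched as follows. The synthetic feature vectors, class separations, and pipeline parameters are illustrative assumptions; this is not the study's data or exact configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_per_class, n_features = 60, 16  # 16 features per trial, as in the study
labels = ["baseline", "0VLA", "PVHA", "PVLA", "NVHA", "NVLA"]

# Synthetic feature vectors: each class drawn around a distinct mean
# (purely illustrative; real EMG features would come from recorded signals)
X = np.vstack([rng.normal(loc=i, scale=0.8, size=(n_per_class, n_features))
               for i in range(len(labels))])
y = np.repeat(labels, n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

Standardizing before the SVM matters in practice because EMG-derived features (amplitudes, frequencies, entropies) live on very different scales, and RBF kernels are sensitive to that.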
Pain assessment can benefit from observation of pain behaviors, such as guarding or facial expression, and observational pain scales are widely used in clinical practice with nonverbal patients. However, little is known about head movements and postures in the context of pain. In this regard, we analyzed videos from three publicly available datasets. The BioVid dataset was recorded with healthy participants subjected to painful heat stimuli. In the BP4D dataset, healthy participants performed a cold-pressor test and several other tasks (meant to elicit emotion). The UNBC dataset videos show shoulder pain patients during range-of-motion tests of their affected and unaffected limbs. In all videos, participants were sitting in an upright position. We studied head movements and postures that occurred during the painful and control trials by measuring head orientation from video over time, followed by analyzing posture and movement summary statistics and the occurrence frequencies of typical postures and movements. We found significant differences between pain and control trials using analyses of variance and binomial tests. In BioVid and BP4D, pain was accompanied by head movements and postures that tended to be oriented downward or toward the pain site. We also found differences in movement range and speed in all three datasets. The results suggest that head movements and postures should be considered in pain assessment and research. As additional pain indicators, they might improve pain management whenever behavior is assessed, especially in nonverbal individuals such as infants or patients with dementia. However, more research is needed first to identify specific head movements and postures in pain patients.
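The movement-range and speed summary statistics mentioned above can be sketched for a single head-orientation channel. The function name, frame rate, and synthetic pitch trace are illustrative assumptions, not the study's actual analysis code.

```python
import numpy as np

def movement_stats(angles: np.ndarray, fps: float = 25.0) -> dict:
    """Summary statistics for one head-orientation channel
    (e.g., pitch in degrees, one value per video frame)."""
    speed = np.abs(np.diff(angles)) * fps  # degrees/second between frames
    return {
        "range": float(np.ptp(angles)),     # movement range (max - min)
        "mean_speed": float(speed.mean()),  # average angular speed
        "max_speed": float(speed.max()),
    }

# Illustrative pitch trace: slow downward drift plus a small oscillation
t = np.linspace(0.0, 4.0, 101)
pitch = -5.0 * t + 0.5 * np.sin(2 * np.pi * 3 * t)
stats = movement_stats(pitch, fps=25.0)
```

Per-trial statistics like these can then be compared between painful and control trials with standard tests such as an analysis of variance.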
The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled for by using only pictures of female models with a straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of the duration of angry faces could only be found when the model's gaze was oriented toward the observer. We aimed to replicate this effect for face direction. Moreover, we explored the effect of face direction on the perceived duration of sad faces. Controlling for the sex of the face model and of the participant, female and male participants rated the duration of neutral, angry, and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as induced arousal, social relevance, and an evolutionary context.