BACKGROUND: Current pain assessment methods in youth are suboptimal and vulnerable to bias and underrecognition of clinical pain. Facial expressions are a sensitive, specific biomarker of the presence and severity of pain, and computer vision (CV) and machine-learning (ML) techniques enable reliable, valid measurement of pain-related facial expressions from video. We developed and evaluated a CVML approach to measure pain-related facial expressions for automated pain assessment in youth. METHODS: A CVML-based model for assessment of pediatric postoperative pain was developed from videos of 50 neurotypical youth 5 to 18 years old in both endogenous/ongoing and exogenous/transient pain conditions after laparoscopic appendectomy. Model accuracy was assessed against children's self-reported pain ratings and time since surgery, and compared with by-proxy parent and nurse estimates of observed pain in youth. RESULTS: Model detection of pain versus no-pain demonstrated good-to-excellent accuracy (area under the receiver operating characteristic curve, 0.84-0.94) in both ongoing and transient pain conditions. Model detection of pain severity demonstrated moderate-to-strong correlations (r = 0.65-0.86 within subjects; r = 0.47-0.61 across subjects) for both pain conditions. The model performed equivalently to nurses but not as well as parents in detecting pain versus no-pain conditions, yet performed equivalently to parents in estimating pain severity. Nurses were more likely than the model to underestimate youth self-reported pain ratings. Demographic factors did not affect model performance. CONCLUSIONS: CVML pain assessment models derived from automatic facial expression measurements demonstrated good-to-excellent accuracy in binary pain classifications, strong correlations with patient self-reported pain ratings, and parent-equivalent estimation of children's pain levels over typical pain trajectories in youth after appendectomy.
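The binary pain-versus-no-pain accuracy reported above is an area under the ROC curve (AUC). As a minimal illustration of what that metric measures, the sketch below computes AUC from hypothetical model scores via the Mann-Whitney formulation; the labels and scores are invented for the example and are not data from the study.

```python
# Minimal sketch: AUC for a binary pain vs. no-pain classifier.
# Labels and scores below are hypothetical, not study data.

def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive example scores higher than a randomly
    chosen negative example (ties count as half a win)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]              # 1 = pain, 0 = no pain
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]  # model-estimated pain probability
print(round(auc(labels, scores), 2))     # 0.89
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation, so the study's 0.84-0.94 range indicates strong separation of pain from no-pain observations.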
WHAT'S KNOWN ON THIS SUBJECT: Clinical pain assessment methods in youth are vulnerable to underestimation bias and underrecognition. Facial expressions are sensitive, specific biomarkers of the presence and severity of pain. Computer vision-based pattern recognition enables measurement of pain-related facial expressions from video. WHAT THIS STUDY ADDS: This study demonstrates initial validity for developing computer vision algorithms for automated pain assessment in children. The system developed and tested in this study could provide standardized, continuous, and valid patient monitoring that is potentially scalable. Mr Sikka performed the machine learning under the guidance of Dr Bartlett, drafted the initial manuscript, and reviewed and revised the manuscript; Mr Ahmed carried out a portion of the initial analyses and reviewed and revised the manuscript; Dr Diaz performed data collection, performed a portion of the initial analyses, and reviewed and revised the manuscript; Drs Craig and Goodwin reviewed all analyses, and critically reviewed and revised the manuscript; Drs Bartlett and Huang concep...
Objective pain assessment is required for appropriate pain management in the clinical setting. However, the clinical gold standard for pain assessment relies on subjective methods. Automated pain detection from physiological data may provide important objective information to better standardize pain assessment. Specifically, electrodermal activity (EDA) can identify features of stress and anxiety induced by varying pain levels. However, notable variability in EDA measurement exists, and research to date has demonstrated sensitivity but a lack of specificity in pain assessment. In this paper, we use timescale decomposition (TSD) to extract salient features from EDA signals, with the goal of developing an accurate, automated EDA-based algorithm that distinguishes pain from no-pain conditions with both sensitivity and specificity.
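To make the idea of extracting features from an EDA signal at multiple timescales concrete, the sketch below summarizes a signal with per-window statistics at several window lengths. This is a generic windowed decomposition written for illustration; it is not the authors' specific TSD algorithm, and the signal values are hypothetical.

```python
# Illustrative sketch: multiscale windowed features from an EDA signal.
# A generic decomposition for illustration only, not the paper's TSD method.

def timescale_features(signal, scales=(4, 8, 16)):
    """For each timescale (window length in samples), summarize the signal
    with the per-window mean and range. Long windows capture the slow tonic
    skin-conductance level; short windows capture faster phasic responses."""
    feats = {}
    for w in scales:
        windows = [signal[i:i + w] for i in range(0, len(signal) - w + 1, w)]
        feats[w] = [(sum(win) / w, max(win) - min(win)) for win in windows]
    return feats

eda = [0.1, 0.1, 0.2, 0.6, 0.9, 0.8, 0.5, 0.3] * 4  # hypothetical samples
feats = timescale_features(eda)
print(len(feats[4]), len(feats[8]), len(feats[16]))  # 8 4 2
```

Feature vectors like these, concatenated across scales, are the kind of input a pain/no-pain classifier could be trained on.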
Despite recent advances in asthma management with anti–IL-5 therapies, many patients have eosinophilic asthma that remains poorly controlled. IL-3 shares a common β subunit receptor with both IL-5 and GM-CSF but, through α-subunit–specific properties, uniquely influences eosinophil biology and may serve as a potential therapeutic target. We aimed to globally characterize the transcriptomic profiles of GM-CSF, IL-3, and IL-5 stimulation on human circulating eosinophils and identify differences in gene expression using advanced statistical modeling. Human eosinophils were isolated from the peripheral blood of healthy volunteers and stimulated with either GM-CSF, IL-3, or IL-5 for 48 h. RNA was then extracted and bulk sequencing was performed. DESeq analysis identified differentially expressed genes, and weighted gene coexpression network analysis independently defined modules of genes that are highly coexpressed. GM-CSF, IL-3, and IL-5 commonly upregulated 252 genes and downregulated 553 genes, producing a proinflammatory and survival phenotype that was predominantly mediated through TWEAK signaling. IL-3 stimulation yielded the largest number of differentially expressed genes that were also highly coexpressed (n = 119). These genes were enriched in pathways involving JAK/STAT signaling. GM-CSF and IL-5 stimulation demonstrated redundancy in eosinophil gene expression. In conclusion, IL-3 produces a distinct eosinophil gene expression program among the β-chain receptor cytokines. IL-3–upregulated genes may provide a foundation for research into therapeutics for patients with eosinophilic asthma who do not respond to anti–IL-5 therapies.