BACKGROUND: Assessing the presence of pharyngeal residue in the pyriform sinus and epiglottic vallecula is important because insufficient pharyngeal clearance is a risk factor for aspiration pneumonia. Better methods are needed to visualize the pyriform sinus and epiglottic vallecula with ultrasound. The aim of this study was to establish an ultrasound method for visualizing the pyriform sinus and epiglottic vallecula to detect pharyngeal residue. METHODS: We used real-time virtual sonography (ie, a fusion of magnetic resonance imaging and ultrasound imaging) as the scanning method to visualize the pyriform sinus and epiglottic vallecula without residue in 4 healthy individuals. Using the established ultrasound methodology together with fiberoptic endoscopic evaluation of swallowing, 35 subjects with dysphagia were studied to investigate the performance of ultrasound in detecting pharyngeal residue. RESULTS: The fusion ultrasound images showed that transverse scans with a linear array transducer at the level of the laryngeal prominence and above the hyoid bone can be used to visualize the pyriform sinus and the epiglottic vallecula, respectively. We obtained 238 ultrasound images of the pyriform sinus from 35 subjects and 82 images of the epiglottic vallecula from 26 of the 35 subjects. The ultrasound images, compared against fiberoptic endoscopic evaluation of swallowing, showed that areas of high echogenicity in the pyriform sinus and epiglottic vallecula are related to the presence of pharyngeal residue. The presence of high-echogenicity areas yielded a sensitivity of 92.0% and specificity of 71.9% for detecting pharyngeal residue in the pyriform sinus, and a sensitivity of 86.7% and specificity of 63.6% for detecting pharyngeal residue in the epiglottic vallecula. CONCLUSIONS: Transverse ultrasound scans at the level of the laryngeal prominence and above the hyoid bone enable the visualization of the pyriform sinus, epiglottic vallecula, and pharyngeal residue.
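The diagnostic figures above follow from the standard definitions of sensitivity and specificity against a reference standard (here, fiberoptic endoscopic evaluation of swallowing). A minimal sketch of the computation; the confusion-matrix counts below are hypothetical, chosen only to illustrate how the reported pyriform-sinus percentages arise, and are not taken from the study:

```python
# Sensitivity and specificity from confusion-matrix counts.
# TP/FN/TN/FP values here are HYPOTHETICAL, for illustration only.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: residue correctly detected / all residue-positive cases."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: absence correctly ruled out / all residue-negative cases."""
    return tn / (tn + fp)

# Hypothetical counts reproducing 92.0% sensitivity and 71.9% specificity:
print(round(sensitivity(tp=46, fn=4) * 100, 1))   # 92.0
print(round(specificity(tn=23, fp=9) * 100, 1))   # 71.9
```

Note that sensitivity and specificity are computed over different denominators (residue-present vs. residue-absent images), so neither alone summarizes overall accuracy.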
The classification of ultrasound (US) findings of pressure injury is important for selecting the appropriate treatment and care based on the state of the deep tissue, but it depends on the operator's skill in image interpretation. Therefore, US for pressure injury is a procedure that can only be performed by a limited number of highly trained medical professionals. This study aimed to develop an automatic US image classification system for pressure injury, based on deep learning, that can be used by non-specialists without advanced skill in image interpretation. A total of 787 training images were collected at two hospitals in Japan. The US images of pressure injuries were assessed using the deep learning-based classification tool according to the following visual evidence: unclear layer structure, cobblestone-like pattern, cloud-like pattern, and anechoic pattern. Accuracy was then assessed using two measures: detection performance, and the values of the intersection over union (IoU) and Dice score. A total of 73 images were analyzed as test data. Of the 73 images with an unclear layer structure, 7 showed a cobblestone-like pattern, 14 showed a cloud-like pattern, and 15 showed an anechoic area. All four US findings showed a detection performance of 71.4–100%, with mean values of 0.38–0.80 for IoU and 0.51–0.89 for the Dice score. The results show that US findings and deep learning-based classification can be used to detect deep tissue pressure injuries.
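IoU and the Dice score are the two standard region-overlap metrics used above to compare a predicted segmentation against the ground truth. A minimal sketch of how they are computed over sets of segmented pixel coordinates; the pixel sets below are illustrative, not study data:

```python
# IoU and Dice score between a predicted and a ground-truth pixel region,
# each represented as a set of (row, col) coordinates.

def iou(pred: set, truth: set) -> float:
    """Intersection over union: |A ∩ B| / |A ∪ B|."""
    union = len(pred | truth)
    return len(pred & truth) / union if union else 0.0

def dice(pred: set, truth: set) -> float:
    """Dice score: 2·|A ∩ B| / (|A| + |B|)."""
    total = len(pred) + len(truth)
    return 2 * len(pred & truth) / total if total else 0.0

# Illustrative 2x2 regions overlapping in one row:
pred = {(0, 0), (0, 1), (1, 0), (1, 1)}
truth = {(1, 0), (1, 1), (2, 0), (2, 1)}
print(round(iou(pred, truth), 3))   # 0.333
print(dice(pred, truth))            # 0.5
```

The two metrics are monotonically related (Dice = 2·IoU / (1 + IoU)), so Dice is always at least as large as IoU for the same prediction, which is consistent with the study's Dice range (0.51–0.89) exceeding its IoU range (0.38–0.80).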