Nowadays, the use of wearable devices is spreading across different fields of application, such as healthcare, digital health, and sports monitoring. In sports applications, the present trend is to continuously monitor athletes’ physiological parameters during training or competitions to maximize performance and support coaches. This paper aims to evaluate the performance in heart rate assessment, in terms of accuracy and precision, of both wrist-worn and chest-strap commercial devices used during swimming activity, considering a test population of 10 expert swimmers. Three devices were employed: the Polar H10 cardiac belt and the Polar Vantage V2 and Garmin Venu Sq smartwatches. The cardiac belt was used as the reference device to validate the data measured by the two smartwatches. Tests were performed in both dry and wet conditions, considering walking/running on a treadmill and different swimming styles in water, respectively. Measurement accuracy and precision were evaluated through standard methods, i.e., the Bland–Altman plot, analysis of deviations, and Pearson’s correlation coefficient. Results show that both precision and accuracy worsen during swimming activity (with an absolute increase of the measurement deviation in the range of 13–56 bpm for the mean value and 49–52 bpm for the standard deviation), showing that water and arm movement act as relevant interference inputs. Moreover, it was found that wearable performance decreases as activity intensity increases, highlighting the need for specific research on wearable applications in water, with a particular focus on swimming-related sports activities.
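The validation methods named above (Bland–Altman analysis and Pearson’s correlation coefficient) can be sketched as follows. This is a minimal illustration on hypothetical paired heart-rate samples, not the paper’s data: the two arrays stand in for simultaneous readings from the reference chest strap and a wrist-worn device.

```python
import numpy as np

# Hypothetical paired heart-rate samples (bpm): reference chest strap
# vs. wrist-worn smartwatch, recorded at the same instants.
reference = np.array([112.0, 118.0, 125.0, 131.0, 140.0, 138.0, 129.0])
wrist = np.array([110.0, 121.0, 119.0, 128.0, 148.0, 132.0, 127.0])

# Bland–Altman statistics: bias (mean of the differences) and the
# 95% limits of agreement (bias +/- 1.96 * SD of the differences).
diff = wrist - reference
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

# Pearson's correlation coefficient between the two measurement series.
r = np.corrcoef(reference, wrist)[0, 1]

print(f"bias = {bias:.2f} bpm, LoA = [{loa_low:.2f}, {loa_high:.2f}] bpm, r = {r:.2f}")
```

In the Bland–Altman plot itself, each pair would be drawn as a point (mean of the two readings on the x-axis, their difference on the y-axis), with horizontal lines at the bias and the two limits of agreement.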
Background: Heartbeat detection is a crucial step in several clinical fields. The Laser Doppler Vibrometer (LDV) is a promising non-contact measurement technique for heartbeat detection. The aim of this work is to assess whether machine learning can be used to detect heartbeats from the carotid LDV signal. Methods: The performances of Support Vector Machine (SVM), Decision Tree (DT), Random Forest (RF), and K-Nearest Neighbor (KNN) classifiers were compared using leave-one-subject-out cross-validation as the testing protocol on an LDV dataset collected from 28 subjects. The classification was conducted on LDV signal windows, labeled as beat if they contained a beat and as no-beat otherwise. The labeling procedure was performed using electrocardiography as the gold standard. Results: For the beat class, the f1-score (f1) values were 0.93, 0.93, 0.95, and 0.96 for RF, DT, KNN, and SVM, respectively. No statistical differences were found between the classifiers. When testing the SVM on the full-length (10 min long) LDV signals, to simulate a real-world application, we achieved a median macro-f1 of 0.76. Conclusions: Using machine learning for heartbeat detection from carotid LDV signals showed encouraging results, representing a promising step in the field of contactless cardiovascular signal analysis.
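The evaluation protocol described above (leave-one-subject-out cross-validation with a per-class f1-score) can be sketched in a self-contained way. Everything here is an assumption for illustration: the data are synthetic stand-ins for labeled LDV windows, and a nearest-centroid classifier replaces the paper’s SVM/DT/RF/KNN models so the example needs only NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the LDV dataset: 50 windows from each of 4
# subjects, each window summarized by 3 features; label 1 = beat,
# label 0 = no-beat. The feature extraction is purely hypothetical.
subjects = np.repeat([0, 1, 2, 3], 50)
labels = rng.integers(0, 2, size=subjects.size)
features = rng.normal(size=(subjects.size, 3)) + 2.0 * labels[:, None]

def f1(y_true, y_pred, positive=1):
    """f1-score for one class: harmonic mean of precision and recall."""
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    if tp == 0:
        return 0.0
    precision, recall = tp / (tp + fp), tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Leave-one-subject-out: each subject is held out once as the test set,
# so no windows from the test subject ever appear in the training data.
scores = []
for held_out in np.unique(subjects):
    train, test = subjects != held_out, subjects == held_out
    # Nearest-centroid classifier as a minimal stand-in for the
    # SVM/DT/RF/KNN models compared in the paper.
    centroids = np.stack(
        [features[train & (labels == c)].mean(axis=0) for c in (0, 1)]
    )
    dists = np.linalg.norm(features[test][:, None, :] - centroids[None], axis=2)
    preds = dists.argmin(axis=1)
    scores.append(f1(labels[test], preds))

print("beat-class f1 per held-out subject:", [round(s, 2) for s in scores])
```

Holding out whole subjects, rather than random windows, is what makes the estimate honest for a real-world deployment: the classifier is always scored on a person it has never seen.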
Background: Surfactant dosing and effective delivery could affect continuous positive airway pressure (CPAP) failure. Nevertheless, information on exogenous surfactant dosing with current administration methods is limited. Objective: To describe the effect of 100 or 200 mg/kg of surfactant as first-line treatment of respiratory distress syndrome in preterm infants of less than 32 weeks gestation. Study Design: A retrospective single-center cohort study comparing two epochs, before and after switching from 100 to 200 mg/kg surfactant therapy. Results: Six hundred and fifty-eight of the 1615 infants of less than 32 weeks were treated with surfactant: 282 received 100 mg/kg (S-100) and 376 received 200 mg/kg (S-200). There were no differences between S-100 and S-200 in perinatal data, including prenatal corticosteroids, medication use, age at first surfactant administration, and respiratory severity before surfactant. The S-200 group vs. S-100 had fewer retreatments (17.0% vs. 47.2%, p < 0.001) and a shorter duration of oxygen therapy and mechanical ventilation (315 vs. 339 h, p = 0.018; 37 vs. 118 h, p < 0.001, respectively). There was no difference in postnatal corticosteroid use (S-200 10.0% vs. S-100 11.0%, p = 0.361). Bronchopulmonary dysplasia (BPD) was significantly lower in S-200 vs. S-100 when comparing either the 4- or 6-year periods before and after the dose switch (29.4% vs. 15.7%, p = 0.003, and 18.7% vs. 27.3%, p = 0.024, respectively). Conclusions: The switch from 100 to 200 mg/kg was associated with a marked reduction in the need for surfactant redosing, respiratory support, and BPD.