Ubiquitous blood pressure (BP) monitoring is needed to improve hypertension detection and control and is becoming feasible due to recent technological advances such as wearable sensing. Pulse transit time (PTT) represents a well-known, potential approach for ubiquitous BP monitoring. The goal of this review is to facilitate the achievement of reliable, ubiquitous BP monitoring via PTT. We explain the conventional BP measurement methods and their limitations; present models to summarize the theory of the PTT-BP relationship; outline the approach while pinpointing the key challenges; overview the previous work towards putting the theory to practice; make suggestions for best practice and future research; and discuss realistic expectations for the approach.
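The theory summarized in such reviews is commonly anchored in the Moens–Korteweg equation. As an illustration of the class of models involved (a textbook sketch, not necessarily the specific models of this review), combining Moens–Korteweg with an exponential pressure dependence of the arterial elastic modulus yields a logarithmic PTT–BP calibration relation:

```latex
% Moens--Korteweg: pulse wave velocity over an arterial path of length L,
% with wall thickness h, inner diameter d, blood density \rho, elastic modulus E
PWV = \frac{L}{PTT} = \sqrt{\frac{E h}{\rho d}},
\qquad E = E_0 e^{\alpha P}
% Solving for the pressure P gives a two-parameter calibration model in PTT:
P = \frac{1}{\alpha}\left[\ln\frac{\rho \, d \, L^{2}}{E_0 h} - 2\ln(PTT)\right]
```

In practice the constants are not known individually, so the relation is usually fit as P = a − b ln(PTT) with subject-specific calibration coefficients a and b.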
We describe a novel method to continuously monitor the pulse rate (PR) of patients in a neonatal intensive care unit (NICU) using videos taken with a high-definition (HD) webcam. We describe algorithms that determine PR from videoplethysmographic (VPG) signals extracted from multiple regions of interest (ROIs) that are simultaneously available within the field of view of the camera and in which the cardiac signal is registered. We detect motion from video images and compensate for motion artifacts in each ROI. Preliminary clinical results are presented on 8 neonates, each with 30 minutes of uninterrupted video. Comparisons to hospital equipment indicate that the proposed technology can meet medical-industry standards while offering improved patient comfort and ease of use for practitioners when instrumented with proper hardware.
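The core of PR determination from a VPG signal can be sketched as picking the dominant spectral peak inside a plausible heart-rate band. The sketch below (a minimal illustration, not the paper's algorithm; function names and the neonatal band limits are assumptions) uses a simple FFT:

```python
import numpy as np

def estimate_pulse_rate(vpg, fs, lo_bpm=90.0, hi_bpm=240.0):
    """Estimate pulse rate (bpm) from a VPG signal as the dominant
    spectral peak inside a plausible neonatal heart-rate band."""
    vpg = np.asarray(vpg, dtype=float)
    vpg = vpg - vpg.mean()                       # remove the DC component
    spectrum = np.abs(np.fft.rfft(vpg))
    freqs = np.fft.rfftfreq(vpg.size, d=1.0 / fs)
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_freq                      # Hz -> beats per minute

# Usage: a synthetic 2.5 Hz (150 bpm) pulse sampled at 30 fps for 30 s
fs = 30.0
t = np.arange(0, 30.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 2.5 * t) + 0.1 * rng.standard_normal(t.size)
print(round(estimate_pulse_rate(signal, fs)))   # -> 150
```

A real pipeline would additionally detrend, bandpass-filter, and fuse estimates across the multiple ROIs, but the spectral-peak step is the same.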
Nonobtrusive pulse rate measurement using a webcam is considered. We demonstrate how state-of-the-art algorithms based on independent component analysis suffer from a sorting problem that hinders their performance, and propose a novel algorithm based on constrained independent component analysis to improve performance. We present how the proposed algorithm extracts a photoplethysmography signal and resolves the sorting problem. In addition, we perform a comparative study between the proposed algorithm and state-of-the-art algorithms over 45 video streams using a finger probe oximeter for reference measurements. The proposed algorithm provides improved accuracy: the root mean square error is decreased from 20.6 and 9.5 beats per minute (bpm) for existing algorithms to 3.5 bpm for the proposed algorithm. An error of 3.5 bpm is within the inaccuracy expected from the reference measurements. This implies that the proposed algorithm provides accuracy on par with the finger probe oximeter.
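The sorting problem arises because ICA recovers sources in arbitrary order and scale, so the pulse component must be identified after unmixing. A common heuristic workaround (sketched below as an illustration; this is not the paper's constrained-ICA method, and the band limits are assumptions) is to select the component whose spectrum is most concentrated in the physiologic heart-rate band:

```python
import numpy as np

def select_pulse_component(components, fs, lo_bpm=40.0, hi_bpm=200.0):
    """Given candidate ICA components (n_components x n_samples), return the
    index of the one with the largest normalized spectral peak inside the
    plausible heart-rate band. ICA orders its outputs arbitrarily, so this
    selection step resolves the 'sorting problem' heuristically."""
    freqs = np.fft.rfftfreq(components.shape[1], d=1.0 / fs)
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    scores = []
    for comp in components:
        spec = np.abs(np.fft.rfft(comp - comp.mean()))
        scores.append(spec[band].max() / (spec.sum() + 1e-12))  # peak prominence
    return int(np.argmax(scores))

# Usage: two candidate components; only one is pulse-like (1.2 Hz = 72 bpm)
fs = 30.0
t = np.arange(0, 20.0, 1.0 / fs)
rng = np.random.default_rng(0)
pulse = np.sin(2 * np.pi * 1.2 * t)
noise = rng.standard_normal(t.size)
components = np.vstack([noise, pulse])
print(select_pulse_component(components, fs))  # -> 1 (the pulse-like component)
```

Constrained ICA instead builds the selection criterion into the unmixing itself, avoiding this post-hoc step; the heuristic above only illustrates why some criterion is needed.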
We propose a novel approach to segment hand regions in egocentric video that requires no manual labeling of training samples. The user wearing a head-mounted camera is prompted to perform a simple gesture during an initial calibration step. A combination of color and motion analysis that exploits knowledge of the expected gesture is applied on the calibration video frames to automatically label hand pixels in an unsupervised fashion. The hand pixels identified in this manner are used to train a statistical-model-based hand detector. Superpixel region growing is used to perform segmentation refinement and improve robustness to noise. Experiments show that our hand detection technique based on the proposed on-the-fly training approach significantly outperforms state-of-the-art techniques with respect to accuracy and robustness on a variety of challenging videos. This is due primarily to the fact that training samples are personalized to a specific user and environmental conditions. We also demonstrate the utility of our hand detection technique to inform an adaptive video sampling strategy that improves both computational speed and accuracy of egocentric action recognition algorithms. Finally, we offer an egocentric video dataset of an insulin self-injection procedure with action labels and hand masks that can support future research on both hand detection and egocentric action recognition.
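The statistical-model idea can be illustrated with a single-Gaussian color model fit to the automatically labeled hand pixels and applied with a Mahalanobis-distance test. This is a minimal stand-in sketch, not the paper's detector; the function names, threshold, and synthetic colors are all assumptions:

```python
import numpy as np

def fit_color_model(hand_pixels):
    """Fit a single Gaussian to auto-labeled hand pixel colors
    (n_pixels x 3 RGB array); a stand-in for a statistical color model."""
    mean = hand_pixels.mean(axis=0)
    cov = np.cov(hand_pixels, rowvar=False) + 1e-6 * np.eye(3)  # regularize
    return mean, np.linalg.inv(cov)

def classify_pixels(pixels, mean, inv_cov, threshold=9.0):
    """Label a pixel 'hand' when its squared Mahalanobis distance to the
    model falls below a (hypothetical) threshold."""
    d = pixels - mean
    dist2 = np.einsum('ij,jk,ik->i', d, inv_cov, d)  # batched quadratic form
    return dist2 < threshold

# Usage: synthetic skin-like vs. background colors (RGB in [0, 255])
rng = np.random.default_rng(1)
hand = rng.normal([200, 150, 130], 5.0, size=(500, 3))   # skin-toned cluster
background = rng.normal([60, 90, 60], 5.0, size=(500, 3))
mean, inv_cov = fit_color_model(hand)
print(classify_pixels(background[:5], mean, inv_cov))    # all False
```

In the full method this per-pixel decision would be refined by superpixel region growing, which smooths the mask and improves robustness to noise.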