The aging population, the prevalence of chronic diseases, and outbreaks of infectious diseases are among the major challenges of present-day society. To address these unmet healthcare needs, especially the early prediction and treatment of major diseases, health informatics, which deals with the acquisition, transmission, processing, storage, retrieval, and use of health information, has emerged as an active area of interdisciplinary research. In particular, the acquisition of health-related information by unobtrusive sensing and wearable technologies is considered a cornerstone of health informatics. Sensors can be woven or integrated into clothing, accessories, and the living environment, such that health information can be acquired seamlessly and pervasively in daily living. Sensors can even be designed as stick-on electronic tattoos or printed directly onto human skin to enable long-term health monitoring. This paper aims to provide an overview of four emerging unobtrusive and wearable technologies that are essential to the realization of pervasive health information acquisition: (1) unobtrusive sensing methods, (2) smart textile technology, (3) flexible-stretchable-printable electronics, and (4) sensor fusion, and then to identify some future directions of research.
This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.
After decades of evolution, measuring instruments for quantitative gait analysis have become an important clinical tool for assessing pathologies manifested by gait abnormalities. However, such instruments tend to be expensive and require expert operation and maintenance, limiting their use to a small number of specialized centers. Consequently, gait analysis in most clinics today still relies on observation-based assessment. Recent advances in wearable sensors, especially inertial body sensors, have opened up a promising future for gait analysis. Not only can these sensors be adopted in clinical diagnosis and treatment procedures more easily than their current counterparts, but they can also monitor gait continuously outside clinics, hence providing seamless patient assessment from clinics to free-living environments. The purpose of this paper is to provide a systematic review of current techniques for quantitative gait analysis and to propose key metrics for evaluating both existing and emerging methods for quantifying the gait features extracted from wearable sensors. It aims to highlight key advances in this rapidly evolving research field and outline potential future directions for both research and clinical applications.
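To make the idea of extracting quantitative gait features from a wearable inertial sensor concrete, the sketch below detects step events as peaks in a vertical acceleration signal and derives two common gait metrics, cadence and step-time variability. This is a minimal illustration, not a method from the reviewed paper: the threshold, refractory period, and synthetic test signal are all illustrative assumptions.

```python
import numpy as np

def gait_metrics(accel, fs):
    """Estimate cadence (steps/min) and step-time variability (s) from a
    vertical acceleration signal (m/s^2) sampled at fs Hz.

    Illustrative approach: detect step impacts as local maxima above a
    crude adaptive threshold, then compute timing statistics.
    """
    sig = accel - np.mean(accel)            # remove gravity/DC offset
    thresh = 0.5 * np.max(np.abs(sig))      # crude adaptive threshold
    min_gap = int(0.4 * fs)                 # refractory period (~0.4 s)

    peaks, last = [], -min_gap
    for i in range(1, len(sig) - 1):
        if (sig[i] > thresh and sig[i] >= sig[i - 1]
                and sig[i] > sig[i + 1] and i - last >= min_gap):
            peaks.append(i)
            last = i

    step_times = np.diff(peaks) / fs        # seconds between steps
    cadence = 60.0 / np.mean(step_times)    # steps per minute
    variability = np.std(step_times)        # step-time variability (s)
    return cadence, variability

# Synthetic walking signal: ~2 steps per second for 10 s at 100 Hz
fs = 100
t = np.arange(0, 10, 1 / fs)
accel = 9.81 + 3.0 * np.maximum(np.sin(2 * np.pi * 2.0 * t), 0) ** 8
cadence, var = gait_metrics(accel, fs)
```

In practice, clinically validated pipelines use more robust event detection (e.g., filtering plus template matching) and report a richer feature set, but the flow above, from raw inertial signal to interpretable gait metrics, is the core of sensor-based gait analysis.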
The increasing popularity of wearable devices in recent years means that a diverse range of physiological and functional data can now be captured continuously for applications in sports, wellbeing, and healthcare. This wealth of information requires efficient methods of classification and analysis, where deep learning is a promising technique for large-scale data analytics. While deep learning has been successful in implementations that utilize high-performance computing platforms, its use on low-power wearable devices is limited by resource constraints. In this paper, we propose a deep learning methodology that combines features learned from inertial sensor data with complementary information from a set of shallow features to enable accurate and real-time activity classification. The design of this combined method aims to overcome some of the limitations present in a typical deep learning framework where on-node computation is required. To optimize the proposed method for real-time on-node computation, spectral-domain preprocessing is used before the data are passed to the deep learning framework. The classification accuracy of our proposed deep learning approach is evaluated against state-of-the-art methods using both laboratory and real-world activity datasets. Our results show the validity of the approach on different human activity datasets, outperforming other methods, including the two methods used within our combined pipeline. We also demonstrate that the computation times for the proposed method are consistent with the constraints of real-time on-node processing on smartphones and a wearable sensor platform.
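The pipeline described above, spectral-domain preprocessing of an inertial window, followed by learned features that are concatenated with hand-crafted shallow features, can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the window length, the tiny random dense layer standing in for a trained network, and the particular shallow statistics are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_preprocess(window):
    """Spectral-domain preprocessing: one-sided FFT magnitude of a 1-D
    inertial sensor window, giving a compact, shift-tolerant input."""
    spec = np.abs(np.fft.rfft(window))
    return spec / (np.linalg.norm(spec) + 1e-8)   # scale-normalize

def shallow_features(window):
    """Complementary hand-crafted features (illustrative choices)."""
    return np.array([window.mean(),
                     window.std(),
                     np.abs(np.diff(window)).mean()])

# Hypothetical single dense layer standing in for the learned deep model;
# a 128-sample window yields 65 one-sided FFT bins.
W = rng.normal(size=(16, 65))
b = rng.normal(size=16)

def combined_features(window):
    """Concatenate learned (deep) and hand-crafted (shallow) features."""
    deep = np.maximum(W @ spectral_preprocess(window) + b, 0.0)  # ReLU
    return np.concatenate([deep, shallow_features(window)])

window = rng.normal(size=128)        # stand-in accelerometer window
feat = combined_features(window)     # 16 deep + 3 shallow = 19 dims
```

The design intuition is that the FFT front end shrinks the input the network must process (helping on-node, real-time budgets), while the shallow statistics retain amplitude information that the scale-normalized spectrum discards.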