Minimally invasive medical procedures have become increasingly common in today's healthcare practice. Images taken during such procedures largely show tissues of human organs, such as the mucosa of the gastrointestinal tract. These surfaces usually have a glossy appearance showing specular highlights. For many visual analysis algorithms, these distinct and bright visual features can become a significant source of error. In this article, we propose two methods to address this problem: (a) a segmentation method based on nonlinear filtering and colour image thresholding and (b) an efficient inpainting method. The inpainting algorithm eliminates the negative effect of specular highlights on other image analysis algorithms and also gives a visually pleasing result. The methods compare favourably to the existing approaches reported for endoscopic imaging. Furthermore, in contrast to the existing approaches, the proposed segmentation method is applicable to the widely used sequential RGB image acquisition systems.
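The segmentation-and-inpainting idea described above can be sketched in a few lines. The following is an illustrative approximation, not the authors' algorithm: the thresholds, the brightness/saturation criterion, and the diffusion-style fill are all assumptions made for the sake of a runnable example.

```python
import numpy as np

def highlight_mask(rgb, intensity_thresh=0.85, saturation_thresh=0.15):
    """Flag pixels that are both very bright and nearly colourless --
    a common signature of specular highlights on glossy, wet tissue.
    `rgb` is an (H, W, 3) array with values in [0, 1]."""
    rgb = rgb.astype(np.float64)
    mx = rgb.max(axis=-1)
    mn = rgb.min(axis=-1)
    saturation = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-9), 0.0)
    return (mx > intensity_thresh) & (saturation < saturation_thresh)

def inpaint_mean(image, mask, iterations=50):
    """Fill masked pixels by repeatedly averaging their 4-neighbourhood,
    a simple diffusion-style inpainting that smooths over the highlight."""
    out = image.astype(np.float64).copy()
    for _ in range(iterations):
        # Average of the four shifted copies; borders replicated via padding.
        padded = np.pad(out, ((1, 1), (1, 1), (0, 0)), mode="edge")
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[mask] = neigh[mask]  # only masked pixels are overwritten
    return out
```

A practical system would use a more careful colour-space criterion and a structure-preserving inpainting scheme, but the two-stage pattern (detect highlight mask, then fill from surrounding tissue) is the same.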
Abstract. Colonoscopy is one of the best methods for colon cancer screening. A variety of research groups have proposed methods for the automatic detection of polyps in colonoscopic images to support doctors during examination. However, the problem cannot yet be considered solved. The major drawback of many approaches is the amount and quality of images used for classifier training and evaluation. Our database consists of more than four hours of high-resolution video from colonoscopies, examined and labeled by medical experts. We applied four methods of texture feature extraction based on Grey-Level Co-occurrence and Local Binary Patterns. Using this data, we achieved classification results with an area under the ROC curve of up to 0.96.
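Of the two texture descriptors named above, Local Binary Patterns are the simpler to illustrate. The sketch below shows the basic 3x3 LBP operator and its histogram feature; it is a generic textbook formulation, not the specific variant used in the paper.

```python
import numpy as np

def lbp_image(gray):
    """Basic 3x3 Local Binary Pattern: each interior pixel is encoded as an
    8-bit code by thresholding its 8 neighbours against the centre value."""
    g = gray.astype(np.int64)
    c = g[1:-1, 1:-1]  # interior pixels only
    # Neighbour offsets, clockwise starting at the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        n = g[1 + dy: g.shape[0] - 1 + dy, 1 + dx: g.shape[1] - 1 + dx]
        codes |= (n >= c).astype(np.int64) << bit
    return codes

def lbp_histogram(gray):
    """256-bin normalised histogram of LBP codes -- a rotation-variant
    texture feature vector suitable for feeding into a classifier."""
    codes = lbp_image(gray)
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()
```

In a polyp-detection pipeline such histograms, computed over image patches, would serve as the feature vectors on which the classifier is trained.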
This paper describes the use of a Bayesian network to provide context-aware shared control of a robot mobility aid for the frail blind. The robot mobility aid, PAM-AID, is a "smart walker" that aims to assist the frail and elderly blind to walk safely indoors. The Bayesian network combines user input with high-level information derived from the sensors to provide a context-aware estimate of the user's current navigation goals. This context-aware action selection mechanism facilitates the use of a very simple, low-bandwidth user interface, which is critical for the elderly user group. The PAM-AID systems have been evaluated through a series of field trials involving over 30 potential users.
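The core inference step, combining user input with sensor-derived context into a posterior over navigation goals, can be sketched with a toy model. The goal names, evidence variables, and all probabilities below are hypothetical illustrations, not values from the PAM-AID system.

```python
# Toy goal inference in the spirit of a Bayesian network: a single goal
# node with sensor-derived and user-derived evidence variables.
PRIOR = {"follow_corridor": 0.7, "turn_at_doorway": 0.3}

# P(evidence = True | goal); illustrative likelihoods only.
LIKELIHOOD = {
    "door_detected": {"follow_corridor": 0.2, "turn_at_doorway": 0.9},
    "handle_turned": {"follow_corridor": 0.1, "turn_at_doorway": 0.8},
}

def goal_posterior(evidence):
    """Posterior over goals given a dict of boolean evidence variables,
    assuming evidence is conditionally independent given the goal."""
    post = dict(PRIOR)
    for name, observed in evidence.items():
        for goal in post:
            p = LIKELIHOOD[name][goal]
            post[goal] *= p if observed else (1.0 - p)
    total = sum(post.values())
    return {g: v / total for g, v in post.items()}
```

With this kind of posterior, the controller can act on the most probable goal, which is what lets the user interface stay simple: the user need only provide coarse, low-bandwidth input while the sensors supply the context.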
Purpose of Review Artificial intelligence (AI) offers huge potential in infection prevention and control (IPC). We explore its potential IPC benefits in epidemiology, laboratory infection diagnosis, and hand hygiene. Recent Findings AI has the potential to detect transmission events during outbreaks or predict high-risk patients, enabling the development of tailored IPC interventions. AI offers opportunities to enhance diagnostics with objective pattern recognition, standardize the diagnosis of infections with IPC implications, and facilitate the dissemination of IPC expertise. AI hand hygiene applications can deliver behavior change, though they require further evaluation in different clinical settings. However, staff can become dependent on automatic reminders, and performance returns to baseline if feedback is removed. Summary Advantages for IPC include speed, consistency, and the capability to handle extremely large datasets. However, many challenges remain; improving the availability of high-quality representative datasets and consideration of biases within preexisting databases are important challenges for future developments. AI in itself will not improve IPC; this requires culture and behavior change. Most studies to date assess performance retrospectively, so there is a need for prospective evaluation in the real-life, often chaotic, clinical setting. Close collaboration with IPC experts to interpret outputs and ensure clinical relevance is essential.
Hand washing is a critical activity in preventing the spread of infection in health-care environments and food preparation areas. Several guidelines recommend a hand washing protocol consisting of six steps that ensure that all areas of the hands are thoroughly cleaned. In this paper, we describe a novel approach that uses a computer vision system to measure the user's hand motions and verify that the hand washing guidelines are followed. A hand washing quality assessment system needs to know whether the hands are joined or separated, and it has to be robust to different lighting conditions, occlusions, reflections and changes in the color of the sink surface. This work presents three main contributions: a description of a system which delivers robust hand segmentation using a combination of color and motion analysis, a single multi-modal particle filter (PF) combined with a k-means-based clustering technique to track both hands and arms, and the implementation of a multi-class classification of hand gestures using a support vector machine ensemble. PF performance is discussed and compared with a standard Kalman filter estimator.
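The tracking component named above is based on a particle filter. A minimal bootstrap particle filter for a one-dimensional position illustrates the predict/weight/resample cycle; the paper's multi-modal, multi-target filter is considerably richer, and the motion model and noise parameters here are assumptions for illustration only.

```python
import numpy as np

def particle_filter(observations, n_particles=500,
                    motion_std=1.0, obs_std=1.0, seed=0):
    """Bootstrap particle filter tracking a 1-D position from noisy
    observations. Returns the posterior-mean estimate at each step."""
    rng = np.random.default_rng(seed)
    # Initialise particles around the first observation.
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for z in observations:
        # Predict: random-walk motion model.
        particles = particles + rng.normal(0.0, motion_std, n_particles)
        # Update: weight particles by the Gaussian observation likelihood.
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Resample: multinomial resampling keeps the sketch short
        # (systematic resampling would have lower variance).
        particles = rng.choice(particles, size=n_particles, p=weights)
        estimates.append(particles.mean())
    return np.array(estimates)
```

Unlike a Kalman filter, which the paper uses as a baseline, this sample-based posterior can represent multi-modal beliefs, which is why a particle filter suits tracking two easily confused targets such as a pair of hands.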