In this paper, we investigate nine different visual representations of gaze in a competitive digital game setting. We evaluate the ability of spectators to infer a player's intentions in the game for each visual representation. Our results show that spectators have a remarkable ability to infer intent accurately using all nine visualizations, but that visualizations with certain characteristics were more comprehensible and more readily revealed the player's intent. The real-time Heatmap visualization was the most highly preferred by participants and the most effective in revealing intent, due to its ability to balance real-time gaze information with a persistent summary of recent gaze behaviour. Our findings show that eye-tracking visualization can enable playful interactions in competitive games based on players' ability to interpret opponents' attention and intention through gaze information.
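The balance the Heatmap visualization strikes, showing the current gaze point while retaining a fading summary of recent fixations, can be sketched as an exponentially decaying accumulation map. This is a minimal illustration under assumed parameters (grid size, decay factor, and Gaussian splat width are not from the paper), not the study's actual implementation:

```python
import numpy as np

def update_heatmap(heatmap, gaze_xy, decay=0.95, sigma=2.0):
    """Blend one gaze sample into a decaying heatmap.

    Older gaze evidence fades by `decay` each frame, so the map shows
    both the current fixation and a persistent summary of recent gaze.
    """
    h, w = heatmap.shape
    x, y = gaze_xy  # x = column, y = row
    ys, xs = np.mgrid[0:h, 0:w]
    # Gaussian "splat" centred on the new gaze sample
    splat = np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heatmap * decay + splat

# Usage: accumulate a few gaze samples on a 32x32 grid; the repeatedly
# fixated point (8, 8) ends up hotter than the single sample at (24, 20).
hm = np.zeros((32, 32))
for pt in [(8, 8), (8, 8), (24, 20)]:
    hm = update_heatmap(hm, pt)
```

With this formulation a single `decay` parameter trades off the "real-time" and "summary" aspects: values near 0 show only the instantaneous gaze point, values near 1 approach a cumulative offline heatmap.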
Selection is a canonical task in user interfaces, commonly supported by presenting objects for acquisition by pointing. In this article, we consider motion correlation as an alternative for selection. The principle is to represent available objects by motion in the interface, have users identify a target by mimicking its specific motion, and use the correlation between the system’s output and the user’s input to determine the selection. The resulting interaction has compelling properties, as users are guided by motion feedback, and only need to copy a presented motion. Motion correlation has been explored in earlier work but has only recently begun to feature in holistic interface designs. We provide a first comprehensive review of the principle, and present an analysis of five previously published works, in which motion correlation underpinned the design of novel gaze and gesture interfaces for diverse application contexts. We derive guidelines for motion correlation algorithms, motion feedback, choice of modalities, and overall design of motion correlation interfaces, and identify opportunities and challenges for future research and design.
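The core principle, correlating each displayed object's motion with the user's input motion and selecting the best match, can be sketched as follows. This is a hypothetical minimal example, not the algorithm of any of the reviewed systems; averaging Pearson correlation over the x and y axes is one simple matching choice among several:

```python
import numpy as np

def select_by_motion(input_traj, target_trajs):
    """Pick the displayed target whose motion best matches the input.

    Each trajectory is an (N, 2) array of positions over the same time
    window. Pearson correlation is computed per axis and averaged; the
    target with the highest score is returned as the selection.
    """
    def corr(a, b):
        rx = np.corrcoef(a[:, 0], b[:, 0])[0, 1]
        ry = np.corrcoef(a[:, 1], b[:, 1])[0, 1]
        return (rx + ry) / 2

    a = np.asarray(input_traj, dtype=float)
    scores = [corr(a, np.asarray(t, dtype=float)) for t in target_trajs]
    return int(np.argmax(scores)), scores
```

In a real interface a minimum-score threshold would typically gate the selection, so that uncorrelated input selects nothing rather than the least-bad target.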
Background: Hand hygiene is one of the most effective ways of preventing health care–associated infections and reducing their transmission. Owing to recent advances in sensing technologies, electronic hand hygiene monitoring systems have been integrated into the daily routines of health care workers to measure their hand hygiene compliance and quality.
Objective: This review aims to summarize the latest technologies adopted in electronic hand hygiene monitoring systems and discuss the capabilities and limitations of these systems.
Methods: A systematic search of PubMed, ACM Digital Library, and IEEE Xplore Digital Library was performed following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Studies were initially screened and assessed independently by the 2 authors, and disagreements between them were further summarized and resolved by discussion with the senior author.
Results: In total, 1035 publications were retrieved by the search queries; of the 1035 papers, 89 (8.60%) fulfilled the eligibility criteria and were retained for review. In summary, 73 studies used electronic monitoring systems to monitor hand hygiene compliance, including application-assisted direct observation (5/73, 7%), camera-assisted observation (10/73, 14%), sensor-assisted observation (29/73, 40%), and real-time locating systems (32/73, 44%). A total of 21 studies evaluated hand hygiene quality, consisting of compliance with the World Health Organization 6-step hand hygiene techniques (14/21, 67%) and surface coverage or illumination reduction of fluorescent substances (7/21, 33%).
Conclusions: Electronic hand hygiene monitoring systems face issues of accuracy, data integration, privacy and confidentiality, usability, associated costs, and infrastructure improvements. Moreover, this review found that standardized measurement tools to evaluate system performance are lacking; thus, future research is needed to establish standardized metrics to measure performance differences among electronic hand hygiene monitoring systems. Furthermore, with sensing technologies and algorithms continually advancing, more research is needed on their implementation to improve system performance and address other hand hygiene–related issues.
Human activity recognition (HAR) is an important research area due to its potential for building context-aware interactive systems. Though movement-based activity recognition is an established area of research, recognising sedentary activities remains an open research question. Previous works have explored eye-based activity recognition as a potential approach for this challenge, focusing on statistical measures derived from eye movement properties---low-level gaze features---or some knowledge of the Areas-of-Interest (AOI) of the stimulus---high-level gaze features. In this paper, we extend this body of work by introducing mid-level gaze features: features that add a level of abstraction over low-level features with some knowledge of the activity, but not of the stimulus. We evaluated our approach on a dataset collected from 24 participants performing eight desktop computing activities. We trained a classifier on 26 low-level features derived from existing literature, extended with 24 novel candidate mid-level gaze features. Our results show an overall classification performance of 0.72 (F1-Score), with up to a 4% increase in accuracy when adding our mid-level gaze features. Finally, we discuss the implications of combining low- and mid-level gaze features, as well as future directions for eye-based activity recognition.