Figure 1: In this work we investigate thermal attacks against PINs and patterns on mobile devices. After entering PINs (a-c) or patterns (d-f) on a touch screen, a heat trace remains on the screen and can be made visible via thermal imaging.
Current digital systems are largely blind to users' cognitive states. Systems that adapt to users' states show great potential for augmenting cognition and for creating novel user experiences. However, most approaches for sensing cognitive states, and cognitive load specifically, involve obtrusive technologies, such as physiological sensors attached to users' bodies. This paper presents an unobtrusive indicator of users' cognitive load based on thermal imaging that is applicable in real-world settings. We use a commercial thermal camera to monitor changes in a person's forehead and nose temperature to estimate their cognitive load. To assess the effect of different levels of cognitive load on facial temperature, we conducted a user study with 12 participants. The study showed that different levels of the Stroop test and the complexity of reading texts affect facial temperature patterns, thereby providing a measure of cognitive load. To validate the feasibility of real-time assessment of cognitive load, we conducted a second study with 24 participants in which we analyzed the temporal latency of temperature changes. Our system detected temperature changes with an average latency of 0.7 seconds after users were exposed to a stimulus, outperforming the latency reported in related work that used other thermal imaging techniques. We provide empirical evidence showing how to unobtrusively detect changes in cognitive load in real time. Our exploration of exposing users to different content types gives rise to thermal-based activity tracking, which facilitates new applications in the field of cognition-aware computing.
CCS Concepts: • Human-centered computing → Human computer interaction (HCI); • Computing methodologies → Cognitive science; • Hardware → Displays and imagers;
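The abstract does not specify how the temperature-change latency is measured; a plausible approach is to compare each post-stimulus sample of the forehead temperature signal against a pre-stimulus baseline and report the time until the first significant deviation. The sketch below is an illustrative reconstruction, not the authors' pipeline; the function name, the sampling rate, and the k-sigma threshold are all assumptions.

```python
import numpy as np

def detect_change_latency(temps, stimulus_idx, fps=10.0, baseline_window=20, k=3.0):
    """Return seconds from stimulus onset until the temperature signal
    deviates more than k standard deviations from the pre-stimulus
    baseline, or None if no change is detected.

    temps         -- 1-D array of per-frame temperatures (e.g. forehead ROI mean)
    stimulus_idx  -- frame index at which the stimulus was shown
    fps           -- thermal camera frame rate (assumed, not from the paper)
    """
    baseline = temps[max(0, stimulus_idx - baseline_window):stimulus_idx]
    mu, sigma = baseline.mean(), baseline.std() + 1e-9  # avoid division issues
    for i in range(stimulus_idx, len(temps)):
        if abs(temps[i] - mu) > k * sigma:
            return (i - stimulus_idx) / fps
    return None
```

On a synthetic trace where the temperature drops two frames after the stimulus, this returns 0.2 s at 10 fps; the paper's reported 0.7 s average would correspond to roughly seven frames at that rate.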
In this paper, we investigate nine different visual representations of gaze in a competitive digital game setting. We evaluate the ability of spectators to infer a player's intentions in the game for each visual representation. Our results show that spectators have a remarkable ability to infer intent accurately using all nine visualizations, but that visualizations with certain characteristics were more comprehensible and more readily revealed the player's intent. The real-time Heatmap visualization was the most highly preferred by participants and the most effective in revealing intent, due to its ability to balance real-time gaze information with a persistent summary of recent gaze behaviour. Our findings show that eye-tracking visualization can enable playful interactions in competitive games based on players' ability to interpret opponents' attention and intention through gaze information.
(a) Aluminum (b) Glass (c) MDF (d) Tile
Figure 1: We explored the thermal reflectivity of different surfaces for interaction with projected surfaces. Thermal reflectivity makes it possible to sense users performing in-air gestures inside and outside the thermal camera's direct field-of-view. Four of the eight surfaces we analyzed, all of which can be found in normal office environments, are shown above.
ABSTRACT
Thermal cameras have recently drawn the attention of HCI researchers as a new sensory system enabling novel interactive systems. They are robust to illumination changes and make it easy to separate human bodies from the image background. Far-infrared radiation, however, has another characteristic that distinguishes thermal cameras from their RGB or depth counterparts: thermal reflection. Common surfaces reflect thermal radiation differently than visible light and can act as perfect thermal mirrors. In this paper, we show that, through thermal reflection, thermal cameras can sense the space beyond their direct field-of-view, including areas beside and even behind it. We investigate how thermal reflection can increase the interaction space of projected surfaces using camera-projection systems. We moreover discuss the reflection characteristics of common surfaces in our vicinity in both the visual and thermal radiation bands. Using a proof-of-concept prototype, we demonstrate the increased interaction space for hand-held camera-projection systems. Furthermore, we outline a number of promising application examples that can benefit from the thermal reflection characteristics of surfaces.
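Sensing beyond the field-of-view via a thermal mirror reduces, geometrically, to the standard mirror-image construction: a hand reflected in a flat surface appears to the camera at the point mirrored across the surface plane. The abstract does not give this math explicitly; the helper below is a generic sketch of that reflection, with all names and the example coordinates chosen for illustration.

```python
import numpy as np

def mirror_image(point, plane_point, plane_normal):
    """Reflect a 3-D point across a planar reflective surface.

    Returns the virtual (mirror-image) position at which a thermal
    camera would perceive the point when the surface acts as a
    thermal mirror: p' = p - 2 * ((p - q) . n) * n, with n the unit
    surface normal and q any point on the plane.
    """
    p = np.asarray(point, dtype=float)
    q = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)          # normalize the surface normal
    d = np.dot(p - q, n)               # signed distance to the plane
    return p - 2.0 * d * n
```

For example, a hand at (1, 0.5, 0.3) above a tabletop in the z = 0 plane appears in the reflection at (1, 0.5, -0.3); inverting the same construction recovers the real hand position from the reflected one.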
Figure 1: Thermal images of graphical passwords entered on a smartphone's touchscreen (1 and 2) and a laptop's touchpad (3 and 4) were visually inspected by participants, who recovered 60.65% of touch gestures (2 and 4), and 23.61% of touch taps (1 and 3). Attacks against touchscreens are more accurate (87.04% vs 56.02%). The red circles/arrows illustrate the user's input.
Recent work demonstrated the exciting opportunities that thermal imaging offers for the development of interactive systems. It was shown that a thermal camera can sense when a user touches a surface, performs gestures in the camera's direct field of view, and even performs gestures outside the camera's direct field of view through thermal reflection. In this paper, we investigate the material properties that should be considered when detecting interaction using thermal imaging in both indoor and outdoor settings. We conducted a study to analyze the recognition performance for different gestures and different surfaces. From the results, we derive guidelines on the material properties of surfaces for detecting on-surface as well as mid-air interaction using a thermal camera. We discuss the constraints that should be taken into account when using thermal imaging as the sensing technology. Finally, we present a material space based on our findings. The space depicts surfaces and the required properties that enable the different interaction techniques.
Despite the importance of attention in user performance, current methods for attention classification cannot discriminate between different attention types. We propose a novel method that combines thermal imaging and eye tracking to unobtrusively classify four types of attention: sustained, alternating, selective, and divided. We collected a data set in which we stimulated these four attention types in a user study (N = 22) using combinations of audio and visual stimuli while measuring users' facial temperature and eye movements. Using logistic regression on features extracted from both sensing technologies, we classify the four attention types with AUC scores of up to 75.7% for the user-independent, condition-independent case, 87% for the user-independent, condition-dependent case, and 77.4% for user-dependent prediction. Our findings not only demonstrate the potential of thermal imaging and eye tracking for unobtrusive classification of different attention types but also pave the way for novel applications in attentive user interfaces and attention-aware computing.
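The core recipe here, logistic regression over combined thermal and gaze features evaluated by AUC, can be sketched in a few lines. This is a minimal numpy-only illustration, not the authors' implementation: the feature names, toy data, and hyperparameters are invented, and the rank-sum AUC is one standard way to compute the reported metric.

```python
import numpy as np

def train_logistic(X, y, lr=0.5, epochs=500):
    """Plain batch-gradient-descent logistic regression (binary labels)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        grad = p - y                             # gradient of log-loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

def auc_score(scores, y):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability
    that a random positive example is scored above a random negative."""
    pos, neg = scores[y == 1], scores[y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

On a toy two-feature set (say, a normalized nose-temperature drop and a fixation-duration statistic, both hypothetical), the classifier separates the classes and the AUC approaches 1.0; the paper's real, noisier features yield the 75.7-87% scores quoted above.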