Practitioners in many fields of human-computer interaction (HCI) are now using physiological data to measure different aspects of user experience. The dynamic nature of physiological data offers a continuous window into the user and allows a better understanding of their experience while interacting with a system. However, to be truly informative, physiological signals need to be closely linked to users' behaviors and interaction states. This paper presents an analysis method that provides a direct visual interpretation of users' physiological signals during interaction with an interface. The proposed physiological heatmap tool combines eye-tracking data with physiological signals to identify interface regions where users most frequently experience particular emotional and cognitive states. The method was evaluated in an experiment with 44 participants. Results show that physiological heatmaps identify emotionally significant regions within an interface better than standard gaze heatmaps. Applications of the method to different fields of HCI research are also discussed.
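As a rough illustration of the idea behind such a tool, the sketch below accumulates gaze fixations into a 2D grid, weighting each fixation by a physiological measure (e.g., normalized electrodermal arousal). The function name, grid resolution, and sample data are hypothetical assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def physiological_heatmap(fixations, weights, width, height, bins=8):
    """Accumulate gaze fixations (x, y) into a 2D grid, weighting each
    fixation by a physiological signal; a plain gaze heatmap is the
    special case where every weight is 1."""
    xs = np.array([x for x, _ in fixations], dtype=float)
    ys = np.array([y for _, y in fixations], dtype=float)
    grid, _, _ = np.histogram2d(
        xs, ys, bins=bins,
        range=[[0, width], [0, height]],
        weights=np.asarray(weights, dtype=float),
    )
    return grid  # smooth and overlay on a screenshot for display

# Hypothetical fixations on an 800x600 screen with arousal weights
fix = [(100, 100), (105, 98), (400, 300)]
w = [0.9, 0.8, 0.1]
hm = physiological_heatmap(fix, w, width=800, height=600)
```

With all weights set to 1 the same function yields an ordinary gaze heatmap, which makes the comparison reported in the paper straightforward to set up.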
The treadmill desk is a new human-computer interaction (HCI) setup intended to reduce the time workers spend sitting. As most workers will not choose to spend their entire workday walking, this study investigated the short-term delayed effect of treadmill desk usage. An experiment was conducted in which participants either sat or walked while they read a text and received emails. Afterward, all participants performed a task to evaluate their attention and memory. Behavioral, neurophysiological, and perceptual evidence showed that participants who walked had a short-term increase in memory and attention, indicating that the use of a treadmill desk has a delayed effect. These findings suggest that the treadmill desk, in addition to having health benefits for workers, can also be beneficial for businesses by enhancing workforce performance.
In a recent theoretical synthesis on the concept of engagement, Fredricks, Blumenfeld, and Paris [1] defined engagement by its multiple dimensions: behavioral, emotional, and cognitive. They observed that the individual types of engagement had not been studied in conjunction, and that little information was available about interactions or synergy between the dimensions; consequently, more studies would contribute to creating finely tuned teaching interventions. Benefiting from recent technological advances in neuroscience, this paper presents a recently developed methodology to gather and synchronize data on multidimensional engagement during learning tasks. The technique involves the collection of (a) electroencephalography, (b) electrodermal, (c) eye-tracking, and (d) facial emotion recognition data on four different computers, which raises synchronization issues for data collected from multiple sources. Post-hoc synchronization in specialized integration software gives researchers a better understanding of the dynamics between the multiple dimensions of engagement. For curriculum developers, these data could provide informed guidelines for achieving better instruction/learning efficiency. This technique also opens up possibilities in the field of brain-computer interaction, where adaptive learning or assessment environments could be developed.
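The core of such post-hoc synchronization can be sketched as resampling every stream onto one shared clock. The function name, stream names, and fixed resampling rate below are illustrative assumptions; they assume timestamps have already been brought into a common time base (e.g., after correcting clock offsets between the four computers).

```python
import numpy as np

def align_to_common_clock(streams, rate=10.0):
    """Resample named data streams recorded on different machines onto
    one shared clock by linear interpolation. Each stream is a
    (timestamps, values) pair expressed in a shared time base."""
    start = max(t[0] for t, _ in streams.values())   # latest common start
    stop = min(t[-1] for t, _ in streams.values())   # earliest common end
    clock = np.arange(start, stop, 1.0 / rate)
    return clock, {name: np.interp(clock, t, v)
                   for name, (t, v) in streams.items()}

# Hypothetical streams: EDA on one machine, an EEG feature on another
streams = {
    "eda": (np.array([0.0, 1.0, 2.0]), np.array([0.0, 2.0, 4.0])),
    "eeg": (np.array([0.5, 1.5, 2.5]), np.array([1.0, 1.0, 1.0])),
}
clock, aligned = align_to_common_clock(streams, rate=2.0)
```

Restricting the shared clock to the overlap of all streams avoids extrapolating any signal beyond what was actually recorded.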
This paper introduces the eye-fixation related potential (EFRP) method to IS research. The EFRP method synchronizes eye tracking with electroencephalographic (EEG) recording to precisely capture users' neural activity at the exact time at which they start to cognitively process a stimulus (e.g., an event on the screen). This complements and overcomes some of the shortcomings of the traditional event-related potential (ERP) method, which can only stamp the time at which a stimulus is presented to a user. Thus, we propose a method conjecture: EFRP is superior to ERP for capturing the cognitive processing of a stimulus when that processing is not necessarily synchronized with the time at which the stimulus appears. We illustrate the EFRP method with an experiment in a natural IS use context in which we asked users to read an industry report while email pop-up notifications arrived on their screen. The results support our proposed hypotheses and show three distinct neural processes associated with (1) the attentional reaction to the email pop-up notification, (2) the cognitive processing of the email pop-up notification, and (3) the motor planning activity involved in opening the email or not. Further analyses of the data gathered in the experiment validate our method conjecture about the superiority of the EFRP method over the ERP method in natural IS use contexts. In addition to the experiment, our study discusses important IS research questions that could be pursued with the aid of EFRP and describes a set of guidelines to help IS researchers use this method.
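The mechanical difference between EFRP and ERP can be sketched as a change of epoch anchor: segments of continuous EEG are extracted around fixation onsets rather than stimulus onsets. The function name, sampling rate, epoch window, and sample data below are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np

def fixation_locked_epochs(eeg, srate, fixation_onsets, tmin=-0.2, tmax=0.6):
    """Slice continuous EEG (channels x samples) into epochs time-locked
    to eye-fixation onsets (in seconds): the defining move of EFRP,
    where the epoch anchor is the fixation, not the stimulus onset."""
    pre = int(round(-tmin * srate))
    post = int(round(tmax * srate))
    epochs = []
    for onset in fixation_onsets:
        i = int(round(onset * srate))
        if i - pre < 0 or i + post > eeg.shape[1]:
            continue  # skip fixations too close to the recording edges
        seg = eeg[:, i - pre:i + post].astype(float)
        seg -= seg[:, :pre].mean(axis=1, keepdims=True)  # baseline correction
        epochs.append(seg)
    return np.stack(epochs)  # (n_epochs, n_channels, n_samples)

# Hypothetical data: 2-channel recording at 100 Hz, fixations at 3 s and 5 s
eeg = np.ones((2, 1000))
ep = fixation_locked_epochs(eeg, srate=100, fixation_onsets=[3.0, 5.0])
```

An ERP analysis would use the same slicing with stimulus-onset times in place of fixation onsets; averaging the resulting epochs then yields the fixation-locked waveform.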
The need for intelligent HCI has been reinforced by the increasing number of human-centered applications in our daily life. However, in order to respond adequately, intelligent applications must first interpret users' actions. Identifying the context in which users' interactions occur is an important step toward automatic interpretation of behavior. To address part of this context-sensing problem, we propose a generic, application-independent framework for activity recognition of users interacting with a computer interface. Our approach uses Layered Hidden Markov Models (LHMMs) and is based on eye-gaze movements along with keyboard and mouse interactions. The main contribution of the proposed framework is the ability to relate users' interactions to a task model across different applications and for different monitoring purposes. Experimental results from two user studies show that our activity recognition technique achieves good predictive accuracy with a relatively small amount of training data.
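The single-layer building block of an LHMM can be sketched with standard Viterbi decoding over discretized interaction events; in a full LHMM, the inferred state sequence of one layer becomes the observation sequence of the layer above. The states, observation coding, and probabilities below are hypothetical, not values from the studies.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden state sequence for a discrete-observation HMM.
    obs: observation indices; pi: initial probs (N,);
    A: transition probs (N, N); B: emission probs (N, M)."""
    T, N = len(obs), len(pi)
    delta = np.log(pi) + np.log(B[:, obs[0]])   # best log-prob per state
    psi = np.zeros((T, N), dtype=int)           # backpointers
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)     # scores[i, j]: i -> j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):               # backtrack
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Hypothetical setup: states 0=reading, 1=typing;
# observations 0=gaze-on-text, 1=keystroke
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
path = viterbi([0, 0, 1, 1, 1], pi, A, B)
```

The sticky transition matrix (high self-transition probability) is what lets the model smooth over isolated, ambiguous events instead of switching activity on every observation.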