Abstract: Emotions are one of the unique aspects of human nature and, sadly, at the same time one of the elements that our technological world fails to capture and consider, due to their subtlety and inherent complexity. But with the current dawn of new technologies that enable the interpretation of emotional states based on techniques involving facial expressions, speech and intonation, electrodermal response (EDR), and brain-computer interfaces (BCIs), we are finally able to access real-time user emotions in various…
“…Creating emotion visualizations is a well-investigated research topic in HCI. Researchers have worked with various media for presenting visualizations: integrated into apps [5,13,21], wearables [1,20], and fashion pieces [30,32], using both 2D [5,13,21] and 3D [3,19,23,28,31,33] visualizations.…”
Section: Emotion Visualization
“…In most cases, visualizations are static, representing one emotional state at a time, and have contextual associations (e.g., UI elements [5], location and time [13], digital behavior such as tasks [21]). The focus on real-time emotion visualization is limited.…”
Section: Emotion Visualization
“…However, how affective information should be visualized in AR is not as clear. There is a variety of work on emotion representation in the HCI literature (some examples are [3,31,37]), designed to be integrated with existing applications and 2D interfaces [5] or conceptualized as wearables [1], but not targeted at AR. This paper focuses on the design space of expressive emotion visualizations that can appear around a person's body as they experience the referenced emotional states.…”
Figure 1: An example visualization generated by the presented online tool, featuring silhouettes of three people and dots above them representing the group's emotional state, assuming that their emotional states can be detected from their behavior or physiological responses and synchronously visualized.
“…The widgets employed emotion scents: hue-varied colormaps representing either valence or arousal, e.g., red and green representing negative and positive valence, respectively. Emotion-prints was an early system that provided real-time feedback of valence and arousal to users on touch displays [10]. More recently, Kovacevik et al [47] employed ideas from SAM and emotion scents to create a glyph for the simultaneous representation of valence and arousal.…”
Section: Affective Computing in Visualization
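The red-to-green valence mapping described in the snippet above is easy to prototype. Below is a minimal Python/matplotlib sketch, assuming valence scores normalized to [-1, 1]; the colormap endpoints and the `valence_to_color` helper are illustrative choices, not the exact encoding of the cited widgets.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LinearSegmentedColormap

# Emotion-scents-style hue mapping (assumed): red for negative valence,
# grey for neutral, green for positive valence.
valence_cmap = LinearSegmentedColormap.from_list(
    "valence", ["#c0392b", "#bdbdbd", "#27ae60"]
)

def valence_to_color(valence: float):
    """Map a valence score in [-1, 1] to an RGBA color."""
    t = (np.clip(valence, -1.0, 1.0) + 1.0) / 2.0  # rescale to [0, 1]
    return valence_cmap(t)

# Example: render a strip of valence samples as colored markers.
samples = np.linspace(-1, 1, 11)
plt.scatter(samples, np.zeros_like(samples),
            c=[valence_to_color(v) for v in samples], s=200)
plt.yticks([])
plt.xlabel("valence")
plt.show()
```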
Fig. 1. Our affective computing visualization provides numerous options for comparing and contrasting the data. For example, the small multiples view (left) compares a male (left) subject to a female (right) subject considering different subsets of facial landmarks (columns), across emotions (rows) of anger, disgust, fear, happiness, sadness, and surprise. Each point represents one facial pose. The embedding graph (top right) compares all facial poses across all emotions, in this case showing the female full face using MDS on non-metric topology. The 3D landmarks (lower right) show a single facial pose per emotion. Settings are selected at the bottom.
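The embedding-graph view pairs naturally with off-the-shelf tooling. The sketch below uses synthetic stand-in data rather than the paper's pipeline: it runs non-metric MDS on precomputed pairwise distances between flattened facial-landmark vectors to obtain one 2D point per facial pose.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Hypothetical facial-pose data: each row is one pose, columns are
# flattened 3D coordinates of 68 landmarks (numbers are placeholders).
rng = np.random.default_rng(1)
poses = rng.normal(size=(60, 3 * 68))

# Non-metric MDS on precomputed pairwise distances, loosely mirroring
# the "MDS on non-metric topology" view from the caption above.
dists = squareform(pdist(poses))
embedding = MDS(n_components=2, metric=False,
                dissimilarity="precomputed",
                random_state=0).fit_transform(dists)
print(embedding.shape)  # (60, 2): one 2D point per facial pose
```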
“…In terms of real-time visualization of emotions, Cernea et al [7] introduced Emotion-prints, a visualization system for touch-enabled interfaces, where the current level of a user's valence and arousal (estimated through EEG measurements) is represented as an animated halo around the virtual objects that the user touches. Saari et al [29] introduced a mobile emotion visualization system devised to improve group performance and awareness.…”
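The halo encoding from the snippet above is straightforward to parameterize. The helper below is a hypothetical sketch, not Cernea et al.'s implementation: it assumes valence in [-1, 1] picks a hue between red and green, while arousal in [0, 1] controls how quickly the halo radius pulses.

```python
import math

def halo_params(valence: float, arousal: float, t: float):
    """Return (hue_degrees, radius_px) for an animated halo at time t.

    Assumed encoding: valence in [-1, 1] maps to hue 0 (red) .. 120
    (green); arousal in [0, 1] scales the pulsing frequency.
    """
    hue = (valence + 1.0) / 2.0 * 120.0
    base_radius = 30.0  # pixels, arbitrary baseline size
    frequency = 0.5 + 2.0 * arousal  # pulses per second
    pulse = 1.0 + 0.3 * math.sin(2.0 * math.pi * frequency * t)
    return hue, base_radius * pulse

# Example: a calm, positive state pulses slowly in green-ish hues.
print(halo_params(valence=0.8, arousal=0.2, t=0.25))
```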
Figure 1: Overview of our visualization applied to a synthetic data set: each circular glyph corresponds to a single user, its concentric rings encode the baseline and current excitement values, and the nested dots provide detailed information about time-varying measurement values. Here, the oscillating-trail encoding is used for the nested dots.
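To make the glyph encoding concrete, here is a minimal matplotlib sketch under assumed conventions: baseline and current excitement values in [0, 1] scale the radii of two concentric rings, and recent measurements are placed as nested dots along a spiral-like trail. The function name `draw_user_glyph` and all visual constants are illustrative, not taken from the figure's implementation.

```python
import numpy as np
import matplotlib.pyplot as plt

def draw_user_glyph(ax, center, baseline, current, history):
    """Draw one per-user glyph: two concentric rings (baseline and
    current excitement) plus nested dots for recent measurements."""
    cx, cy = center
    for value, color in [(baseline, "#888888"), (current, "#d35400")]:
        ring = plt.Circle((cx, cy), 0.2 + 0.8 * value,
                          fill=False, color=color, linewidth=2)
        ax.add_patch(ring)
    for i, v in enumerate(history):  # oscillating trail of samples
        angle = 0.6 * i
        r = 0.15 + 0.5 * v
        ax.plot(cx + r * np.cos(angle), cy + r * np.sin(angle),
                "o", color="#2980b9", markersize=4)

fig, ax = plt.subplots()
rng = np.random.default_rng(0)
draw_user_glyph(ax, (0, 0), baseline=0.4, current=0.7,
                history=rng.uniform(0, 1, 12))
ax.set_aspect("equal")
ax.set_xlim(-1.2, 1.2)
ax.set_ylim(-1.2, 1.2)
ax.axis("off")
plt.show()
```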