2020
DOI: 10.3390/s21010052

CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors

Abstract: Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most works either classify a single emotion per video stimulus, or are restricted to static, desktop environments. To address this, we propose a correlation-based emotion recognition algorithm (CorrNet) to recognize the valence and arousal (V-A) of each instance (fine-grained segment of signals) using only wearable, physiological signals (e.g., …
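The abstract's "instance" denotes a fine-grained segment of the physiological signal stream. As a rough illustration of what instance-level segmentation could look like, here is a minimal sketch in Python; the window length, sampling rate, and signal name are illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def segment_into_instances(signal: np.ndarray, fs: int, window_s: float = 2.0) -> np.ndarray:
    """Split a 1-D physiological signal into non-overlapping, fixed-length
    instances (fine-grained segments). The window length is a hypothetical
    choice for illustration, not the paper's setting.

    signal   : raw samples, shape (n_samples,)
    fs       : sampling rate in Hz
    window_s : instance length in seconds
    """
    win = int(fs * window_s)
    n_instances = len(signal) // win
    # Drop the trailing partial window and reshape into (n_instances, win)
    return signal[: n_instances * win].reshape(n_instances, win)

# Example: 60 s of a 4 Hz electrodermal activity (EDA) stream
eda = np.random.randn(60 * 4)
instances = segment_into_instances(eda, fs=4, window_s=2.0)
print(instances.shape)  # (30, 8) -> 30 instances, each receiving its own V-A label
```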

Cited by 40 publications (43 citation statements)
References 109 publications (143 reference statements)
“…Out of the studies in this research area, the autoencoder is a widely used technique for unsupervised representation learning. The recently published CorrNet [16] uses autoencoder-based automatic feature extraction in a wearable-signal emotion recognition task and outperforms the state-of-the-art baseline on the CASE dataset for arousal (74.03%) and valence (76.37%) detection. Martinez et al. [13] deploy a denoising autoencoder network to learn features from blood volume pulse (BVP) and electrodermal activity (EDA) signals and reuse the learned features to classify affective states.…”
Section: A. Feature Engineering and Representation Learning for Emotion…
mentioning confidence: 99%
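For readers unfamiliar with the approach this excerpt describes, the sketch below shows generic autoencoder-based representation learning over fixed-length signal segments: the network is trained without labels using a reconstruction loss, and the encoder's latent vector is then reused as a feature for downstream affect classification. This is not the CorrNet or Martinez et al. architecture; the layer sizes, segment length, and training loop are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SegmentAutoencoder(nn.Module):
    """Minimal fully connected autoencoder over fixed-length signal segments.
    A generic sketch of autoencoder-based representation learning; dimensions
    are illustrative assumptions, not the cited architectures."""

    def __init__(self, segment_len: int = 128, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(segment_len, 64), nn.ReLU(),
            nn.Linear(64, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.ReLU(),
            nn.Linear(64, segment_len),
        )

    def forward(self, x):
        z = self.encoder(x)          # learned features, reusable for classification
        return self.decoder(z), z

# Unsupervised training on unlabeled segments (reconstruction loss only)
model = SegmentAutoencoder()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
segments = torch.randn(256, 128)     # e.g. standardized BVP/EDA segments (synthetic here)
for _ in range(5):
    recon, _ = model(segments)
    loss = nn.functional.mse_loss(recon, segments)
    optim.zero_grad(); loss.backward(); optim.step()
# The encoder's latent z would then feed a downstream arousal/valence classifier.
```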
“…However, in the literature [16], researchers have binned the nine levels into two-class and three-class configurations for evaluation. For comparison purposes, we follow the class configuration proposed by Zhang et al. [16] in our evaluations. We use the CASE dataset for signal representation learning as well as for the emotion recognition tasks.…”
1 https://www.empatica.com/research/e4/
2 https://wearabletech.io/zephyr-bioharness-3/
Section: Experimental Setup, A. Datasets
mentioning confidence: 99%
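A minimal sketch of the level-binning the excerpt mentions: mapping a nine-level valence/arousal annotation onto two-class or three-class labels. The specific thresholds below are illustrative assumptions, not the splits used by Zhang et al. [16] or the citing work.

```python
def bin_level(level: float, n_classes: int) -> int:
    """Map a 1-9 valence/arousal annotation level to a coarse class.
    The midpoint split for two classes and the even three-way split
    are illustrative assumptions only."""
    if n_classes == 2:
        return 0 if level < 5 else 1            # low vs. high
    if n_classes == 3:
        if level <= 3:
            return 0                            # low
        return 1 if level <= 6 else 2           # medium / high
    raise ValueError("n_classes must be 2 or 3")

print([bin_level(v, 2) for v in (1, 5, 9)])  # [0, 1, 1]
print([bin_level(v, 3) for v in (2, 5, 8)])  # [0, 1, 2]
```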
“…Nowadays, there are even systems that perform the analysis of biomarkers from sweat and saliva directly during sports activities [211]. The current literature on stress assessment with wearable multi-sensors in natural environments includes: emotion recognition by neural networks from a portable eye-tracker and the Empatica E4 [212], autonomic nervous system (ANS) research again using the E4, now with ECG and respiration sensors [213], development of a cognitive-load tracker using machine learning [109], a smart stress-reduction system using the E4 combined with accelerometers [214], validation of wireless sensors for psychophysiological studies and stress detection [100, 215], prediction of relative physical activity [216], real-time monitoring of passenger psychological stress [147], classification of calm/distress conditions [217], assessment of the mental stress of fighters [218], and others. A comprehensive overview of pain and stress detection using available wearable sensors was recently provided by Jerry Chen et al. [150].…”
Section: Advanced Wearable Stress-Meters
mentioning confidence: 99%
“…These responses can be measured using physiological sensors, thus giving us an objective window into the realm of emotions. In recent years, advances in Deep Learning (DL) techniques have enabled researchers to directly model the mappings between physiological signals (e.g., galvanic skin response (GSR) and electrocardiogram (ECG)) and human emotions [5,6], without resorting to manually crafted features that require expert knowledge.…”
Section: Introduction
mentioning confidence: 99%
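As a concrete, deliberately simplified picture of the signal-to-emotion mapping described in that excerpt, the sketch below feeds a short two-channel window of GSR and ECG samples through a tiny 1-D CNN to produce emotion-class logits. The channel count, window length, architecture, and number of classes are illustrative assumptions rather than details from the cited works [5,6].

```python
import torch
import torch.nn as nn

class SignalEmotionNet(nn.Module):
    """Tiny 1-D CNN mapping a window of GSR/ECG samples to emotion-class logits.
    A generic sketch of the signal-to-emotion mapping, with hypothetical sizes."""

    def __init__(self, in_channels: int = 2, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                       # x: (batch, channels, window_len)
        return self.classifier(self.features(x).squeeze(-1))

# One forward pass on a batch of 8 two-channel (GSR + ECG) windows of 256 samples
logits = SignalEmotionNet()(torch.randn(8, 2, 256))
print(logits.shape)  # torch.Size([8, 2])
```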