2023
DOI: 10.1109/tnsre.2023.3253866
An Investigation of Olfactory-Enhanced Video on EEG-Based Emotion Recognition

Abstract: Collecting emotional physiological signals is important for building affective Human-Computer Interaction (HCI) systems. However, efficiently evoking subjects' emotions in EEG-related emotional experiments remains a challenge. In this work, we developed a novel experimental paradigm that allows odors to participate dynamically in different stages of video-evoked emotions, in order to investigate how effectively olfactory-enhanced videos induce subjects' emotions. According to the period in which the odors participated …

Cited by 27 publications (10 citation statements)
References 51 publications
“…Multiple human senses can be stimulated to develop emotions through the use of audio–visual information employed in multisensory media studies. The examination of facial expressions or neuro-physiological signals has been the primary focus of databases for the research of affect recognition based on visual modalities [ 22 , 23 , 24 , 25 , 26 ]. Yet, despite the fact that eye movements have been shown to be valuable indicators of affective response [ 18 ], few researchers have concentrated on the creation of relevant databases.…”
Section: Eye-Tracking Databases for Emotion Recognition (mentioning, confidence: 99%)
“…Human emotions can be expressed through speech ( Zhao et al, 2021 ), text ( Wang et al, 2020b ), gestures ( Li Y. -K. et al, 2022 ), and physiological signals ( Wu et al, 2023 ), but facial expressions most intuitively reflect human emotions. The research has found that people would intentionally display certain facial expressions in certain situations, yet when people try to hide their facial expressions in high-stakes situations, it is necessary to interpret facial micro-expressions to determine their true emotional state ( Ekman and Friesen, 1969 ).…”
Section: Introduction (mentioning, confidence: 99%)
“…Chakravarthi et al ( 2022 ) proposed an automated CNN-LSTM with the ResNet-152 algorithm to identify emotional states from EEG signals. Additionally, Wu et al ( 2023 ) developed a novel experimental paradigm that allows odors to participate dynamically in different stages of video-evoked emotions, in order to investigate how effectively olfactory-enhanced videos induce subjects' emotions. Thus, understanding the emotions expressed in an utterance requires a comprehensive understanding of various modalities.…”
Section: Introduction (mentioning, confidence: 99%)