2022
DOI: 10.1109/tcss.2021.3127935
CogEmoNet: A Cognitive-Feature-Augmented Driver Emotion Recognition Model for Smart Cockpit

Abstract: Driver emotion recognition is vital to improving driving safety, comfort, and acceptance of intelligent vehicles. This paper presents a cognitive-feature-augmented driver emotion detection method based on emotional cognitive process theory and deep networks. Unlike traditional methods, both the driver's facial expression and cognitive process characteristics (age, gender, and driving age) were used as inputs to the proposed model. Convolutional techniques were adopted to construct th…
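The abstract describes fusing convolutional facial-expression features with cognitive attributes (age, gender, driving age) as joint model inputs. A minimal sketch of that idea follows; the paper's actual architecture, layer sizes, and fusion scheme are not given here, so everything below (the naive convolution, global-average pooling, simple concatenation, and a linear 7-class head) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive 'valid' 2-D cross-correlation, for illustration only."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def extract_face_features(face, kernels):
    """Convolve, apply ReLU, then global-average-pool: one scalar per filter."""
    return np.array([np.maximum(conv2d_valid(face, k), 0).mean() for k in kernels])

# Toy inputs: a 16x16 grayscale face crop and normalized cognitive attributes.
face = rng.random((16, 16))
cognitive = np.array([0.3, 1.0, 0.1])     # [age, gender, driving age], scaled to [0, 1]

kernels = rng.standard_normal((4, 3, 3))  # 4 random 3x3 conv filters (untrained)
fused = np.concatenate([extract_face_features(face, kernels), cognitive])

# Linear head over the fused vector, softmax over 7 emotion classes
# (seven emotions matches the induction experiment quoted below).
W = rng.standard_normal((7, fused.size))
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)  # (7,)
```

In a trained system the random filters and head would be learned end to end; the point of the sketch is only the early-fusion step, where the pooled facial features and the demographic vector are concatenated before classification.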

Cited by 54 publications (17 citation statements)
References 46 publications
“…The emotions of human drivers need to be induced by appropriate stimuli to collect emotion data. Video-audio clips have been proven to reliably trigger the emotions of human drivers [6], [34], [35]. Based on the results of the questionnaire survey, we manually selected the seven most effective video-audio clips (those chosen by the highest percentage of participants in each emotional scenario) on the Bilibili website ( https://www.bilibili.com/ ) to induce the corresponding emotions of the human driver in Experiment III.…”
Section: Methods
confidence: 99%
“…Moreover, by applying various machine learning techniques to the collected driving behaviour, EEG, facial expressions, driving posture, and road scene information, the dataset can be used to develop single-modal/multi-modal driver emotion monitoring algorithms [35], [50]. Accurate and efficient emotion monitoring algorithms will support emotion-aware interaction between human drivers and intelligent vehicles, improving driving safety and comfort and increasing human trust in machines [46], [51].…”
Section: Usage Notes
confidence: 99%
“…In the intelligent cockpit, perceiving, understanding, and making decisions about the driver's anger, as well as regulating it through different interaction methods, is an effective way to reduce the driving risk caused by anger and thereby improve road traffic safety. Benefiting from the rapid development of technologies such as artificial intelligence, existing research has made great progress in driver anger detection and in the development of different interaction technologies [34], [35], [36]. However, few studies have focused on what decisions the intelligent cockpit should make after recognizing anger, or on generating corresponding strategies to regulate driver anger effectively.…”
Section: Intelligent Cockpit Perception
confidence: 99%
“…1) Anger Induction Material: Previous research has found that video clips are an effective method for driver emotion induction [33], [36], [58]. Therefore, selected video clips were played to stimulate the participants' anger.…”
Section: Materials and Apparatus
confidence: 99%
“…Wenbo et al. [54] investigated driver anger regulation via visual attributes. The driver's facial expression mirrors the driver's emotional state [55]; it is therefore important for emotion recognition and for ensuring driving safety.…”
Section: Driver Facial Expression Recognition
confidence: 99%