2022
DOI: 10.1038/s41598-022-14808-4

Classification of emotional states via transdermal cardiovascular spatiotemporal facial patterns using multispectral face videos

Abstract: We describe a new method for remote emotional state assessment using multispectral face videos, and present our findings: unique transdermal, cardiovascular and spatiotemporal facial patterns associated with different emotional states. The method does not rely on stereotypical facial expressions but utilizes different wavelength sensitivities (visible spectrum, near-infrared, and long-wave infrared) to gauge correlates of autonomic nervous system activity spatially and temporally distributed across the human f…

Cited by 4 publications (25 citation statements)
References 39 publications (53 reference statements)
“…Following that, in recent years, research has attempted to develop novel methods for the assessment of a person’s emotional state based on rPPG, mainly from visual cameras (VIS) [ 17 , 18 , 19 , 20 , 21 ]; this approach eliminates the need for uncomfortable sensors and offers the unique advantage of enabling spatial physiological measurement and visualization of the peripheral signals using only one sensor. However, the majority of studies in which PPG signals are extracted from facial video recordings involve averaging all signals from the entire face or specific predefined regions [ 18 , 19 , 20 , 21 ].…”
Section: Introduction
confidence: 99%
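The region-averaging limitation described in the excerpt above can be illustrated with a minimal sketch: every pixel inside a predefined facial region of interest is collapsed into a single sample per frame, so any spatial variation within the region is lost. The function name `roi_mean_signal`, the ROI convention, and the toy frame data are illustrative assumptions, not taken from the cited work.

```python
# Minimal sketch of region-averaged rPPG extraction, assuming frames are
# 2-D grids of pixel intensities and the ROI is a rectangular bound.
# All pixels in the ROI are averaged into one sample per frame, which is
# exactly the spatial collapse the excerpt criticizes.

def roi_mean_signal(frames, roi):
    """frames: list of 2-D grids (lists of rows of pixel intensities);
    roi: (top, bottom, left, right) half-open bounds.
    Returns one averaged intensity sample per frame."""
    top, bottom, left, right = roi
    signal = []
    for frame in frames:
        pixels = [frame[r][c] for r in range(top, bottom)
                              for c in range(left, right)]
        signal.append(sum(pixels) / len(pixels))
    return signal

# Toy example: two uniform 4x4 "frames"; a 2x2 forehead patch is averaged.
frames = [
    [[10] * 4 for _ in range(4)],
    [[12] * 4 for _ in range(4)],
]
print(roi_mean_signal(frames, (0, 2, 0, 2)))  # → [10.0, 12.0]
```

In a real pipeline the per-frame means would then be detrended and band-pass filtered around plausible heart-rate frequencies before pulse estimation; that step is omitted here for brevity.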
“…Recognizing this limitation, noted in a recent work [ 17 ], we proposed a machine learning model to predict emotional states based on physiological changes across the face, which can be related to spatial rPPG methods. This research extracted transdermal spatiotemporal features based on the maximum and minimum values of the spatial rPPG signal obtained from three cameras, covering the visual spectrum range, the near-infrared (NIR) range, and the long-wave infrared (LWIR) thermal range.…”
Section: Introduction
confidence: 99%
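The extremum-based feature extraction described in the excerpt above can be sketched as follows. The cited work derives features from the maximum and minimum of the spatial rPPG signal in each spectral band (VIS, NIR, LWIR); the exact feature set is not given in the excerpt, so the function `extrema_features`, the peak-to-peak term, and the sample signal values below are illustrative assumptions only.

```python
# Hedged sketch: reduce each spectral band's spatial rPPG signal to its
# extrema, one plausible form of the max/min features the excerpt mentions.

def extrema_features(band_signals):
    """band_signals: dict mapping band name ('VIS', 'NIR', 'LWIR') to a
    list of signal samples. Returns per-band (max, min, peak-to-peak)."""
    features = {}
    for band, sig in band_signals.items():
        hi, lo = max(sig), min(sig)
        features[band] = (hi, lo, hi - lo)
    return features

# Toy signals for the three camera bands named in the excerpt.
signals = {"VIS": [2, 9, 4], "NIR": [1, 5, 3], "LWIR": [7, 6, 8]}
print(extrema_features(signals))
# → {'VIS': (9, 2, 7), 'NIR': (5, 1, 4), 'LWIR': (8, 6, 2)}
```

A downstream classifier would concatenate such per-band features into one vector per subject or time window; that stage is outside the scope of this sketch.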
“…Variations in facial expressions result from the conscious and subconscious processing of various internal and external stimuli, including any contextual biases and the interaction between past and present experiences [2]. Patterns of facial muscle movements provide reliable models for automated recognition of human affective states [3]- [5]. Several complex and difficult-to-identify facial expressions of affective states have also been modelled [5], [6].…”
Section: Introduction
confidence: 99%
“…Humans understand the dynamic and continuous nature of emotions and affective states using their collective experiences [3]. Given the complexities of emotion elicitation and the psychology behind it, one cannot simply look at emotions as static occurrences in time [5], [7].…”
Section: Introduction
confidence: 99%