2021
DOI: 10.1007/s10055-021-00506-5

Automatic detection and classification of emotional states in virtual reality and standard environments (LCD): comparing valence and arousal of induced emotions

Abstract: The following case study was carried out on a sample of one experimental and one control group. The participants of the experimental group watched the movie section from the standardized LATEMO-E database via virtual reality (VR) on Oculus Rift S and HTC Vive Pro devices. In the control group, the movie section was displayed on the LCD monitor. The movie section was categorized according to Ekman's and Russell's classification model of evoking an emotional state. The range of valence and arousal was determined…
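For orientation, here is a minimal sketch of how valence and arousal ratings can be mapped onto the quadrants of Russell's circumplex model referenced in the abstract; the 1–9 rating scale, the 5.0 midpoint, and the quadrant labels are illustrative assumptions, not the study's protocol.

# Hedged sketch: valence/arousal -> Russell circumplex quadrant.
# The 1-9 scale and the 5.0 midpoint are assumptions for illustration.

def classify_quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Return the circumplex quadrant label for one rating pair."""
    if valence >= midpoint and arousal >= midpoint:
        return "high valence / high arousal (e.g., excited)"
    if valence >= midpoint:
        return "high valence / low arousal (e.g., calm)"
    if arousal >= midpoint:
        return "low valence / high arousal (e.g., afraid)"
    return "low valence / low arousal (e.g., sad)"

print(classify_quadrant(7.2, 8.1))  # -> high valence / high arousal (e.g., excited)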

Cited by 18 publications (4 citation statements)
References 56 publications
“…When categorized by color, there were some negative words like "prison" for the black virtual environment, "horror" for the red virtual environment, and "depressed" for the green virtual environment. This kind of negative association observed with certain ambient colors may be in line with some recent studies that have explored negative emotions in VR (Lavoie et al., 2021; Magdin et al., 2021). These studies showed that virtual environments involving a higher level of absorption may increase negative emotional responses.…”
Section: Figure 10 (supporting)
confidence: 88%
“…Non-invasive psychophysical tools give researchers an opportunity to measure a person’s emotional states [17], attention [18], or cognitive load in real time. Recently, affordable, high-resolution eye-tracking devices have been developed to record eye movement parameters and support eye-tracking-based research [19].…”
Section: Theoretical Background (mentioning)
confidence: 99%
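As a concrete illustration of the eye-movement parameters such trackers record, the following is a minimal sketch of the classic dispersion-threshold (I-DT) fixation detector; the thresholds, sample format, and function name are assumptions, not taken from the cited work.

# Hedged sketch of the dispersion-threshold (I-DT) fixation detector.
# Thresholds and the (x, y) gaze format are illustrative assumptions.

def detect_fixations(gaze, max_dispersion=1.0, min_samples=6):
    """Yield (start, end) index pairs of fixations in a gaze trace.

    gaze: list of (x, y) positions in degrees of visual angle.
    max_dispersion: (max_x - min_x) + (max_y - min_y) limit in degrees.
    min_samples: minimum window length (e.g., 100 ms at 60 Hz).
    """
    def dispersion(window):
        xs, ys = zip(*window)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    i = 0
    while i + min_samples <= len(gaze):
        j = i + min_samples
        if dispersion(gaze[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while j < len(gaze) and dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            yield (i, j - 1)
            i = j
        else:
            i += 1

# Usage: list(detect_fixations([(0.10, 0.20), (0.12, 0.19), ...]))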
“…Data collection could include cameras inside and outside the headset to capture facial expressions (potentially with eye tracking), body movements, and body-language cues [16], [18]. Further data could come from an EEG sensor capturing brain waves, plus sensors for skin temperature, galvanic skin response (GSR), speech, eye tracking, and heart rate [13], [18], [19], [20]. Speech can be captured and transmitted, but patterns in how things are said could give a clearer picture of the speaker's emotional state.…”
Section: Our Approach (mentioning)
confidence: 99%
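To make the multimodal collection concrete, here is a hedged sketch of a timestamped record that could hold the signals listed above (EEG, GSR, skin temperature, heart rate, gaze); all field names and units are illustrative assumptions, not the cited authors' schema.

# Hedged sketch: one timestamped frame of fused multimodal signals.
# Field names and units are assumptions for illustration only.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class SensorFrame:
    timestamp_ms: int                             # shared clock for alignment
    eeg_uV: Optional[List[float]] = None          # per-channel microvolts
    gsr_uS: Optional[float] = None                # skin conductance, microsiemens
    skin_temp_c: Optional[float] = None           # degrees Celsius
    heart_rate_bpm: Optional[float] = None
    gaze_xy: Optional[Tuple[float, float]] = None # normalized screen coordinates

frame = SensorFrame(timestamp_ms=120_533, gsr_uS=4.7, heart_rate_bpm=82.0)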
“…We will use machine-learning techniques to process the data we collect on both parties and recreate each party visually (as an avatar), along with the feelings associated with touch [18], [22]. The idea is to produce an avatar that matches the emotional state of the person the data was collected from, without any explicit input from that person.…”
Section: Our Approach (mentioning)
confidence: 99%
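Finally, a hedged sketch of the kind of supervised pipeline that could map fused sensor features to a discrete emotional state for driving an avatar; scikit-learn, the feature layout, and the toy labels are assumptions here, not the cited authors' method.

# Hedged sketch: classify a fused feature vector into an emotion label
# that could select the avatar's expression. Toy data for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Feature rows: [mean GSR (uS), heart rate (bpm), gaze dispersion (deg)]
X = np.array([[4.7,  82.0, 0.8],
              [9.3, 110.0, 2.4],
              [3.1,  65.0, 0.5],
              [8.8, 104.0, 2.1]])
y = ["calm", "afraid", "calm", "afraid"]   # illustrative labels only

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, y)
print(model.predict([[8.5, 100.0, 2.0]])[0])  # state used to pose the avatar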