2019
DOI: 10.3390/s19245516
Multimodal Approach for Emotion Recognition Based on Simulated Flight Experiments

Abstract: The present work aims to fill part of the gap regarding pilots' emotions and their bio-reactions during flight procedures such as takeoff, climbing, cruising, descent, initial approach, final approach and landing. A sensing architecture and a set of experiments were developed, associated with several simulated flights (N_flights = 13) using Microsoft Flight Simulator Steam Edition (FSX-SE). The approach was carried out with eight beginner users of the flight simulator (N_pilots = 8). It is sh…

Cited by 19 publications (14 citation statements)
References 34 publications
“…The authors in [ 25 ] proposed a multimodal approach to emotion recognition in the aviation domain with the goal of filling some of the gap between pilots’ emotions and their bioreactions during flight procedures such as take-off, climbing, cruising, descent, initial approach, final approach and landing. Building around a sensing architecture and a set of simulated flight experiments, the study showed that it was indeed possible to recognize emotions from different pilots in flight, combining their present and previous emotions.…”
Section: Facial Expression Recognition
confidence: 99%
“…The setup is described in Figure 2, of which the EEG context (the biosignal analyzed in the present work) represents part of the proposed multisensing architecture described in [7]. In addition, this final setup was an improvement over two earlier proofs of concept (PoCs) [5], resulting in several changes: a large screen to improve the immersive experience during the simulation, kept at an average distance of 1.70 to 1.90 m from the volunteer; a computer to run the flight simulator and to record facial expressions; the use of only one hand to control the aircraft via the joystick, because the GSR sensors were placed on the other hand; a headcap with dry electrodes to acquire EEG data; GSR electrodes placed on the free, motionless hand to avoid motion artifacts; a microcontroller (e.g., an Arduino board) to acquire HR data from the HR device (Medlab P100); two computers used by the supervisor, one receiving HR and GSR data over Bluetooth and the other receiving Bluetooth data from the EEG device; and an auxiliary camera to record the volunteers' body gestures.…”
Section: Flight Experiments Description
confidence: 99%
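The quoted setup streams HR, GSR and EEG samples to supervisor computers over separate Bluetooth links. A minimal sketch of how such a supervisor-side logger might timestamp incoming samples per channel is shown below; the class, channel names and push-based interface are illustrative assumptions, not code from the cited paper.

```python
import time
from collections import defaultdict

class BiosignalLogger:
    """Hypothetical supervisor-side logger: timestamps each incoming
    sample per channel (e.g. HR, GSR, EEG) so streams arriving over
    separate Bluetooth links can be aligned later."""

    def __init__(self):
        # channel name -> list of (timestamp, value) pairs
        self.samples = defaultdict(list)

    def push(self, channel, value, t=None):
        # Record one sample; default to wall-clock time if no
        # timestamp is supplied by the acquisition device.
        self.samples[channel].append((t if t is not None else time.time(), value))

    def channel_count(self, channel):
        return len(self.samples[channel])

log = BiosignalLogger()
log.push("HR", 72, t=0.0)
log.push("GSR", 0.41, t=0.01)
log.push("HR", 74, t=1.0)
```

Keeping one timestamped list per channel mirrors the described architecture, where each modality arrives on its own link and must be synchronized afterwards.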
“…These 13 datasets include the emotion questionnaires, face recordings, HR, GSR and EEG data. The dataset names are a sequence of two letters and one number, indicating the volunteer's name and that volunteer's flight sequence, respectively [5–7].…”
Section: Dataset Description
confidence: 99%
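The dataset naming convention above (two letters for the volunteer, one number for the flight sequence) can be parsed mechanically. The sketch below assumes an ID such as "JS1"; the actual identifiers used in the paper are not shown here, so this value is a placeholder.

```python
import re

# Two letters (volunteer initials) followed by a flight-sequence number.
DATASET_ID = re.compile(r"^([A-Za-z]{2})(\d+)$")

def parse_dataset_id(name):
    """Split a dataset ID like 'JS1' into (volunteer, flight_number)."""
    m = DATASET_ID.match(name)
    if not m:
        raise ValueError(f"not a valid dataset id: {name!r}")
    return m.group(1), int(m.group(2))

print(parse_dataset_id("JS1"))  # → ('JS', 1)
```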