“…This limits the use of these methods, for example, in systems controlling the interaction of a person with technical objects. The second group of methods is based on behavioral reactions assessed from facial features, such as mouth activity, head movements, blink frequency, spatial distribution of gaze, pupil dilation, and eye movements [20,21,22,23]; from voice [24,25,26]; and from movements, gait, and body postures [27,28,29]. To date, research on emotion recognition has focused mainly on facial expressions and physiological cues, while emotion recognition based on the posture modality has received little attention.…”