Proceedings of the 5th International Conference on Human Agent Interaction 2017
DOI: 10.1145/3125739.3125772

Emotion Recognition from Body Expressions with a Neural Network Architecture

Cited by 24 publications (10 citation statements)
References 21 publications

“…An interesting approach was proposed in Elfaramawy et al. (2017), where a neural network architecture was designed to classify six emotions from body motion patterns. In particular, the classification was performed by grow-when-required (GWR) networks, self-organizing architectures that grow new nodes whenever the network does not sufficiently match the input.…”
Section: State of the Art
confidence: 99%
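The growth rule described in the statement above can be made concrete with a short sketch. The snippet below follows the general grow-when-required scheme of Marsland et al. (2002); the thresholds, learning rates, and edge bookkeeping are illustrative assumptions, not the exact configuration used by Elfaramawy et al. (2017).

```python
import numpy as np

ACT_THRESHOLD = 0.9       # a_T: winner activity below this may trigger growth (assumed value)
HAB_THRESHOLD = 0.3       # h_T: grow only if the winner is already well trained (assumed value)
EPS_B, EPS_N = 0.1, 0.01  # learning rates for winner and second-best unit (assumed values)

def gwr_step(x, weights, habituation, edges):
    """One GWR update: adapt the best-matching node or grow a new one."""
    dists = np.linalg.norm(weights - x, axis=1)
    b, s = np.argsort(dists)[:2]        # best- and second-best-matching units
    activity = np.exp(-dists[b])        # how well the winner matches the input

    if activity < ACT_THRESHOLD and habituation[b] < HAB_THRESHOLD:
        # The network does not sufficiently match the input: grow a new node
        # halfway between the winner and the input, and rewire the edges.
        weights = np.vstack([weights, (weights[b] + x) / 2.0])
        habituation = np.append(habituation, 1.0)   # new nodes start unhabituated
        r = len(weights) - 1
        edges.discard((min(b, s), max(b, s)))
        edges.update({(min(b, r), max(b, r)), (min(s, r), max(s, r))})
    else:
        # Otherwise adapt the winner and runner-up towards the input, scaled
        # by their habituation so consolidated nodes move less.
        weights[b] += EPS_B * habituation[b] * (x - weights[b])
        weights[s] += EPS_N * habituation[s] * (x - weights[s])
    return weights, habituation, edges
```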
“…The neural update rate decreases as the neurons become more habituated, which prevents noisy input from interfering with consolidated neural representations. Similar GWR-based approaches have been proposed for the incremental learning of body motion patterns [54,8,61] and human-object interaction [55].…”
Section: Growing Self-organizing Network
confidence: 99%
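The habituation mechanism referenced above admits a one-line illustration. The discrete update below is a common form of the habituation rule from Marsland et al. (2002); the constants tau and alpha are assumed values for the sketch.

```python
def habituate(h, tau=0.3, alpha=1.05):
    """One discrete step of dh/dt = tau * (alpha * (1 - h) - 1)."""
    return h + tau * (alpha * (1.0 - h) - 1.0)

# Example: a fresh node (h = 1) habituates towards h* = (alpha - 1) / alpha,
# so the effective update rate eps * h shrinks as the node consolidates.
h = 1.0
for _ in range(20):
    h = habituate(h)
```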
“…There is evidence demonstrating that emotional states (e.g., happiness, sadness, anger) [72][73][74] and internal states such as engagement [75] can be recognised from gesture and posture information collected through standard digital video devices. Similarly, body postures captured using the Microsoft Xbox Kinect device were successfully used to classify emotional states [76].…”
Section: Intention Recognition in Social Robotics
confidence: 99%
“…Ramey and colleagues [78], for example, integrated the Kinect device into a social robot for tracking and recognising hand gestures. Similarly, Elfaramawy and colleagues [74] mounted a depth sensor onto a Nao robot to record movement data during interactions with human users. These data were then used to classify whether the interaction partner was expressing anger, fear, happiness, sadness or surprise.…”
Section: Intention Recognition in Social Robotics
confidence: 99%
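As a rough illustration of the pipeline described in the statement above, the sketch below windows recorded 3D joint positions into fixed-length feature vectors and labels a recording by majority vote. The nearest-prototype labeller is a hypothetical stand-in for the paper's learned GWR nodes; the windowing scheme, function names, and window length are assumptions for illustration only.

```python
import numpy as np

EMOTIONS = ["anger", "fear", "happiness", "sadness", "surprise"]

def windowed_features(joints, win=10):
    """Slice a (frames, joints, 3) motion recording into flat feature vectors.

    Each window of `win` consecutive skeleton frames becomes one vector, a
    simplification of the temporal windowing such an approach requires.
    """
    frames = joints.reshape(len(joints), -1)            # flatten joints per frame
    return np.stack([frames[i:i + win].ravel()
                     for i in range(len(frames) - win + 1)])

def classify(recording, prototypes, labels):
    """Label a recording by majority vote over its windows.

    `prototypes` stands in for the trained network's nodes: each window is
    matched to its nearest prototype and inherits that prototype's label.
    """
    feats = windowed_features(recording)
    dists = np.linalg.norm(feats[:, None, :] - prototypes[None, :, :], axis=2)
    votes = labels[np.argmin(dists, axis=1)]
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]
```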