2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020)
DOI: 10.1109/fg47880.2020.00078

EMOPAIN Challenge 2020: Multimodal Pain Evaluation from Facial and Bodily Expressions

Abstract: The EmoPain 2020 Challenge is the first international competition aimed at creating a uniform platform for comparing multimodal machine learning and multimedia processing methods for chronic pain assessment from human expressive behaviour, as well as for identifying pain-related behaviours. The objective of the challenge is to promote research into assistive technologies that improve the quality of life for people with chronic pain via real-time monitoring and feedback to help…

Cited by 26 publications (9 citation statements). References 56 publications.
“…Nevertheless, the C2 and C3 configurations surpass the baseline in terms of PCC. Our results are not directly comparable to those reported by the Challenge organisers [8], as we have evaluated different features. Nevertheless, our results do not beat the best baseline system of the Challenge organisers, which achieved a CCC of 0.180 using the validation partition.…”
Section: Results (contrasting, confidence: 93%)
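The statement above contrasts PCC and CCC scores against the organisers' 0.180 CCC baseline. A minimal NumPy sketch of both metrics (not code from the challenge paper) makes the difference concrete: CCC additionally penalises mean and scale offsets between predictions and labels, so a system can track the labels well on PCC yet still trail on CCC.

```python
import numpy as np

def pearson_cc(x, y):
    # Pearson correlation coefficient (PCC): linear association only.
    return np.corrcoef(x, y)[0, 1]

def concordance_cc(x, y):
    # Concordance correlation coefficient (CCC), Lin (1989):
    # 2*cov(x,y) / (var(x) + var(y) + (mean(x) - mean(y))^2).
    # Unlike PCC, it penalises deviation from the identity line,
    # so constant bias or scale mismatch lowers the score.
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)
```

For example, predictions that are a perfect copy of the labels score 1.0 on both metrics, while predictions shifted by a constant keep PCC at 1.0 but lose CCC.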
“…For the Challenge [8], only the features extracted from the facial videos have been made available to participants. The available facial features include facial landmarks, head pose, Histogram of Oriented Gradient (HOG) features, action unit intensity values and occurrence extracted with OpenFace [3], and deep-learnt feature representations extracted using VGG-16 [23] and ResNet-50 [14] pre-trained models.…”
Section: EmoPain Dataset (mentioning, confidence: 99%)
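The statement above lists the per-frame facial descriptors released to participants (landmarks, head pose, HOG, action units, and deep VGG-16/ResNet-50 features). A common first step with such a release is to concatenate the modality blocks into one frame-level vector; the sketch below illustrates this with hypothetical block dimensions (the function name and sizes are illustrative assumptions, not the official feature layout).

```python
import numpy as np

def frame_feature_vector(landmarks, head_pose, hog, au_intensities, deep_feats):
    """Concatenate per-frame facial descriptor blocks into a single vector.

    Each argument is one modality block (e.g. 68x2 landmarks, 3 head-pose
    angles, a HOG descriptor, AU intensities, and a deep CNN embedding).
    Dimensions are illustrative; the real EmoPain release defines its own.
    """
    blocks = [np.asarray(b, dtype=np.float32).ravel()
              for b in (landmarks, head_pose, hog, au_intensities, deep_feats)]
    return np.concatenate(blocks)
```

Keeping the blocks separate until this step also makes it easy to ablate one modality at a time, as the quoted works do when comparing feature sets.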
“…Although in everyday pain experience we encounter associations between body movement and pain, the communicative functions of body movements in relation to pain have been fairly unexplored in automatic pain assessment. Notable exceptions are to be found in the work of Aung et al [(4), see also Egede et al (5)] who found association between pain and certain bodily protective behaviors, such as guarding/stiffness and bracing/support.…”
Section: Introduction (mentioning, confidence: 99%)
“…The main advantage of using facial expression to detect pain levels is to reduce the distress caused by recording brain activity or other physiological signals, such as the electrocardiogram (ECG) and electromyogram (EMG), that require sensors in direct contact with the patient's body/skin (Kunz and Lautenbacher, 2019). An international competition aimed at creating a platform for the comparison of multimedia processing methods and multi-modal machine learning for chronic pain assessment from human behaviour was conducted by the EmoPain 2020 challenge team (Egede et al, 2020).…”
Section: Introduction (mentioning, confidence: 99%)