Face and Gesture 2011
DOI: 10.1109/fg.2011.5771462
Painful data: The UNBC-McMaster shoulder pain expression archive database

Cited by 480 publications (572 citation statements)
References 16 publications
“…As input features we used locations of 66 facial landmarks (see Fig.2) provided by the database creators, and obtained using a 2D Active Appearance Model (2D-AAM) [27]. In the pre-processing step, the facial points were aligned to the corresponding reference face (the average face from the dataset) by applying an affine transform.…”
Section: Methods (mentioning)
confidence: 99%
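As a rough illustration of the alignment step described in this excerpt, the sketch below fits a least-squares affine transform that maps each frame's 66 AAM landmarks onto a reference (mean) shape. The array shapes, the random placeholder landmarks, and the align_to_reference helper are assumptions for illustration, not the cited authors' code.

```python
import numpy as np

def align_to_reference(landmarks, reference):
    """Align one face's 66 (x, y) landmarks to a reference shape
    with a least-squares affine transform (illustrative sketch)."""
    # Build the design matrix [x, y, 1] for the affine fit.
    A = np.hstack([landmarks, np.ones((landmarks.shape[0], 1))])  # (66, 3)
    # Solve A @ M ~= reference for the 3x2 affine matrix M.
    M, _, _, _ = np.linalg.lstsq(A, reference, rcond=None)
    return A @ M  # aligned (66, 2) landmarks

# Example: align every frame to a stand-in "average face".
frames = np.random.rand(10, 66, 2)      # placeholder AAM landmark tracks
reference = frames.mean(axis=0)         # mean shape over the placeholder frames
aligned = np.stack([align_to_reference(f, reference) for f in frames])
```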
“…Recent release of the pain-intensity coded data (Lucey et al 2011) has motivated research into automated estimation of the pain intensity levels (Hammal & Cohn 2012, Kaltwang et al 2012, Rudovic et al 2013a). For example, (Hammal & Cohn 2012) performed estimation of 4 pain intensity levels, with the levels greater than 3 on the 16-level scale being grouped together.…”
Section: Intensity Estimation of Facial Expressions (mentioning)
confidence: 99%
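The grouping of higher pain levels described in this excerpt can be sketched as a simple clipping of frame-level scores on the 16-level PSPI scale. The group_pain_levels helper and the bin boundary used here are illustrative assumptions and may not match the exact binning in Hammal & Cohn (2012).

```python
import numpy as np

def group_pain_levels(pspi_scores, max_level=3):
    """Collapse frame-level PSPI scores (0-15/16 scale) into a small set of
    intensity classes, grouping everything above `max_level` together.
    Illustrative only; the cited paper's exact grouping may differ."""
    scores = np.asarray(pspi_scores)
    return np.minimum(scores, max_level)

print(group_pain_levels([0, 1, 2, 3, 7, 12]))  # -> [0 1 2 3 3 3]
```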
“…The frame-based methods for classification of facial expressions of the six basic emotion categories (Ekman et al 2002) typically employ static classifiers such as rule-based classifiers (Pantic & Rothkrantz 2004, Black & Yacoob 1997), Neural Networks (NN) (Padgett & Cottrell 1996, Tian 2004), Support Vector Machine (SVM) (Bartlett et al 2005, Shan et al 2009), and Bayesian Networks (BN) (Cohen et al 2003). SVMs and their probabilistic counterpart, the Relevance Vector Machine (RVM), have been used for classification of facial expressions of pain (Lucey et al 2011, Gholami et al 2009). For instance, (Lucey et al 2011) addressed the problem of pain detection by applying SVMs either directly to the image features or in a two-step approach, where AUs were first detected using SVMs and their outputs were then fused using a Logistic Regression model.…”
Section: Facial Expression Analysis (mentioning)
confidence: 99%
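A minimal sketch of the two-step pipeline mentioned in this excerpt (per-AU SVMs whose outputs are fused by logistic regression) is shown below using scikit-learn. The feature matrix, the set of six AUs, the labels, and the training-set evaluation are placeholders, not the setup of Lucey et al. (2011).

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))            # placeholder image features
au_labels = rng.integers(0, 2, (200, 6))   # placeholder presence of 6 pain-related AUs
pain = rng.integers(0, 2, 200)             # placeholder frame-level pain label

# Step 1: train one linear SVM per action unit on the image features.
au_svms = [LinearSVC(C=1.0).fit(X, au_labels[:, k]) for k in range(au_labels.shape[1])]

# Step 2: fuse the per-AU decision scores with a logistic regression model.
au_scores = np.column_stack([svm.decision_function(X) for svm in au_svms])
fusion = LogisticRegression().fit(au_scores, pain)
print("training accuracy:", fusion.score(au_scores, pain))
```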
“…The latter are based on Prkachin and Solomon's Pain Intensity Metric [11]. Ordinal pain scores are computed as the sum of specific AU intensities or directly from facial features [12]. These approaches all seek to detect ordinal level intensity.…”
Section: Introduction (mentioning)
confidence: 99%
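For reference, the PSPI metric cited here is commonly defined as a sum of FACS action unit intensities, AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43, which yields the 16-level scale mentioned in an earlier excerpt. The sketch below assumes that definition and uses made-up intensity values.

```python
def pspi(au4, au6, au7, au9, au10, au43):
    """Prkachin & Solomon Pain Intensity (PSPI) as a sum of FACS AU
    intensities: brow lowering, orbital tightening, levator contraction,
    and eye closure. Sketch based on the commonly cited definition."""
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Example frame: AU intensities on a 0-5 scale (AU43, eye closure, is 0 or 1).
print(pspi(au4=2, au6=3, au7=1, au9=0, au10=2, au43=1))  # -> 8
```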