2017
DOI: 10.1109/taffc.2016.2515084

Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate

Cited by 256 publications (159 citation statements)
References 62 publications
“…[8]. Automated measures are based on timing and on participants' physiological signals produced by the heart, brain, and skin [19,23]. Another kind of automatic engagement recognition is based on computer vision, which estimates engagement automatically by analysing cues from the face and gestures [5,7,19,23].…”
Section: Methods (mentioning)
Confidence: 99%
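
The statement above distinguishes physiological from vision-based engagement measures; the cited paper's title indicates that its heart-rate channel is itself estimated from video. As one concrete illustration, the sketch below recovers heart rate from the mean green-channel intensity of a face region over time, a generic remote-photoplethysmography approach, not necessarily the method used in the paper; the frame rate, window length, and cardiac band limits are illustrative assumptions.

# A minimal sketch of video-based heart-rate estimation (remote
# photoplethysmography). Generic green-channel/FFT approach; all
# parameters below are illustrative assumptions.
import numpy as np

def estimate_heart_rate(green_means, fps=30.0, lo_bpm=45.0, hi_bpm=180.0):
    # Locate the dominant frequency of the face-region green-channel
    # trace within a plausible cardiac band, and convert Hz to BPM.
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                    # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequency axis in Hz
    band = (freqs >= lo_bpm / 60.0) & (freqs <= hi_bpm / 60.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                                 # Hz -> beats per minute

# Synthetic 10-second trace: a 72 BPM pulse buried in noise.
t = np.arange(0, 10, 1.0 / 30.0)
trace = 0.5 * np.sin(2 * np.pi * (72 / 60.0) * t) + np.random.normal(0, 0.2, t.size)
print(round(estimate_heart_rate(trace)))  # ~72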
“…Non-intrusive visual observation and estimation of affective parameters commonly uses a recorded video (RGB) signal, for example, to estimate student engagement from facial expressions [3,17], to estimate the mood of children during one-to-one tutoring using facial analysis [2], and to estimate a driver's vigilance from head pose [18]. A survey of automatic affect detection methods [4] identified various types of signals (video, EKG, EMG...) used in affect analysis.…”
Section: Automated Measurement of Affective Parameters (mentioning)
Confidence: 99%
“…Monkaresi et al. [17] use a combination of geometric facial features (detected by a Kinect sensor), texture features (local binary patterns), and physiological features (heart rate) to estimate two-level student engagement. Whitehill et al. [3] use computer vision methods to register faces, extract Box Filter features (Haar wavelets), and then train binary classifiers to estimate four states of engagement.…”
Section: Automated Measurement of Affective Parameters (mentioning)
Confidence: 99%
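
The statement above describes a feature-fusion pipeline: appearance features (local binary patterns) concatenated with a physiological feature (heart rate) and fed to a classifier. The following is a minimal sketch of that idea, not the authors' code; the LBP parameters, heart-rate normalisation, and choice of an SVM classifier are illustrative assumptions.

# A minimal sketch of LBP + heart-rate feature fusion for two-level
# engagement classification. Feature parameters, normalisation, and the
# classifier are assumptions, not details from the cited papers.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(face_gray, points=8, radius=1):
    # Uniform LBP histogram over a grayscale face crop.
    lbp = local_binary_pattern(face_gray, points, radius, method="uniform")
    n_bins = points + 2  # uniform patterns plus one non-uniform bin
    hist, _ = np.histogram(lbp.ravel(), bins=n_bins, range=(0, n_bins), density=True)
    return hist

def fuse_features(face_gray, heart_rate_bpm):
    # Concatenate appearance (LBP) and physiological (heart rate) features.
    return np.concatenate([lbp_histogram(face_gray), [heart_rate_bpm / 200.0]])

# Hypothetical training data: face crops, per-window heart rates, and
# two-level engagement labels (0 = not engaged, 1 = engaged).
rng = np.random.default_rng(0)
faces = rng.integers(0, 256, size=(20, 48, 48)).astype(np.uint8)
heart_rates = rng.uniform(55, 110, size=20)
labels = np.tile([0, 1], 10)

X = np.vstack([fuse_features(f, hr) for f, hr in zip(faces, heart_rates)])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:3]))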