2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2017.7953119
Engagement detection for children with Autism Spectrum Disorder

Cited by 20 publications (26 citation statements). References 14 publications.
“…This may particularly be challenging due to the large individual and cultural heterogeneity in image data of this population. Also, most of existing works on analysis of facial cues in autism focus on eye-gaze, blinking, and head-pose [12], [29], which are shown to be a good proxy of joint attention and engagement -the lack of which is pertinent to ASC. Extracting these cues from face images is usually done using detectors specifically built for each facial cue.…”
Section: Introduction (mentioning)
Confidence: 99%
“…[14][15][16][17][18][19] Chorianopoulou et al collected structured home videos from participants and had expert annotators label the dataset with the actions, emotions, gaze fixations, utterances, and overall level of engagement in each video; this information was then used to train a classifier to identify specific engagement features that could be correlated with ASD. 20 Rudovic et al trained a large and generalizable neural network to estimate engagement in children with ASD from different cultural backgrounds. 21 Engagement labels were manually annotated by trained individuals.…”
Section: Manual Annotation Methods (mentioning)
Confidence: 99%
“…In [26], wearable sensors were used to measure the electrodermal activity of the children and a Support Vector Machine (SVM) classifier was applied to classify the children being engaged or not. In [27], acoustic and linguistic data were utilized to detect the social engagement in conversational interactions of children with ASD and their parents, using an SVM classifier. The first in-depth study of measuring the engagement of children when interacting with social robots was proposed by Anzalone et al. [28].…”
Section: Related Work (mentioning)
Confidence: 99%
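Several of the citing works above train an SVM to classify a child as engaged or not from sensor or audio features. The following is a minimal illustrative sketch of that general setup, not the pipeline of any cited paper: the feature names (electrodermal-activity level, acoustic energy, gaze-fixation ratio) and the synthetic data are hypothetical stand-ins.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic per-segment features (hypothetical: EDA level, acoustic
# energy, gaze-fixation ratio) -- made up purely for illustration.
n = 200
engaged = rng.normal(loc=[1.0, 0.8, 0.7], scale=0.3, size=(n // 2, 3))
not_engaged = rng.normal(loc=[0.2, 0.1, 0.2], scale=0.3, size=(n // 2, 3))
X = np.vstack([engaged, not_engaged])
y = np.array([1] * (n // 2) + [0] * (n // 2))  # 1 = engaged, 0 = not

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Standardize features, then fit an RBF-kernel SVM (a common default
# configuration for binary engagement classification).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

On well-separated synthetic classes like these the classifier is near-perfect; real engagement data is far noisier, which is why the cited studies rely on careful feature extraction and expert annotation.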