2021
DOI: 10.1101/2021.07.28.21260646
Preprint
Leveraging video data from a digital smartphone autism therapy to train an emotion detection classifier

Abstract: Autism spectrum disorder (ASD) is a neurodevelopmental disorder affecting one in 40 children in the United States and is associated with impaired social interactions, restricted interests, and repetitive behaviors. Previous studies have demonstrated the promise of applying mobile systems with real-time emotion recognition to autism therapy, but existing platforms have shown limited performance on videos of children with ASD. We propose the development of a new emotion classifier designed specifically for pedia…
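The abstract describes training an emotion classifier on video data from a therapy app. One common way to apply a frame-level classifier to whole videos is to aggregate per-frame predictions into a single video-level label by majority vote. The sketch below illustrates only that aggregation step; the emotion labels and pipeline here are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

def aggregate_video_label(frame_predictions):
    """Collapse per-frame emotion predictions into one video-level label
    by majority vote, a common heuristic for using frame-level
    classifiers on video clips."""
    if not frame_predictions:
        raise ValueError("no frame predictions to aggregate")
    counts = Counter(frame_predictions)
    label, _ = counts.most_common(1)[0]
    return label

# Example: hypothetical noisy per-frame outputs for one short clip.
frames = ["happy", "happy", "neutral", "happy", "surprised"]
print(aggregate_video_label(frames))  # happy
```

More elaborate schemes (confidence-weighted voting, temporal smoothing) follow the same shape: per-frame scores in, one clip-level decision out.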


Cited by 12 publications (6 citation statements)
References 40 publications (41 reference statements)
“…Our plan is to develop the STAND app into a versatile tool that can be customized for researching and gathering computer vision data of the face to build personalized machine learning models which can support digital health interventions related to mental states and developmental disorders. This work can integrate with existing research in using automatic emotion recognition for a variety of contexts, including mental illness diagnosis, recognizing human social and physiological interactions, and developing sociable robotics and other human-computer interaction systems [57, 85–89]. For example, emotional expressions have a crucial role in recognizing certain types of developmental disorders.…”
Section: Discussion and Future Work
confidence: 99%
“…Previous work examined the use of crowdsourced annotations for autism, indicating that similar approaches could perhaps be applied through audio [31, 46–51]. Audio feature extraction combined with other autism classifiers could be used to create an explainable diagnostic system [52–64] fit for mobile devices [60]. Previous work investigated using such classifiers to detect autism or approach autism-related tasks like identifying emotion to improve socialization skills; combining computer vision-based quantification of relevant areas of interest, including hand stimming [58], upper limb movement [63], and eye contact [62, 64], could possibly result in interpretable models.…”
Section: Future Work
confidence: 99%
“…sessions with emotion game prompts were demonstrated to be enriched with emotional facial expressions (22). The resulting frames enabled the construction of models that outperformed general-purpose facial emotion recognition classifiers (23–26). Beyond its value for digital phenotyping, Guess What?…”
Section: Data Modalities and Acquisition
confidence: 99%