Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems 2021
DOI: 10.1145/3411763.3451701
Activity Recognition with Moving Cameras and Few Training Examples: Applications for Detection of Autism-Related Headbanging

Abstract: Activity recognition computer vision algorithms can be used to detect the presence of autism-related behaviors, including what are termed "restricted and repetitive behaviors", or stimming, by diagnostic instruments. Examples of stimming include hand flapping, spinning, and head banging. One of the most significant bottlenecks for implementing such classifiers is the lack of sufficiently large training sets of human behavior specific to pediatric developmental delays. The data that do exist are usually recorded with…

Cited by 42 publications (27 citation statements) | References 49 publications
“…Previous work examined the use of crowdsourced annotations for autism, indicating that similar approaches could perhaps be applied through audio [31, 46-51]. Audio feature extraction combined with other autism classifiers could be used to create an explainable diagnostic system [52-64] fit for mobile devices [60]. Previous work investigated using such classifiers to detect autism or approach autism-related tasks like identifying emotion to improve socialization skills; combining computer vision-based quantification of relevant areas of interest, including hand stimming [58], upper limb movement [63], and eye contact [62, 64], could possibly result in interpretable models.…”
Section: Future Work
confidence: 99%
“…[15] and Washington et al. [16] modelled arm flapping and head banging actions, respectively, from the Self-Stimulatory Behaviour Dataset (SSBD). Lakkapragada et al.…”
Section: Related Work
confidence: 99%
“…Washington et al. [16] used head keypoints and a time-distributed CNN integrated with an LSTM to detect head banging.…”
Section: Related Work
confidence: 99%
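The architecture the citation above attributes to Washington et al. — a per-frame feature extractor applied in time-distributed fashion over head keypoints, feeding a recurrent layer that classifies the clip — can be sketched in outline. All shapes, layer sizes, and weights below are illustrative assumptions, not the paper's actual configuration; a plain NumPy cell stands in for the CNN and LSTM purely to show how tensors flow through the time-distributed pattern.

```python
import numpy as np

# Illustrative shapes (assumptions, not the paper's configuration):
# a clip of T frames, each carrying K 2-D head keypoints.
T, K, FEAT, HIDDEN = 16, 5, 8, 4

rng = np.random.default_rng(0)

def cnn_features(frame_keypoints):
    """Stand-in for the per-frame CNN: maps (K, 2) keypoints to a FEAT vector."""
    w = np.ones((K * 2, FEAT)) / (K * 2)      # fixed dummy weights
    return frame_keypoints.reshape(-1) @ w

def recurrent_step(x, h):
    """Simplified tanh RNN cell standing in for the LSTM update."""
    wx = np.ones((FEAT, HIDDEN)) * 0.1
    wh = np.eye(HIDDEN) * 0.5
    return np.tanh(x @ wx + h @ wh)

def classify_clip(clip):
    """Time-distributed CNN over frames, recurrence over time, sigmoid score."""
    h = np.zeros(HIDDEN)
    for frame in clip:                        # same CNN applied to every frame
        h = recurrent_step(cnn_features(frame), h)
    return 1.0 / (1.0 + np.exp(-h.sum()))    # head-banging probability

clip = rng.normal(size=(T, K, 2))            # synthetic keypoint trajectories
p = classify_clip(clip)
print(round(float(p), 3))
```

The key design point the excerpt describes is weight sharing across time: one CNN is reused for every frame, and only the recurrent state carries temporal context, which keeps the parameter count small — relevant when, as the abstract notes, training examples are few.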
“…Interventions such as Quokka provide a mechanism for eliciting behavior change from distributed participants. To optimize the provided interventions, direct measurement of behavior changes via machine learning [69-76] along with self-reported questionnaires can generate useful multimodal data sets. Feature selection approaches could be applied to such data streams to identify salient behavioral markers [77-81] of mental health, and classifiers for these could be realized via trustworthy and reliable crowdsourced labeling of the incoming data [82-87].…”
Section: Opportunities For Future Work
confidence: 99%