2021
DOI: 10.1038/s41598-021-87059-4

Crowdsourced privacy-preserved feature tagging of short home videos for machine learning ASD detection

Abstract: Standard medical diagnosis of mental health conditions requires licensed experts who are increasingly outnumbered by those at risk, limiting reach. We test the hypothesis that a trustworthy crowd of non-experts can efficiently annotate behavioral features needed for accurate machine learning detection of the common childhood developmental disorder Autism Spectrum Disorder (ASD) for children under 8 years old. We implement a novel process for identifying and certifying a trustworthy distributed workforce for vi…
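The pipeline summarized in the abstract, in which crowd workers tag behavioral features in short home videos and a classifier is trained on those tags to flag ASD, can be illustrated with a minimal sketch. The feature set, model choice, and data below are hypothetical placeholders and are not the paper's actual features or classifier.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
import numpy as np

# Hypothetical crowd-annotated feature matrix: each row is one child's video,
# each column a crowd-rated behavioral feature (e.g. eye contact,
# response to name), scored 0-3. Labels: 1 = ASD, 0 = neurotypical.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(40, 6)).astype(float)
y = rng.integers(0, 2, size=40)

# Train and evaluate a simple classifier on the crowd-tagged features.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean())
```

In practice the feature columns would hold the crowd's ratings of clinically motivated behaviors and the labels would come from clinical diagnoses; the random arrays here only show the shape of the data flow.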

Cited by 28 publications (18 citation statements). References 55 publications.
“…In addition, these techniques may compromise the privacy of participants by providing annotators with access to video footage, although some methods have been developed to address privacy concerns with crowdsourced annotations. 22,23…”
Section: Manual Annotation Methods (mentioning)
confidence: 99%
“…1,000 crowd workers completed the task and passed basic quality control checks for answer acceptance. Quality control measures included time spent on the annotation task and deviations in answers between videos [52]. We did not include workers who did not pass these basic quality control checks in the list of 1,000 evaluated workers.…”
Section: Crowd Annotation (mentioning)
confidence: 99%
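As a rough illustration of the quality-control checks quoted above (time spent on the annotation task and variation in answers between videos), the sketch below filters out workers whose submissions fail either check. The record layout, thresholds, and helper name are hypothetical and are not taken from the paper.

```python
import statistics

MIN_SECONDS = 30               # assumed floor on time spent per video
MAX_FLAT_FRACTION = 0.9        # assumed ceiling on zero-variation features

def passes_quality_control(submissions):
    """Return True if a worker clears both basic checks: enough time spent
    on each video, and ratings that actually vary between videos."""
    if not submissions:
        return False

    # Check 1: time spent on the annotation task.
    if any(s["seconds_spent"] < MIN_SECONDS for s in submissions):
        return False

    # Check 2: deviation in answers between videos. If nearly every feature
    # has zero spread across videos, the worker likely pasted the same
    # answers everywhere.
    n_features = len(submissions[0]["ratings"])
    spreads = [
        statistics.pstdev([s["ratings"][i] for s in submissions])
        for i in range(n_features)
    ]
    flat_fraction = sum(1 for sd in spreads if sd == 0) / n_features
    return flat_fraction <= MAX_FLAT_FRACTION

# Toy usage: worker w1 passes, worker w2 is rejected on both checks.
workers = {
    "w1": [{"seconds_spent": 95, "ratings": [2, 1, 3]},
           {"seconds_spent": 80, "ratings": [0, 2, 1]}],
    "w2": [{"seconds_spent": 12, "ratings": [1, 1, 1]},
           {"seconds_spent": 10, "ratings": [1, 1, 1]}],
}
accepted = [w for w, subs in workers.items() if passes_quality_control(subs)]
print(accepted)  # -> ['w1']
```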
“…This is important in general for healthcare applications, but it is especially critical when the patients in question are young children with a developmental delay who are observed by a stranger in the privacy of their home. Our prior work has shown that applying light privacy-preserving modifications to videos, such as pitch shifting and covering the child's face with a virtual box, results in minimal degradation of the quality of crowdsourced annotations used for remote detection of autism-related behaviors [52]. However, such lightweight privacy protections may be insufficient for some patients, such as those who do not want the interior of their home exposed to strangers on the Internet.…”
Section: Introduction (mentioning)
confidence: 99%
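The two lightweight modifications named in this quote, pitch shifting the audio and covering the face with a virtual box, can be approximated with off-the-shelf tools. This is a minimal sketch, not the pipeline from [52]: it assumes the audio track has already been extracted to a WAV file (for example with ffmpeg) and uses a stock Haar cascade face detector; file names and the shift amount are illustrative.

```python
import cv2
import librosa
import soundfile as sf

# Pitch-shift the (pre-extracted) audio track; the shift amount is arbitrary.
audio, sr = librosa.load("clip_audio.wav", sr=None)
shifted = librosa.effects.pitch_shift(audio, sr=sr, n_steps=-3)
sf.write("clip_audio_shifted.wav", shifted, sr)

# Cover detected faces with an opaque box, frame by frame.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture("clip.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("clip_boxed.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        # Thickness -1 fills the rectangle, hiding the face entirely.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 0), -1)
    out.write(frame)

cap.release()
out.release()
```

The boxed video and shifted audio would then be recombined (again, for example with ffmpeg) before being shown to crowd annotators.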
“…Although these methods enable the creation of human-vetted, accurate data sets, such approaches require large numbers of trained annotators when implemented on a large scale, which is expensive and time-consuming. In addition, these techniques may compromise the privacy of participants by providing annotators with access to video footage, although some methods have been developed to address privacy concerns with crowdsourced annotations [23, 24].…”
Section: Introduction (mentioning)
confidence: 99%