2022
DOI: 10.48550/arxiv.2204.14110
Preprint

Seeing without Looking: Analysis Pipeline for Child Sexual Abuse Datasets

Abstract: The online sharing and viewing of Child Sexual Abuse Material (CSAM) are growing fast, such that human experts can no longer handle the manual inspection. However, the automatic classification of CSAM is a challenging field of research, largely due to the inaccessibility of target data that is (and should forever be) private and in the sole possession of law enforcement agencies. To aid researchers in drawing insights from unseen data and safely providing further understanding of CSAM images, we propose an analysi…

Cited by 1 publication (1 citation statement)
References 30 publications
“…Visual analysis cannot determine consent, lacks accuracy in determining age, and essentializes complex identity expressions for race and gender (Lee et al. 2020; Scheuerman et al. 2021). Ethical concerns with CSAM training data abound, and generalised datasets reduce accuracy for marginalised subjects, meaning hegemonic white, cis, and able-bodied content is more accurately identified (Laranjeira et al. 2022). Non-white children are less proximate to rescue from CSAM, and non-white content is more prone to errors, thus establishing 'digital redlining' in detection systems (Thakor 2018; Tusikov 2021).…”
Section: Issue Discussion and Conclusion
confidence: 99%