2022 ACM Conference on Fairness, Accountability, and Transparency
DOI: 10.1145/3531146.3533192
Can Machines Help Us Answering Question 16 in Datasheets, and In Turn Reflecting on Inappropriate Content?

Cited by 9 publications (23 citation statements)
References 30 publications
“…We apply Q16 [68] and our own specialized pornographic and sexualized content classifier (here referred to as NSFW) to identify and document a broad range of inappropriate concepts displaying not only persons but also objects, symbols, and text, see cf. [68] and Appendix Sec. C.5 and Sec.…”
Section: Safety During Collection
Mentioning confidence: 99%
“…Further, we used the Q16 documentation pipeline [68] to document the broad range of identified potentially inappropriate concepts contained, cf. Sec.…”
Section: C6 Further Inappropriate Content Tagging
Mentioning confidence: 99%
“…SD's post-hoc safety measures. Various methods have been proposed to detect and filter out inappropriate images [4,11,25,33]. Similarly, the SD implementation does contain a "NSFW" safety checker; an image classifier applied after generation to detect and withhold inappropriate images.…”
Section: Risks and Promises of Unfiltered Data
Mentioning confidence: 99%
“…Next to the prompts, our framework includes three dedicated inappropriateness detectors. Namely, SD's built-in safety checker, the Q16 classifier [33], and an explicit nudity classifier, the NudeNet mentioned earlier.…”
Section: Inappropriate Image Prompts (I2P)
Mentioning confidence: 99%