This paper discusses how an interactive artwork, the Crowd-Sourced Intelligence Agency (CSIA), can contribute to discussions of Big Data intelligence analytics. The CSIA is a publicly accessible Open Source Intelligence (OSINT) system that was constructed using information gathered from technical manuals, research reports, academic papers, leaked documents, and Freedom of Information Act files. Using a visceral heuristic, the CSIA demonstrates how the statistical correlations made by automated classification systems differ from human judgment and can produce false positives, and how the display of information through an interface can affect the judgment of an intelligence agent. The public has the right to ask how a computer program determines whether they are a threat to national security and to question the practicality of using statistical pattern recognition algorithms in place of human judgment. Currently, the public's lack of access to both Big Data and the actual datasets intelligence agencies use to train their classification algorithms keeps the possibility of performing effective sous-dataveillance out of reach. Without this data, the results returned by the CSIA will not be identical to those of intelligence agencies. Because we have replicated how OSINT is processed, however, our results will resemble the type of results and mistakes made by OSINT systems. The CSIA takes some initial steps toward contributing to an informed public debate about large-scale monitoring of open-source social media data and provides a prototype for counterveillance and sousveillance tools for citizens.
Going Viral is an interactive artwork that invites people to intervene in the spread of misinformation by sharing informational videos about COVID-19. The videos feature algorithmically generated versions of celebrities, social media influencers, and politicians who have made or shared claims about the coronavirus that run counter to the official consensus of healthcare professionals and were categorized as misinformation. In the videos, the algorithmically generated speakers deliver public service announcements or present news stories that counter the misinformation they had previously promoted on social media. The shareable YouTube videos present a recognizable, but glitchy, reconstruction of the celebrities. The obvious digital fabrication of the videos prevents their classification as deepfakes by content moderators and helps viewers reflect on the authority of celebrities on issues of public health and the validity of information shared on social media.
Boogaloo Bias is an interactive artwork and research project that addresses some of the known problems with the unregulated use of facial recognition technologies, including the practice of 'brute forcing,' in which, lacking high-quality images of a suspect, law enforcement agents have been known to substitute images of celebrities the suspect is reported to resemble. To lampoon this approach, the Boogaloo Bias system searches for members of the Boogaloo Bois using a facial recognition algorithm trained on faces of characters from the 1984 movie Breakin' 2: Electric Boogaloo. The film is the namesake of the Boogaloo Bois, an anti-law enforcement militia that emerged from 4chan meme culture and has been present at both right- and left-wing protests in the US since January 2020. The system is used to search live video feeds, protest footage, and images uploaded to the Boogaloo Bias website. All matches made by the system are false positives. No information from the live feeds or website uploads is saved or shared.