Purpose
The purpose of this paper is to propose that tackling bias in algorithms requires a systemic, sociotechnical and holistic perspective. With reference to the term "algorithmic culture," the paper posits the interconnectedness and mutual shaping of society and technology. A sociotechnical approach requires translational work between and across disciplines, and this conceptual paper undertakes such work. It exemplifies how gender and diversity studies, by bringing in expertise on addressing bias and structural inequalities, provide a crucial resource for analyzing and mitigating bias in algorithmic systems.
Design/methodology/approach
After introducing the sociotechnical context, the paper surveys the contemporary discourse on bias in algorithms: debates around algorithmic culture, knowledge production and bias identification, and commonly proposed solutions. Key concepts from gender studies (situated knowledges and strong objectivity) and concrete examples of gender bias then serve as a backdrop for revisiting these debates.
Findings
These key concepts reframe the discourse on bias, contextualizing and situating notions such as algorithmic fairness and transparency. The paper offers specific suggestions for researchers and practitioners on how to account for social inequalities in the design of algorithmic systems.
Originality/value
A systemic, gender-informed approach to the issue is provided, and a concrete, applicable methodology toward a situated understanding of algorithmic bias is laid out, offering an important contribution to an urgent multidisciplinary dialogue.
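The paper's call to contextualize and situate metrics such as algorithmic fairness can be made concrete with a small sketch. The following is an illustrative example, not taken from the paper; the function, group labels, and numbers are all synthetic. It shows one common practice, disaggregated evaluation, in which an aggregate accuracy score is broken down by social group so that disparities the average hides become visible.

```python
# Illustrative sketch (not from the paper): one concrete way to "situate" an
# aggregate performance metric is to disaggregate it across social groups,
# making disparities visible instead of averaging them away.
# Group labels and numbers here are synthetic placeholders.

from collections import defaultdict

def disaggregated_accuracy(y_true, y_pred, groups):
    """Return overall accuracy plus accuracy per group."""
    overall = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    per_group = defaultdict(lambda: [0, 0])  # group -> [correct, total]
    for t, p, g in zip(y_true, y_pred, groups):
        per_group[g][0] += int(t == p)
        per_group[g][1] += 1
    return overall, {g: c / n for g, (c, n) in per_group.items()}

# Synthetic example: the aggregate score hides a gap between groups.
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 1, 1, 0, 1]
groups = ["a", "a", "a", "b", "b", "b", "b", "b"]
overall, by_group = disaggregated_accuracy(y_true, y_pred, groups)
print(f"overall accuracy: {overall:.2f}")
for g, acc in sorted(by_group.items()):
    print(f"group {g}: {acc:.2f}")
```

In this toy data the overall accuracy looks acceptable (0.62) while group b fares far worse (0.40), which is exactly the kind of gap an aggregate, unsituated metric conceals.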
This intra-view follows a round-table discussion that took place during the New Materialist Informatics conference on 25 March 2021. The discussants – Indigenous researcher and game designer Outi Laiti, artists and researchers Luiza Prado de O. Martins, Femke Snelting and Caroline Ward – start with their own artistic, academic, and creative practices and discuss how these practices relate to otherwise-worldings in computing that engage materialist, anti-racist, decolonial, Indigenous, and trans*feminist thinking and doing. This discussion, facilitated by artist Ren Loren Britton and researcher Goda Klumbytė, brings up questions of collaboration and infrastructures needed to support otherwise practices in computing and design.
This article discusses the role that algorithmic thinking and management play in health care and the kinds of exclusion this might create. We argue that evidence-based medicine relies on research and data to create pathways for patient journeys. Coupled with data-based algorithmic prediction tools, these pathways establish what could be called healthcare algorithmics: a mode of managing health care that produces forms of algorithmic governmentality. Relying on a critical posthumanist perspective, we show how healthcare algorithmics is contingent on the way authority over bodies is produced, and how predictive health care algorithms can reproduce the inequalities of the worlds from which they are made, centring possible futures on existing normativities regulated through algorithmic biopower. In contrast, we explore posthuman speculative ethics as a way to challenge prevailing understandings of 'ethics' and 'care' in healthcare algorithmics, and we suggest possible avenues for working speculative ethics into health care while remaining critically attentive to algorithmic modes of management and prediction.
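The claim that predictive health care algorithms can reproduce the inequalities of the worlds from which they are made can be illustrated with a toy simulation. This sketch is not from the article; all data, variable names, and the under-referral penalty term are synthetic assumptions. A classifier is trained on historical referral labels in which one group was referred less often at the same level of need, and the learned model projects that disparity forward.

```python
# Illustrative sketch (not from the article): a toy demonstration of how a
# predictive model trained on historically unequal health-care data can
# reproduce that inequality. All variables and data are synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic patients: `need` is the true care need; `group` encodes
# membership in a historically under-served population.
need = rng.normal(size=n)
group = rng.integers(0, 2, size=n)  # 1 = historically under-served

# Historical labels: past referrals tracked need, but under-served patients
# were referred less often at the same level of need (the encoded inequality).
referred = (need - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

# A model trained on these labels learns the inequality along with the need.
X = np.column_stack([need, group])
model = LogisticRegression().fit(X, referred)

pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted referral rate = {pred[group == g].mean():.2f}")
```

Both groups have identical distributions of need, yet the model's predicted referral rates differ markedly between them: the historical inequality has become a predicted future, which is the dynamic the article names algorithmic biopower.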
Care does not happen in a vacuum, and nursing care is no exception. With this in mind, we (Jess, Jane, Jamie, Brandon, and Eva) partnered with critical posthuman scholars Goda Klumbytė from Kassel University in Germany and Dr. Kay Sidebottom from Stirling University in Scotland for a discussion of care. Goda's research straddles critical algorithm studies, systems design, and feminist theory, drawing these critical perspectives together with applied informatics.