While organizations today make extensive use of complex algorithms, the notion of algorithmic accountability remains an elusive ideal due to the opacity and fluidity of algorithms. In this article, we develop a framework for managing algorithmic accountability that highlights three interrelated dimensions: reputational concerns, engagement strategies, and discourse principles. The framework clarifies: (a) that accountability processes for algorithms are driven by reputational concerns about the epistemic setup, opacity, and outcomes of algorithms; (b) that the way in which organizations practically engage with emergent expectations about algorithms […]

This is a preprint of an article that is in press at the Journal of Business Ethics.
Algorithmic Black Boxes as a Challenge for Media Studies

The primary source of the suspicion with which the rise of, and subsequent dependency on, software as a research instrument in the humanities is met is that one does not know what the machine does. In many cases, 'machine' means algorithm. Algorithmic black boxes have become so widespread that this objection could already be voiced as soon as a researcher uses Google. In digital methods and beyond, there is a dominant tendency for research processes to depend upon algorithmic black boxes, which even theoretically cannot be 'opened' (Bucher 2012). Kate Crawford speaks in this context of the 'disappointingly limited calls for algorithmic "transparency", which seem doomed to fail' (2016: 11).

The dependency on algorithmic black boxes has been addressed as a problem for research practices by Bernhard Rieder and Theo Röhle (2012). They have called 'black-boxing' one of the major challenges for digital methods, and they continue their pursuit of a solution along the same lines in this volume. They delineate this technical black-boxing as a matter of accessibility (as in the case of 'the' Google algorithm or countless other proprietary algorithms) and code literacy (cf. ibid.: 76), but also as not understandable on a 'more abstract' level, because 'the results they produce cannot be easily mapped back to the algorithms and the data they process' (ibid.). Still, Rieder and Röhle propose that this should not keep us from using such tools, since there is a workaround: 'to use different tools from the same category whenever possible in order to avoid limiting ourselves to a specific perspective' (ibid.: 77). Different algorithms would bring different aspects of a data set to the fore when one experiments with them, switches between them, and so on. What Rieder and Röhle have proposed, and continue to pursue in this volume with their focus on the 'bizarre amount of knowledge we have stuffed into our tools' (Rieder & Röhle in this book), are ways to minimise the size of black boxes by enlightening formerly black parts.

With this article, however, we would like to draw attention to an approach from a different direction. Instead of focusing on how to gain positive […]
Much research has been conducted on how social media platforms are used as outlets of taste expression, displaying cultural preferences acquired outside the platforms. This research largely builds on the cultural sociology of Pierre Bourdieu and his analysis of taste as a medium of social distinction. We propose to shift the emphasis from the study of taste expression to an analysis of taste making on social media. This shift is occasioned by broader cultural transformations since the 1990s as well as by developments on social media since the late 2000s. Rather than merely performing a taste learned elsewhere, users cooperatively develop sensitivities on social media platforms through practices of joint observation, evaluation, and distinction. We call this the triangle of taste, in which subjects, objects, and media co-produce one another.