This article examines the figuration of the home automation device Amazon Echo and its digital assistant Alexa. While most readings of gender and digital assistants foreground the figure of the housewife, I argue that Alexa is instead figured after the domestic servant. I examine commercials, Amazon customer reviews, and reviews from tech commentators to make the case that the Echo is modeled on an idealized image of domestic service. It is my contention that this vision functions in various ways to reproduce a device/user relation that mimics the servant/master relation of nineteenth- and twentieth-century American homes. Significantly, however, the Echo departs from this historical parallel through its aesthetic coding as a native-speaking, educated, white woman. This aestheticization is problematic insofar as it decontextualizes and depoliticizes the historical reality of domestic service. Further, this figuration misrepresents the direction of power between user and device in a way that makes contending with issues such as surveillance and digital labor increasingly difficult.
This commentary uses Paul Gilroy’s controversial claim that new technoscientific processes are instituting an ‘end to race’ as a provocation to discuss the epistemological transformation of race in algorithmic culture. We situate Gilroy’s provocation within the context of an abolitionist agenda against racial thinking, underscoring the relationship between his post-race polemic and a post-visual discourse. We then discuss the challenges of studying race within regimes of computation, which rely on structures that are, for the most part, opaque; in particular, modes of classification that operate through proxies and abstractions and that figure racialized bodies not as single, coherent subjects but as shifting clusters of data. We argue that in this new regime, race emerges as an epiphenomenon of processes of classifying and sorting – what we call ‘racial formations as data formations’. This discussion is significant because it raises new theoretical, methodological and political questions for scholars of media and critical algorithmic studies. It asks: how are we to think about, identify and confront race and racialisation when they vanish into algorithmic systems that are beyond our perception? What becomes of racial formations in post-visual regimes?