Ksenia Ermoshina (PhD, MINES ParisTech) is a postdoctoral researcher and an Associate Researcher at the Citizen Lab, Munk School of Global Affairs, University of Toronto. Her research focuses on information operations within the Russian-Ukrainian armed conflict, including digital threats to journalists and civil society organizations, Internet censorship, and surveillance. Her previous work as a postdoctoral fellow with the NEXTLEAP research project studied the usage and development of end-to-end encrypted messaging and email protocols and clients.

Since the Snowden revelations, encryption of online communications at a large scale and in a usable manner has become a matter of public concern. The most advanced and popular among recently developed encryption protocols is currently the Signal protocol. While the Signal protocol is widely adopted and considered an improvement over previous ones, it remains officially unstandardized, even though an informal draft has been elaborated towards that goal. The analysis of how this protocol was introduced and swiftly adopted by various applications, and of the subsequent transformations of the encrypted messaging ecosystem, sheds light on how a particular period in the history of secure messaging has been marked by a "de facto standardization". What can we learn about existing modes of governance of encryption, and about the histories of traditional standardization bodies, by analyzing the approach of "standardization by running code" adopted by Signal? And finally, how does the Signal protocol challenge a "linear", evolution-based vision of messaging history? Drawing on a three-year qualitative investigation of end-to-end encrypted messaging, from a perspective informed by science and technology studies (STS), we seek to unveil the ensemble of processes that make the Signal protocol a quasi-standard.
Alongside the turning of encryption into a full-fledged political issue, the Snowden revelations catalyzed long-standing debates within the field of secure messaging protocols. The cryptography community (in particular, academic and free software collectives) renewed its efforts to create next-generation secure messaging protocols in order to overcome the limits of existing protocols, such as PGP (Pretty Good Privacy) and OTR (Off-the-Record Messaging). One of the leading motivations behind this effort was to facilitate key exchange and key verification processes1, previously identified as the main obstacles to the mass adoption of encryption (Whitten & Tygar, 1999). As next-generation encryption shapes the ways in which we can securely communicate, exchange, and store content on the Internet, it is important to unveil the recent and less recent history of these protocols and their key applications, to understand how the opportunities and constraints they provide to Internet users came about, and how both developer communities and institutions are working towards making them available to the largest number of users. The most advanced and popular of these n...