that influence what information is seen online, result from private endeavour' (UNHRC, 2016b). One area of concern is the self-regulatory measures of these corporate platforms, such as the blocking of allegedly illegal or harmful content. The concern relates particularly to the platforms' means of 'content regulation' and their privacy practices; for example, their day-to-day decisions on which content to remove or leave up, and the extent to which they collect, process, and exchange personal data with third parties. Similar issues arise at the 'lower layers' of the services, where companies in the business of providing internet infrastructure and hardware decide on the protocols and standards that influence what is, and what is not, possible on the internet (Cath & Floridi, 2017). Scholars and civil society groups have warned of a governance gap, in which private actors with strong human rights impacts operate within the soft regime of guidelines and corporate social responsibility, with no legally binding human rights obligations, except where human rights issues have been transposed into national or regional legislation (see, e.g., Callamard, 2018; Laidlaw, 2015). The 2016 European Union (EU, 2016) General Data Protection Regulation, for example, imposes privacy and data protection rules on companies doing business in EU member states.