Ph.D. (c.) in Private Law 'Humanities and Technology'
LIVIA AULINO, Ph.D. in Private Law 'Humanities and Technology'
DAVIDE SILVIO D'ALOIA, Ph.D. (s.) in Private Law 'Humanities and Technology'
LUIGI IZZO, Ph.D. (s.) in Private Law 'Humanities and Technology'
The data sets, the processes that determine an algorithmic decision, and the rationale behind an automated decision affecting the legal sphere of a natural person should be traceable, transparent, and explained, not least to enable the individual affected to challenge the contents of an unfair decision. In practice, they rarely are: either by choice (for reasons of competition or of protection of know-how) or because of technological limitations. The latter is the case of those algorithms aptly referred to as 'black boxes': systems whose inferential mechanisms are not (completely) predictable ex ante or which, in any case, do not always make it possible to explain why an automated decision-making model has generated a particular outcome (and what combination of factors contributed to it). Having affirmed the existence of an ethical duty of algorithmic transparency and of explanation of the (individual) decision reached by automated means, this Paper asks whether a corresponding right exists at the level of positive law, and what its limits are, of both a legal and a technological nature. Critically drawing on the most important scholarly opinions on the subject, the right to explanation of automated decision-making is identified, in the context of the GDPR, in the right of access under Article 15 of the Regulation.