Italy, Council of State, 8472/2019, supreme administrative instance, 13 December 2019
Member State
Italy
Topic
Use of algorithms and artificial intelligence in administrative procedures
Sector
Judicial Interaction Techniques
Deciding Court Original Language
Consiglio di Stato
Deciding Court English translation
Council of State
Date Decision
13 December 2019
National Follow Up Of (when relevant)
Not a direct follow up
EU legal sources and CJEU jurisprudence
Article 42 of the Charter, Articles 13, 14, 15, 22(1) of Regulation 2016/679 (GDPR), Directive 95/46/EC, European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL))
Subject Matter
Appeal before the Consiglio di Stato against a decision of the TAR Lazio finding the national mobility procedure for teachers hired under Law No. 107/2015 unlawful. The Council of State confirms the decision of the Regional Administrative Tribunal, albeit on different grounds.
Legal issue(s)
Adoption of an extraordinary recruitment plan for teachers under Law No. 107/2015. The mobility procedure relied on an undisclosed algorithm that failed to consider teachers' preferences, leading to unfair transfers despite available positions in their preferred locations.
Request for expedited/PPU procedures
No
National Law Sources
Article 97 of the Italian Constitution, Law No. 107/2015, Ministerial Ordinance No. 241/2016, Law No. 241/1990
Facts of the case
The case concerns an appeal filed by the Ministry of Education against a ruling of the Regional Administrative Tribunal (TAR). A group of teachers appointed during “Phase C” of a special hiring plan under Law No. 107/2015 challenged the national mobility procedure implemented through Ministerial Ordinance No. 241/2016. The TAR Lazio ruled in favor of the teachers on two grounds. First, there was no mechanism allowing for exceptions to the five-year tenure requirement for special education teachers, which prevented them from participating in the mobility plan. Second, the mobility procedure relied on an undisclosed algorithm that failed to consider teachers’ preferences, leading to unfair transfers despite available positions in their preferred locations. The Ministry contested the ruling, alleging procedural defects, claiming that the algorithm merely implemented existing policies without requiring prior notification, and arguing that there was no disparity in treatment among teachers from different mobility phases.
Reasoning (role of the Charter or other EU, ECHR related legal basis)
The most significant aspect of the ruling was the examination of the use of algorithms in public administration. The Council of State recognized that, while algorithms can enhance efficiency and neutrality in decision-making, they also raise concerns about transparency and accountability.
First, the Council of State stressed that the use of algorithms responds to the principles of efficiency and cost-effectiveness in administrative action (Article 1 of Law 241/90), which, according to the constitutional principle of good administration (Article 97 of the Constitution), require public administration to achieve its goals with the least expenditure of means and resources. In the case at hand, the use of an automated procedure leading directly to the final decision should not be stigmatized but rather, in general, encouraged. It offers numerous advantages, such as a significant reduction in procedural timelines for purely repetitive tasks that lack discretion, the exclusion of interference due to negligence on the part of the official, and the resulting greater guarantee of impartiality in the automated decision-making process.
In addition, the use of computerized procedures cannot be seen as a means to bypass the principles that govern the Italian legal system and regulate administrative activities. Algorithms should be considered as organizational tools and procedural instruments, subject to the checks typical of any administrative procedure. These procedures remain part of the authoritative decision-making process, which must be based on the legislation that grants power and defines the objectives of the public authority. There are no reasons to limit the use of such tools to strictly bound administrative tasks rather than discretionary ones, as both are forms of authoritative action aimed at serving the public interest. Although automated tools may be more easily applied to non-discretionary administrative actions, there is no reason why they cannot also be used for discretionary actions, particularly those of a technical nature.
That said, the Council of State stressed that two key aspects are crucial as minimum guarantees when using algorithms in public decision-making, also in light of EU law. First, the full transparency of the algorithm, and, second, the accountability of the decision to the authority in charge, which must ensure the logical consistency and legality of the decision and its outcomes as determined by the algorithm.
As regards the principle of transparency, it is paramount both for the public authority using the algorithm and for those affected by its decisions. The algorithm must be identifiable in a detailed and accessible way, including information on its creators, the process of its development, the decision-making mechanism, the priorities set in the evaluation process, and the data selected as relevant. This transparency allows for verification that the algorithm’s criteria, assumptions, and outcomes align with the legal requirements and the objectives set by the law or the administration. Additionally, despite the multidisciplinary nature of algorithms, the technical formula must be accompanied by explanations that translate it into the underlying legal rule, thus making it comprehensible. For the persons involved, there is also the issue of the processing of personal data. The Council of State stresses that Articles 13 and 14 of the GDPR require that individuals be informed if their data is processed through automated decision-making, whether the data is collected directly or indirectly. When the process is entirely automated, the data controller must also provide meaningful information about the logic used and the potential consequences of the decision for the individual. In comparison with Directive 95/46/EC, the GDPR thus strengthens the principle of transparency. Article 15 GDPR further protects the individual’s right to access information about the existence of automated decision-making processes. Unlike Articles 13 and 14, Article 15 gives individuals the right to request information even if the decision-making process has already started or concluded, without being bound by the time limits set in the other articles. This highlights the importance of transparency for individuals involved in automated administrative processes, at both the data-processing and decision-making stages.
As regards the accountability of the decision, the Council of State stresses that a downstream verification must be guaranteed, in terms of the logical consistency and correctness of the results. This ensures the attribution of the decision to the authority holding the power, identified according to the principle of legality, as well as the subsequent identification of the responsible party, both in the interest of the public administration itself and of the individuals affected by the administrative action entrusted to the algorithm. In this context, the GDPR introduces an additional safeguard by explicitly limiting the use of fully automated decision-making processes. Article 22(1) grants individuals the right not to be subjected to decisions based solely on automated processing, particularly when such decisions produce legal effects or similarly significant effects on them. This ensures that there is always a responsible authority that can assess the legitimacy and rationality of decisions made by an algorithm. The Council of State also refers to the European Parliament’s resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics, which addresses the issue of accountability in the context of robots making autonomous decisions. Therefore, according to the Council of State, in order to apply traditional rules of accountability to damages caused by algorithms, it is necessary to ensure that the final decision can be traced back to the appropriate legal authority.
The Council of State finds confirmation of its statement in three principles of the GDPR. First, the principle of knowability, according to which individuals have the right to know when automated decision-making processes affect them. This applies to both private and public entities and, in the latter case, the GDPR concretizes Article 41 of the Charter. This is complemented by the principle of understandability, meaning that the individual must also be able to understand the logic of the automated decision. The second principle is non-exclusivity, which gives the data subject the right not to be subject to a decision based solely on automated processing that produces legal effects. There must be a human element in the decision-making process that can intervene to validate, challenge, or amend the automated decision. This is often referred to as “human in the loop” (HITL), where human oversight is necessary to ensure the decision’s accuracy and fairness. Third, recital 71 of the GDPR refers to the principle of non-discrimination, according to which, when personal data is used for profiling, the data controller must employ appropriate statistical or mathematical procedures and implement technical and organizational measures to minimize errors and avoid discrimination. Thus, even if the algorithm is transparent and understandable, it must not produce discriminatory effects. Where discrimination is a risk, data inputs must be corrected to avoid biased or unfair outcomes.
In light of that reasoning, the Council of State found that the algorithm used in the case at issue did not adhere to these key principles. The algorithmic process failed to explain why certain teachers’ expectations, based on their individual rankings, were not met. Thus, the law on administrative procedures, which was conceived in an era when the administration was not yet affected by the technological revolution, cannot be indiscriminately applied, as suggested in the TAR ruling, to all administrative activities involving algorithms. In the present case, the administration merely assumed a coincidence between legality and the algorithm’s operations, which instead must always be demonstrated and explained technically. Accordingly, the Council of State upheld the TAR ruling but on a different justification: the impossibility of understanding how the available teaching positions were assigned through the algorithm constitutes a flaw that invalidates the whole procedure.
Relation of the case to the EU Charter
The Council of State refers to the right to good administration under Article 41 of the Charter (although, by mistake, the text quotes Article 42). It held that the principle of knowability under the GDPR is formulated in general terms and is therefore applicable to decisions made by both private and public entities. When the decision is made by a public authority, that norm of the GDPR gives “direct application” to Article 41 of the Charter. The Council of State stressed that this right includes the right to be heard, the right of access to one’s file, and the obligation of the administration to give reasons for its decisions.
Relation between the EU Charter and ECHR
No mention of the ECHR
Use of Judicial Interaction technique(s)
The Council of State assessed the use of automated decision-making instruments in light of the GDPR.
Horizontal Judicial Interaction patterns (Internal – with other national courts, and external – with foreign courts)
N/A
Vertical Judicial Interaction patterns (Internal – with other superior national courts, and external – with European supranational courts)
The Council of State engaged in an in-depth assessment of the TAR judgment under appeal and ultimately confirmed the ruling, dismissing the appeal of the Ministry of Education, albeit on partially different grounds (see Reasoning). The Council of State also referred to some of its previous judgments (No. 2745/2012 and No. 2270/2019). There was no constitutional review involved, nor any citation of the case law of a foreign Constitutional Court or a European Court.
Strategic use of judicial interaction technique (purpose aimed by the national court)
The Council of State directly applied the principles of the GDPR on the use of automated decision-making instruments to the case at issue in order to solve the dispute pending before it.
Impact on Legislation / Policy
N/A
Notes on the national implementation of the preliminary ruling by the referring court
N/A
Did the national court quote case law of the CJEU/ECtHR (in particular cases not already referred to by the CJEU in its decision) or the Explanations?
N/A
Did the national court quote soft law instruments, such as GRECO Reports, Venice Commission, CEPEJ Reports, or CCEJ Reports?
The Council of State quoted the European Parliament’s resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics (see Reasoning).
Did the national court take into account national case law on fundamental rights?
N/A
If the court that issued the preliminary reference is not a last instance court, and the “follow up” was appealed before a higher court, include the information
N/A
Was there a consensus among national courts on how to implement the CJEU's preliminary ruling; and were there divergences between the judiciary and other state powers regarding the implementation of the preliminary ruling?
N/A
Impact on national case law from the same Member State or other Member States
N/A
Connected national caselaw / templates
N/A
Author
Martina Coli, University of Florence (UNIFI)