ECJ Strengthens Transparency in Automated Decision-Making
On 27 February 2025, the European Court of Justice (ECJ) issued a landmark ruling in case C-203/22, affirming that organizations using automated decision-making systems, including AI-driven scoring models, must provide data subjects with clear, precise, and comprehensible information about the logic behind such processes. The judgment emphasizes that, under Article 15(1)(h) GDPR, affected individuals have the right to understand how decisions concerning them are reached.
The case concerned the use of algorithmic scoring in a credit assessment context. The ECJ held that the principle of transparency requires more than generic descriptions; data subjects must receive information enabling them to grasp the decisive factors influencing the automated decision. Simply referring to the complexity of algorithms or invoking trade secrets is not sufficient justification to withhold meaningful explanations.
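To make the notion of "decisive factors" concrete, the sketch below shows one way a controller might surface per-feature contributions from a simple scoring model in plain terms. It is an illustration only, not a method endorsed by the court or prescribed by the GDPR; the linear model, feature names, and weights are all hypothetical.

```python
# Illustrative sketch: a hypothetical linear credit-scoring model whose
# per-feature contributions are reported as "decisive factors".
# Feature names and weights are invented and do not reflect any real system.

WEIGHTS = {
    "payment_history_score": 0.45,
    "credit_utilisation": -0.30,
    "account_age_years": 0.15,
    "recent_credit_inquiries": -0.10,
}

def explain_score(applicant: dict) -> list[tuple[str, float]]:
    """Return each feature's contribution to the score, largest impact first."""
    contributions = {
        feature: weight * applicant[feature]
        for feature, weight in WEIGHTS.items()
    }
    return sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)

if __name__ == "__main__":
    applicant = {
        "payment_history_score": 0.8,
        "credit_utilisation": 0.9,
        "account_age_years": 4,
        "recent_credit_inquiries": 3,
    }
    for feature, contribution in explain_score(applicant):
        print(f"{feature}: {contribution:+.2f}")
```

Real scoring systems are rarely this simple, but the ruling's point is that whatever form the explanation takes, it must allow the data subject to see which factors actually drove the outcome rather than a generic description of the model.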
Importantly, the court clarified that business confidentiality cannot override fundamental rights. While companies may protect sensitive know-how, they must still disclose enough information to allow individuals to exercise their GDPR rights, such as contesting a decision or obtaining human intervention under Article 22.
This judgment sets a strong precedent for all organizations deploying AI and scoring technologies within the EU. It signals that transparency obligations reach into the technical design of such systems, compelling companies to strike a careful balance between protecting intellectual property and upholding individuals' informational self-determination.