Editorial of March 2024

By Alessandra Silveira
▪

On inferred personal data and the difficulties of EU law in dealing with this matter

The right not to be subject to automated decisions was considered for the first time before the Court of Justice of the European Union (CJEU) in the recent SCHUFA judgment. Article 22 GDPR (on individual decisions based solely on automated processing, including profiling) has always raised many doubts among legal scholars:[1] i) what is a decision taken “solely” on the basis of automated processing?; ii) does this Article provide for a right or, rather, a general prohibition whose application does not require the data subject to actively invoke a right?; iii) to what extent does such an automated decision produce legal effects or similarly significantly affect the data subject?; iv) do the provisions of Article 22 GDPR apply only where there is no relevant human intervention in the decision-making process?; v) if a human being examines and weighs other factors when making the final decision, is it then no longer made “solely” on the basis of the automated processing [and, in that situation, does the prohibition in Article 22(1) GDPR not apply]?

To these doubts a German court has added a few more. SCHUFA is a private company under German law which provides its contractual partners with information on the creditworthiness of third parties, in particular consumers. To that end, it uses mathematical and statistical procedures to establish a prognosis on the probability of a future behaviour of a person (‘score’), such as the repayment of a loan, based on certain characteristics of that person. The establishment of scores (‘scoring’) is based on the assumption that, by assigning a person to a group of other persons with comparable characteristics who have behaved in a certain way, similar behaviour can be predicted.[2]
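To make the logic of such group-based scoring more concrete, the following is a minimal, hypothetical sketch in Python. SCHUFA’s actual method is a trade secret, so every attribute, figure and threshold below is invented for illustration only: an applicant is assigned to a group of persons with comparable characteristics, and the group’s observed repayment rate is returned as that applicant’s ‘score’.

```python
# Hypothetical illustration of group-based credit scoring: a person is assigned
# to a group of others with comparable characteristics, and the group's observed
# repayment rate is used as that person's predicted probability ("score").
# This is NOT SCHUFA's actual (undisclosed) method; all names and data are invented.
from dataclasses import dataclass


@dataclass(frozen=True)
class Profile:
    age_band: str     # e.g. "30-39"
    income_band: str  # e.g. "medium"
    region: str       # e.g. "urban"


# Invented historical observations: (profile, loan repaid?)
history = [
    (Profile("30-39", "medium", "urban"), True),
    (Profile("30-39", "medium", "urban"), False),
    (Profile("30-39", "medium", "urban"), True),
    (Profile("20-29", "low", "rural"), False),
]


def score(applicant: Profile) -> float:
    """Predicted probability of repayment = repayment rate of the comparable group."""
    group = [repaid for profile, repaid in history if profile == applicant]
    if not group:
        return 0.5  # no comparable group: fall back to an uninformative prior
    return sum(group) / len(group)


print(score(Profile("30-39", "medium", "urban")))  # 0.666... for this toy data
```

The point of the sketch is that the ‘score’ says nothing about what the individual has actually done; it merely projects onto her the past behaviour of a statistically similar group.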

SCHUFA provided a financial entity with a score for the applicant OQ, which served as the basis for refusing to grant the credit for which the latter had applied. That citizen subsequently requested SCHUFA to erase the entry concerning her and to grant her access to the corresponding data. However, SCHUFA merely informed her of the relevant score and, in general terms, of the principles underlying the method of calculating the score, without informing her of the specific data included in that calculation, or of the relevance attributed to them in that context, arguing that the method of calculation was a trade secret.

However, according to the referring court, it is ultimately the credit score established by credit information agencies that actually decides whether and how a financial entity/bank enters into a contract with the data subject. The referring court proceeds on the assumption that the establishment of a score by a credit information agency does not merely serve to prepare that bank’s decision, but constitutes an independent “decision” within the meaning of Article 22(1) of the GDPR.[3]

As we have highlighted in this blog,[4] this case law is particularly relevant because profiling is often used to make predictions about individuals. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw inferences or predictions from that classification – whether about their ability to perform a task, their interests or presumed behaviour, and so on. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of concepts, characteristics, or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others.

In the SCHUFA case the CJEU was called upon to clarify the scope of the regulatory powers that certain provisions of the GDPR bestow on the national legislature, namely the exception to the prohibition in Article 22(2)(b) GDPR – according to which the prohibition does not apply if the decision is authorised by European Union or Member State law to which the controller is subject. This is relevant because, if Article 22(1) GDPR were interpreted as meaning that the establishment of a score by a credit information agency is an independent decision within the meaning of that provision, that activity would be subject to the prohibition of automated individual decisions. Consequently, it would require a legal basis under Member State law within the meaning of Article 22(2)(b) of the GDPR.

So, what is new about this ruling? Firstly, the CJEU ruled that Article 22(1) of the GDPR provides for a prohibition tout court, whose violation does not need to be invoked individually by the data subject. In other words, this provision rules out the possibility of the data subject being the object of a decision taken solely on the basis of automated processing, including profiling, and clarifies that active behaviour by the data subject is not necessary to make this prohibition effective.[5] In any case, the prohibition does not apply when the conditions established under Article 22(2) and Recital 71 of the GDPR are met. That is to say, the adoption of a decision based solely on automated processing is authorised only in the cases referred to in Article 22(2), namely when: i) it is necessary for entering into, or performance of, a contract between the data subject and a data controller [point (a)]; ii) it is authorised by Union or Member State law to which the controller is subject [point (b)]; or iii) it is based on the data subject’s explicit consent [point (c)].[6]

Secondly, the CJEU clarified the extent to which Member State law may establish exceptions to the prohibition under Article 22(2)(b) of the GDPR. According to the CJEU, it follows from the very wording of this provision that national law authorising the adoption of an automated individual decision must provide for appropriate measures to safeguard the rights and freedoms and the legitimate interests of the data subject. In light of Recital 71 of the GDPR, such measures should include the use of appropriate mathematical or statistical procedures for the profiling, the implementation of technical and organisational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimised, and the securing of personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons. The SCHUFA case also made clear that the data subject has the right to i) obtain human intervention; ii) express his or her point of view; and iii) challenge the decision. The CJEU has thus dispelled any doubts as to whether the national legislator is bound by the rights provided for in Article 22(3) of the GDPR, despite the somewhat equivocal wording of this provision, which textually refers only to Article 22(2)(a) and (c), seemingly excluding Member States from that obligation.[7] The CJEU also added that Member States may not adopt, under Article 22(2)(b) of the GDPR, rules that authorise profiling in violation of the principles and legal bases imposed by Articles 5 and 6 of the GDPR, as interpreted by CJEU case law.[8]

Finally, the CJEU recognised the broad scope of the concept of “decision” within the meaning of the GDPR, ruling that a profile may in itself be an exclusively automated decision within the meaning of Article 22 of the GDPR. The CJEU explained that there would be a risk of circumventing Article 22 of the GDPR and, consequently, a lacuna in legal protection if a restrictive interpretation of that provision were adopted, according to which the establishment of the probability value must be considered only as a preparatory act and only the act adopted by the third party can, where appropriate, be classified as a “decision” within the meaning of Article 22(1).[9] Indeed, in that situation, the establishment of a probability value such as that at issue in the main proceedings would escape the specific requirements provided for in Article 22(2) to (4) of the GDPR, even though that procedure is based on automated processing and produces effects significantly affecting the data subject, to the extent that the action of the third party to whom that probability value is transmitted draws strongly on it. It would also mean that the data subject could not assert, against the credit information agency which establishes the probability value concerning him or her, his or her right of access to the specific information referred to in Article 15(1)(h) of the GDPR, in the absence of automated decision-making by that company. Even assuming that the act adopted by the third party falls within the scope of Article 22(1) in so far as it fulfils the conditions for application of that provision, that third party would not be able to provide that specific information because it generally does not have it.[10]

In short, the fact that the determination of a probability value is covered by Article 22(1) of the GDPR results in its prohibition, unless one of the exceptions set out in Article 22(2) of the GDPR applies – including authorisation by the law of the Member State, a possibility which the CJEU has interpreted restrictively – and the specific requirements set out in Article 22(3) and (4) of the GDPR are complied with.

However, the CJEU’s decision in SCHUFA still leaves many questions without a clear response. Considering the specific request for a preliminary ruling, the CJEU answered that Article 22(1) of the GDPR must be interpreted as meaning that the automated establishment, by a credit information agency, of a probability value based on personal data relating to a person and concerning his or her ability to meet payment commitments in the future constitutes “automated individual decision-making” within the meaning of that provision, where a third party, to which that probability value is transmitted, draws strongly on that probability value to establish, implement or terminate a contractual relationship with that person (our italics).[11]

Despite the fact that the CJEU’s answer results from the specific wording of the question referred for a preliminary ruling – as written by the national judge, who is the “master” of the referral – the question remains as to the scope of that answer. Did the CJEU perhaps admit that profiling is, in itself, an exclusively automated decision – and, in principle, prohibited – but only when the probability value is decisive for the decision on the contractual relationship? Would this not confirm the idea, rejected by the CJEU in paragraph 61 of the SCHUFA judgment, that the determination of the probability value would be a simple preparatory act? And if the probability value is not decisive for the decision on the contractual relationship, does the prohibition in Article 22 of the GDPR then no longer apply?

As we have previously argued on this blog, the problem should be seen as profiling itself, regardless of whether or not it is decisive for the decision of a third party. When profiling produces legal effects or similarly significantly affects a data subject, it should be seen as an automated decision in accordance with Article 22 of the GDPR. The purpose of Article 22 of the GDPR is to protect individuals against specific risks to their rights and freedoms arising from the automated processing of personal data, including profiling – as the CJEU explains in paragraph 57 of the judgment in question. And that processing involves, as is clear from Recital 71 of the GDPR, the assessment of personal aspects relating to the natural person affected by that processing, in particular the analysis and prediction of aspects relating to that person’s work performance, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements – as the CJEU rightly explains in paragraph 58 of the judgment in question.

It is important to remember that profiling always includes inferences and predictions about the individual, regardless of whether a third party then adopts automated individual decisions based on that profiling. To create a profile it is necessary to go through three distinct phases: i) data collection; ii) automated analysis to identify correlations; and iii) application of those correlations to an individual to identify present or future behavioural characteristics (a minimal illustration follows below). Should automated individual decisions based on profiling follow, these would also be subject to the GDPR – whether exclusively automated or not. That is, profiling is not limited to the mere categorisation of the individual, but also includes inferences and predictions about the individual. However, the effectiveness of the application of the GDPR to inferred data faces several obstacles. This has to do with the fact that the GDPR was designed for data provided directly by the data subject – and not for data inferred by digital technologies such as AI systems. This is the difficulty behind this judgment.
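A minimal sketch of those three phases, in Python and with entirely invented data and simplistic logic, may help to visualise how inferred personal data arises: the characteristic produced in phase iii) is never provided by the data subject and exists only as a product of the profiling.

```python
# A minimal sketch of the three phases of profiling described above, with invented
# data and deliberately simplistic logic; real profiling systems are far more complex.
from statistics import correlation  # requires Python 3.10+

# Phase i) data collection: attributes provided by or observed about individuals.
collected = {
    "ana":   {"late_payments": 0, "online_hours": 2.0},
    "bruno": {"late_payments": 3, "online_hours": 6.0},
    "carla": {"late_payments": 1, "online_hours": 3.0},
}

# Phase ii) automated analysis: identify a correlation between two attributes.
hours = [p["online_hours"] for p in collected.values()]
late = [p["late_payments"] for p in collected.values()]
r = correlation(hours, late)


# Phase iii) application: project the correlation onto a new individual to infer
# a characteristic he or she never provided (inferred personal data).
def infer_risk(online_hours: float) -> str:
    return "higher risk" if r > 0 and online_hours > 4 else "lower risk"


print(round(r, 2), infer_risk(5.5))  # the inference is a prediction, not a recorded fact
```

The inferred label in phase iii) is precisely the kind of data the editorial refers to: it describes the person, yet was never supplied by the person, which is why its protection under a Regulation designed around provided data is so difficult.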


[1] See Alessandra Silveira, Profiling and cybersecurity: a perspective from fundamental rights’ protection in the EU, in Francisco Andrade/Joana Abreu/Pedro Freitas (eds.), “Legal developments on cybersecurity and related fields”, Springer International Publishing, Cham/Switzerland, 2024.

[2] See Judgment SCHUFA, paragraph 14.

[3] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 23.

[4] See Alessandra Silveira, Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling), https://officialblogofunio.com/2023/04/10/finally-the-ecj-is-interpreting-article-22-gdpr-on-individual-decisions-based-solely-on-automated-processing-including-profiling/

[5] See Judgment SCHUFA, paragraph 52.

[6] See Judgment SCHUFA, paragraph 53.

[7] See Judgment SCHUFA, paragraphs 65 and 66.

[8] See Judgment SCHUFA, paragraph 68. See also the CJEU decision in Joined Cases C‑26/22 and C‑64/22.

[9] See Judgment SCHUFA, paragraph 61.

[10] See Judgment SCHUFA, paragraph 63.

[11] See Judgment SCHUFA, paragraph 73.

Picture credits: Photo by Pixabay on Pexels.com.

 
Author: UNIO-EU Law Journal (Source: https://officialblogofunio.com/2024/03/19/editorial-of-march-2024/)