This series provides more detailed insight into the General Data Protection Regulation ("GDPR"), which was published on 4 May 2016 and became applicable on 25 May 2018.
This issue focuses on the rules on automated individual decision-making, taking into account the Article 29 Working Party Guidelines.
Automated individual decision-making and profiling can provide substantial benefits to society and individuals, especially in fields such as medicine and healthcare. However, they also pose significant risks and therefore require appropriate safeguards. By regulating automated decision-making and profiling, the GDPR aims to ensure adequate protection in this respect. It remains to be seen whether this objective will be achieved, as there is still much uncertainty surrounding the interpretation of the relevant provision (Article 22 GDPR). Furthermore, it may prove impossible in practice to comply with the rules, or the rules may turn out not to offer appropriate protection.
1. Definition of automated individual decision-making and profiling
1.1 Automated individual decision-making
Automated individual decision-making is not defined in the GDPR. It refers to the taking of decisions solely by technological means, without human involvement.
Example - credit scores
Credit scores indicate the creditworthiness of borrowers. Banks and other lenders use credit scores to decide whether to extend or deny credit and to determine the terms and conditions of loans, including the interest rate.
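To make the example concrete, the following is a minimal, purely illustrative sketch of such a fully automated decision: a loan application is approved or denied on the basis of a computed score alone, with no human involvement. The scoring rule, the threshold and the field names are all invented for illustration and do not reflect any real lender's methodology.

```python
# Hypothetical fully automated credit decision: the outcome depends
# solely on a computed score, with no human involvement at any step.

def credit_score(income: float, debts: float, missed_payments: int) -> int:
    """Toy scoring rule: higher income and fewer debts or missed
    payments yield a higher score, clamped to the 300-850 range used
    by many scoring systems."""
    score = 600 + income / 1000 - debts / 500 - 40 * missed_payments
    return max(300, min(850, int(score)))

def automated_decision(score: int, threshold: int = 640) -> str:
    """A decision taken solely by technological means."""
    return "approve" if score >= threshold else "deny"

applicant = {"income": 45_000, "debts": 12_000, "missed_payments": 1}
score = credit_score(**applicant)
print(score, automated_decision(score))  # prints "581 deny"
```

Because no person reviews the outcome before it takes effect, a decision of this kind would fall within the scope of Article 22 GDPR if it produces legal or similarly significant effects.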
1.2 Profiling
Profiling is defined as any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
Example - targeted or behavioural advertising
Tracking cookies are used to gather information about the visitors to a website and their surfing behaviour, from which information about their interests, characteristics and behaviour patterns can be derived. Visitors can then be placed into certain categories or groups and companies can customize their advertising strategy based on this categorization.
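The categorization described above can be illustrated with a short, hypothetical sketch: page-visit data gathered from a visitor is used to infer their interests and place them into an advertising segment. The segments, topic lists and matching rule are invented for illustration only.

```python
# Illustrative profiling step: derive an advertising segment from a
# visitor's browsing behaviour. Segments and topics are invented.

from collections import Counter

SEGMENT_RULES = {
    "sports": {"sports", "fitness", "football"},
    "travel": {"flights", "hotels", "destinations"},
    "finance": {"loans", "investing", "insurance"},
}

def profile_visitor(pages_visited: list) -> str:
    """Assign the visitor to the segment whose topics they visited
    most often; fall back to a generic segment if nothing matches."""
    counts = Counter()
    for page in pages_visited:
        for segment, topics in SEGMENT_RULES.items():
            if page in topics:
                counts[segment] += 1
    return counts.most_common(1)[0][0] if counts else "general"

print(profile_visitor(["flights", "hotels", "sports"]))  # prints "travel"
```

Note that this categorization is profiling within the meaning of the definition above, even if no automated decision is subsequently taken on its basis.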
Even though automated decision-making often entails profiling and vice versa, the two concepts are not inseparable. Automated decisions can be taken without profiling and, likewise, profiling does not necessarily result in an automated decision.
2. Rights of data subjects with regard to automated decision-making
As a general rule, a data subject has "the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her".
This wording requires further examination.
2.1 The right not to be subject to automated decision-making
With regard to the scope of the right, the Article 29 Working Party states that "as a rule, there is a prohibition on fully automated individual decision-making, including profiling that has a legal or similarly significant effect." However, there are certain exceptions to this rule, namely consent, where the decision is necessary to enter into or perform a contract, or where the decision is authorised by EU or Member State law.
The question arises as to whether Article 22 GDPR should be read as a general prohibition on automated decision-making or rather as a right for the data subject to opt out. In the former case, it is not clear why the European legislature did not specifically state that automated decision-making producing legal or similarly significant effects is prohibited, except in certain specific cases. The interpretation of this wording will undoubtedly lead to further discussion and hence uncertainty, which does not benefit data subjects or controllers that wish to engage in this type of processing.
2.2 A decision based solely on automated processing
A decision based solely on automated processing is one without any human involvement. In other words, a decision taken by algorithms. The Article 29 Working Party takes the view that controllers cannot circumvent the prohibition set out in Article 22 GDPR "(...) by fabricating human involvement. To qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture."
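One way such meaningful human intervention might be operationalised in practice, sketched here purely as an assumption rather than a prescribed method, is to route borderline or contested decisions to a human reviewer instead of issuing them automatically. The scores, thresholds and parameter names below are invented for illustration.

```python
# Hypothetical routing logic: escalate to a human reviewer when the
# model is not clearly decisive or the data subject has contested the
# outcome; otherwise decide automatically.

def route_decision(score: float, threshold: float = 0.5,
                   margin: float = 0.1, contested: bool = False) -> str:
    """Return 'human_review' for borderline or contested cases; an
    automatic 'approve'/'deny' only when the case is clear-cut."""
    if contested or abs(score - threshold) < margin:
        return "human_review"
    return "approve" if score > threshold else "deny"

print(route_decision(0.82))                   # prints "approve"
print(route_decision(0.55))                   # prints "human_review"
print(route_decision(0.82, contested=True))   # prints "human_review"
```

Whether such a scheme amounts to "meaningful" oversight would of course depend on the reviewer having real authority and competence to change the decision, not merely on the routing itself.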
2.3 Which produces legal effects
A legal effect means an effect on a person's legal rights (e.g. the right to vote or to take legal action), legal status (e.g. eligibility for income support) or contractual rights.
2.4 Which similarly significantly affects him or her
Neither the GDPR nor the Article 29 Working Party Guidelines explain what this wording means. In order to determine whether a given processing activity significantly affects a data subject, a case-by-case assessment must be made and preferably documented.
3. Cases where automated decision-making is allowed
By way of exception, the GDPR provides that decisions based solely on automated processing producing legal or similarly significant effects are allowed if the decision:
- is necessary to enter into or perform a contract between the data subject and the data controller;
- is authorised by EU or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or
- is based on the data subject's express consent.
In addition, automated decision-making may not be based on special categories of personal data, unless the data subject has expressly consented or the processing is necessary for reasons of substantial public interest, which must be proportionate to the aim pursued and respect the essence of the right to data protection.
In any case, the data controller must take suitable measures to safeguard the data subject's rights, freedoms and legitimate interests.
4. Obligations of the Controller
4.1 Duty to inform
To ensure fair and transparent processing, data subjects must be informed of the processing activities and whether they entail automated decision-making.
The data controller shall, at the time when personal data are obtained, inform the data subject of the existence of automated decision-making, including profiling. At the same time, the data controller shall provide meaningful information about the logic involved, as well as the significance and envisaged consequences of the processing for the data subject.
Data subjects have the right to request access to their personal data used for automated decision-making, including meaningful information about the logic involved and the significance and envisaged consequences of the processing. Access can be requested at any time.
However, it is not always easy to explain the rationale behind automated decision-making, as most algorithms are not particularly transparent or predictable. Algorithms are indeed sometimes described as black boxes: even with knowledge of an algorithm's inputs and outputs, explaining how the former lead to the latter may prove difficult.
4.2 Implementation of safeguards
The data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.
There is nonetheless some uncertainty about the effectiveness of such safeguards, as human beings do not necessarily understand the reasoning behind algorithms. Furthermore, as machines become increasingly intelligent and trustworthy (more so than humans, some would argue), questions can be raised with respect to the desirability of having humans overrule decisions taken by machines. Time will tell whether human involvement genuinely contributes to a higher level of protection for data subjects.
4.3 Implementation of regular assessments
The data controller shall carry out regular assessments of the data sets it processes in order to identify and eliminate potential biases. Data controllers shall also implement appropriate procedures and measures to avoid errors and ensure the accuracy of automated decision-making.
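By way of illustration only, one simple check such an assessment might include is comparing outcome rates across groups in a decision log and flagging large disparities. The data, the group labels and the 0.8 ("four-fifths") disparity threshold below are assumptions chosen for the example, not requirements of the GDPR.

```python
# Illustrative bias check on a log of automated decisions: compute
# approval rates per group and flag a disparity when the worst-off
# group's rate falls below a chosen fraction of the best-off group's.

from collections import defaultdict

def approval_rates(decisions: list) -> dict:
    """decisions: (group, approved) pairs from a decision log."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def disparity_flag(rates: dict, ratio: float = 0.8) -> bool:
    """Flag when min rate < ratio * max rate across groups."""
    return min(rates.values()) < ratio * max(rates.values())

log = [("A", True), ("A", True), ("A", False),
       ("B", True), ("B", False), ("B", False)]
rates = approval_rates(log)
print(rates, disparity_flag(rates))  # group A ≈ 0.67, group B ≈ 0.33 → True
```

A flagged disparity would not in itself prove unlawful bias, but it would indicate that the data set or model warrants closer examination as part of the controller's assessment.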
• Recitals 60, 63, 70 and 71
• Articles 4, 13, 14, 15 and 22