GDPR – Article 29 Working Party Guidance on Profiling – RECENTLY ISSUED

27 October 2017


The Article 29 Data Protection Working Party (the “Working Party”) has recently issued guidance on profiling and automated individual decision-making (the “Guidance”) and is accepting comments on the draft Guidance until 28 November 2017. This memo does not provide a comprehensive summary of the Guidance but notes a number of headline points.

Whilst profiling and automated decision-making have many benefits, they can also “pose significant risks for individuals’ rights and freedoms which require appropriate safeguards.”[1] The Guidance seeks to clarify the provisions in the GDPR relating to profiling and automated decision-making.

Article 22: guidance

Subject to the limited exemptions set out in Article 22(2), Article 22(1) of the GDPR provides that: “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”

The Working Party has clarified that, to fall outside the scope of Article 22, human intervention must be “meaningful, rather than just a token gesture”.[2] The Guidance also clarifies that “legal effect” refers to an individual’s statutory or contractual rights, and “similarly significant effects” are those which are “more than trivial” and which are equivalent to or similarly significant to “legal effect”.[3]

Positively, the Working Party has observed that in many “typical” cases, the use of targeted advertising does not have a significant effect on individuals[4] (and therefore falls outside the scope of Article 22). However, the Working Party notes that there may be instances where such advertising could have a significant effect – for example, if it is directed at certain groups in society, such as minority groups or vulnerable adults, or where it results in differential pricing.

The Guidance provides a useful reminder for data controllers of their transparency responsibilities: “given the potential risks and interference that profiling caught by Article 22 poses to the rights of data subjects, data controllers should be particularly mindful of their transparency obligations. This means ensuring that information about the profiling is not only easily accessible for a data subject but that it is brought to their attention.”[5]

The Guidance considers the exemptions in Article 22(2), together with the requirement to establish appropriate safeguards, the detail of which is beyond the scope of this brief memo.

General provisions on profiling and automated decision-making

The Guidance considers the application of a number of principles and issues to profiling and automated decision-making including, for example, the need for a lawful basis for profiling / automated decision-making; the data subject’s right to object to profiling even if there is no automated decision-making; and the requirement for data controllers to provide concise, transparent, intelligible and easily accessible information about data processing.

The Working Party observes that Article 22 makes no distinction as to whether the processing concerns adults or children and that there is therefore no absolute prohibition on this kind of processing in relation to children. However, the Working Party notes that recital 71 of the GDPR does state that solely automated decision-making, including profiling, with legal or similarly significant effects, should not apply to children – and as such the Working Party recommends that, where possible, data controllers should not seek to rely upon the exemptions in Article 22(2) to justify such processing. More generally, the Working Party comments that since “children represent a more vulnerable group of society, organisations should, in general, refrain from profiling them for marketing purposes.”[6]

The Guidance reminds data controllers of the importance of accountability under the GDPR and observes that data controllers may wish to consider undertaking a data protection impact assessment (“DPIA”) to assess the risks involved in automated decision-making, including profiling. A DPIA is a useful way of showing “that suitable measures have been put in place to address those risks and demonstrate compliance with the GDPR.”[7]

[1] Article 29 Data Protection Working Party, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, adopted on 3 October 2017 (the “Guidelines”), Introduction

[2] Guidelines, Chapter II Part A

[3] Guidelines, Chapter II Part B

[4] Guidelines, Chapter II Part B

[5] Guidelines, Chapter II Part D

[6] Guidelines, Chapter IV

[7] Guidelines, Chapter V
