Why data protection law is uniquely equipped to let us fight a pandemic with personal data


Data protection law is different from “privacy”. We data protection lawyers have been complacent recently and have failed to clarify this, loud and clear, for the general public. Perhaps happy to finally see this field of law take center stage in public debate through the GDPR, we have not stopped anyone from saying that the GDPR is a privacy law.

The truth is, the GDPR is a “data protection” law (it stands for the General “Data Protection” Regulation). And this makes a world of difference these days, when governments, individuals, companies and public health authorities are looking at the collection of personal data and the digital tracking of people as a potentially effective way to stop the spread of the COVID-19 pandemic.

The GDPR is the culmination of about half a century of legislative developments in Europe, which saw data protection evolve from a preoccupation of regional laws, to national laws, to EU laws, to a fundamental right in the EU Charter of Fundamental Rights. A fundamental right (Article 8) which is provided for distinctly from the fundamental right to respect for private and family life (Article 7). What a wondrous distinction!

The right to the protection of personal data was conceived specifically to support societies in facing the reality of massive automation of systems fed with data about individuals. At the very beginning, the introduction of computerized databases in public administration prompted the adoption of detailed safeguards to ensure that the rights of individuals are not breached by the collection and use of their data.

In the following decades, waves of development added layers to those safeguards and shaped data protection law as we know it today: the need for a justification to collect and use personal data; fair information principles like purpose limitation and data minimization; transparency and fairness; control of data subjects over their own data through specific rights like access, correction and deletion; the need for a dedicated, independent supervisory authority to explain and enforce data protection law; accountability of whoever is responsible for the collection and use of personal data.

The right to data protection is procedural in nature. It does have a flavor of substantive protection, which will certainly grow in importance and will likely be developed further in the age of AI and machine learning (I am thinking in particular of fairness), but at its core the right to data protection remains procedural. Data protection sets up specific measures or safeguards that must be implemented to reach its goal in relation to personal data being collected and used.

Importantly, the goal of data protection is to ensure that information relating to individuals is collected and used in such a way that all their other fundamental rights are protected. This includes freedom of speech, the right to private life/privacy, the right to life, the right to security, the right to non-discrimination and so on. Even though I have not seen this spelled out anywhere, I believe it has also been developed to support the rule of law.

This is why data protection is uniquely equipped to let us fight the pandemic using personal data. It has literally been conceived and developed to allow the use of personal data by automated systems in a way that guarantees the rule of law and the respect of all fundamental rights. This might be the golden hour for data protection.

That is, provided its imperatives are applied to any technological or digital response to the COVID-19 pandemic that relies on personal data:

  • The proposed dataflow must be clear, including all the categories of data that will be collected and used.
  • The purpose(s) must be clear, specific, granular and well defined.
  • A lawful ground for processing must be in place.
  • Any solution that necessitates personal data must be built by taking data protection requirements into account from the outset (data protection by design).
  • The web of responsibility must be clear (who are the controllers and the processors?).
  • Personal data must not be shared, or given access to, beyond the defined web of responsibility (formalized, for example, through controller-processor agreements).
  • There must be transparency, in an intelligible way, for the individuals whose personal data are collected.
  • The necessity of collecting each personal data item must be assessed (can the project achieve the same purpose without some of them?).
  • All personal data must be accurate.
  • Individuals must have a way to obtain access to their own data and to ask for correction and, where justified, erasure (as well as to exercise the other rights they have).
  • The security of the data must be ensured.
  • The personal data collected must be retained only for as long as necessary to achieve the purpose (afterwards, it must be deleted; anonymization may be accepted as an alternative to deletion, but there is an ongoing debate about this). A minimal code sketch of such purpose-bound retention follows after this list.
  • Data Protection Impact Assessments (even if loose) should be conducted, and engaging with supervisory authorities to discuss the identified risks that cannot be mitigated could be helpful (and may even be obligatory under certain circumstances).
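To make the storage-limitation and data-minimization points concrete for teams actually building such solutions, here is a minimal, hypothetical Python sketch of purpose-bound retention. The purposes and retention periods are invented purely for illustration; in a real project they would follow from the necessity assessment described above.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Record:
    subject_id: str        # pseudonymous identifier, not a direct identifier
    purpose: str           # the specific, well-defined purpose this record serves
    collected_at: datetime

# Hypothetical retention periods per purpose; real values must come out of
# the necessity assessment conducted for each purpose.
RETENTION = {
    "contact_tracing": timedelta(days=14),
    "aggregate_statistics": timedelta(days=30),
}

def purge_expired(records: list[Record], now: datetime) -> list[Record]:
    """Keep only records still within the retention period of their purpose.

    Records with an unknown purpose get a zero retention period and are
    dropped: data that cannot point to a defined purpose is not kept at all.
    """
    return [
        r for r in records
        if now - r.collected_at <= RETENTION.get(r.purpose, timedelta(0))
    ]
```

The zero-retention default for undeclared purposes is one way to encode purpose limitation as a technical default rather than an afterthought.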

Therefore, the data-based solutions proposed to diminish the effects of the COVID-19 pandemic are not being proposed and accepted in Europe in spite of the GDPR, as the media have been portraying it. It is almost as if data protection has been developing over the past half century to give us the right instruments to face this challenge and preserve our freedoms and our democracies. I hope we will be smart enough to use them properly.

A US Bill from 1974 shares so much DNA with the GDPR, it could be its ancestor

America’s own GDPR was introduced in Congress in 1974. This Bill applied to government and companies alike, restricted international transfers, and offered U.S. and foreign “data subjects” rights to access, erasure and even… explanation.

The U.S. has recently been working towards finally adopting comprehensive privacy and data protection rules, with efforts unfolding at both federal and state level. So far, only Californians can claim they have actually achieved something on the road to protecting their rights impacted by the widespread collection and use of personal information. Other serious efforts are underway in Washington State, but they may end up being undermined by good intentions.

These developments are possible right now due to a combination of the EU General Data Protection Regulation’s (GDPR) global reach and notoriety, the countless privacy scandals affecting Americans, and the absence in the U.S. of comprehensive statutory protections for privacy and other individual rights that may be affected by the collection and use of personal information.

But did you know this is not the first time the U.S. has had privacy law fever? In the late ’60s and early ’70s, American lawmakers were concerned about the rise of automated data processing and computerized databases. Serious efforts were put into analyzing how the rights of the American people could be protected against misuses and abuses of personal information. The Fair Credit Reporting Act was adopted in 1970. An influential Report proposing a set of Fair Information Practice Principles, built on an impressive, meticulous analysis, was published in 1973 by the Department of Health, Education and Welfare (HEW) (read it if you haven’t done so yet; bonus: it’s peppered with smart literary mottos between chapters). The Report called for comprehensive federal privacy legislation applicable both to government and companies.

About six months after the publication of the HEW Report, in January 1974, Bill S.3418 was introduced in the US Senate by three Senators — Ervin, Percy and Muskie, ‘to establish a Federal Privacy Board, to oversee the gathering and disclosure of information concerning individuals, and to provide management systems in all Federal agencies, State and local governments, and other organizations’.

This Bill was clearly ahead of its time and has aged astoundingly well, especially when compared with some of the key characteristics of the GDPR, the current global gold standard for comprehensive data protection law:

It applied to both public and private sectors, at federal and state level

The Bill had a very broad scope of application. It covered the activity of “organizations”, defined as any Federal agency; the government of the District of Columbia; any authority of any State, local government, or other jurisdiction; and any public or private entity engaged in business for profit. It exempted from its rules only information systems of Federal agencies that were vital to national defense, criminal investigatory files of Federal, State or local law enforcement, and any information maintained by the press or news media, except for information related to their employees.

It created a Federal Privacy Board to oversee its application

The Federal Privacy Board would have been created as part of the Executive branch, composed of five members appointed by the President with the approval of the Senate, for a three-year mandate. The Board would have been granted effective powers to investigate violations of the law, including by being granted admission to the premises where any information system or computers are kept; to recommend either criminal or civil penalties; and to actually order any organization found in breach of the law ’to cease and desist such violation’.

It equally protected the rights of Americans and foreigners as data subjects

It’s quite difficult to believe (especially in the context of the endless Transatlantic debates that ultimately led to the Judicial Redress Act), but this Bill explicitly protected “any data subject of a foreign nationality, whether residing in the United States or not” by requiring organizations to afford them “the same rights under this Act as are afforded to citizens in the United States”. Such a broad personal scope has been a characteristic of the European data protection law framework even before the GDPR. It is also what made possible the legal challenges brought in the UK against Cambridge Analytica by David Carroll, a U.S. citizen residing in New York.

It provided restrictions for international data transfers to jurisdictions which did not apply the protections enshrined in the Bill

Under this Bill, organizations were required to “transfer no personal information beyond the jurisdiction of the United States without specific authorization from the data subject or pursuant to a treaty or executive agreement in force guaranteeing that any foreign government or organization receiving personal information will comply with the applicable provisions of this Act with respect to such information”. The idea of restricting transfers of personal data to countries which do not ensure a similar level of protection is a staple of the EU data protection law regime and the source of some of the biggest EU-US tensions related to tech and data governance.

It provided for rights of access to, correction of, and “purging” of personal information. And for notification of purging to former recipients!

The Bill provided for an extensive right of access to one’s own personal information. It required organizations to grant data subjects “the right to inspect, in a form comprehensible” all personal information related to them, the nature of the sources of the information and the recipients of the personal information. In addition, it also granted individuals the right to challenge and correct information. As part of this right to challenge and correct information, the Bill even provided for a kind of “right to be forgotten”, since it asked organizations to “purge any such information that is found to be incomplete, inaccurate, not pertinent, not timely nor necessary to be retained, or can no longer be verified”. Moreover, the Bill also required organizations to “furnish to past recipients of such information notification that the item has been purged or corrected” at the request of the data subject.

It provided for transparency rights into statistical models and for receiving some explanation

The same provision granting a right to challenge and correct personal information referred also to individuals wishing “to explain” information about them in information systems, but it is not clear how exactly organizations were expected to respond to explanation requests. Elsewhere in the Bill, organizations “maintaining an information system that disseminates statistical reports or research findings based on personal information drawn from the system, or from systems of other organizations” were required to “make available to any data subject (without revealing trade secrets) methodology and materials necessary to validate statistical analyses” (!). Moreover, those organizations were also asked not to make information available for independent analysis “without guarantees that no personal information will be used in a way that might prejudice judgments about any data subject”.

It provided some rules even for collection of personal information

One of the key questions to ask about data protection legislation generally is whether it intervenes at the time of collection of personal data, as opposed to merely regulating its use. This Bill cared about collection too. It provided that organizations must “collect, maintain, use and disseminate only personal information necessary to accomplish a proper purpose of the organization”, “collect information to the greatest extent possible from the data subject directly” and even “collect no personal information concerning the political or religious beliefs, affiliations, and activities of data subjects which is maintained, used or disseminated in or by any information system operated by any governmental agency, unless authorized by law”.

There are other remarkable features of this Bill that are reminiscent of the GDPR, such as broad definitions of personal information and data subjects (“an individual about whom personal information is indexed or may be located under his name, personal number, or other identifiable particulars, in an information system”), and they show sophisticated thinking about managing the impact automated processing of personal data might have on the rights of individuals. Enforcement of the Bill included criminal and civil penalties applied with the help of the U.S. Attorney General and the Federal Privacy Board, as well as a private right of action limited only to breaches of the right to access personal information.

So what happened to it? Throughout the legislative process in Congress, this Bill was almost completely rewritten, and it ultimately became the US Privacy Act of 1974, a privacy law quite limited in scope (applicable only to Federal agencies) and ambition compared to the initial proposal. The answer as to what might have happened during this process to fundamentally rewrite the Bill is somewhere in these 1466 pages recording the debates around the US Privacy Act of 1974.

Though it was a failed attempt to provide comprehensive data protection and privacy legislation in the U.S., the Bill nonetheless shows how much common thinking is shared by Europe and America. At the same time this Bill was introduced in the U.S. Senate, Europe was having its own data protection law fever, with many legislative proposals being discussed in Western Europe after the first data protection law was adopted in 1970 in the German Land of Hesse. But according to Frits Hondius, a Dutch scholar documenting these efforts in his volume “Emerging Data Protection in Europe”, published in 1975:

“A factor of considerable influence was the development of data protection on the American scene. Almost every issue that arose in Europe was also an issue in the United States, but at an earlier time and on a more dramatic scale. (…) The writings by American authors about privacy and computers (e.g. Westin and Miller), the 1966 congressional hearings, and the examples set by federal and state legislation, such as the US Fair Credit Reporting Act 1970 and the US Privacy Act 1974, have made a deep impact on data protection legislation in Europe.”

After a shared start in the late ’60s and early ’70s, the two privacy and data protection law regimes evolved in significantly different ways. Almost half a century later, it seems to be Europe’s turn to influence the data protection and privacy law debate in the U.S.

Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking

The Court of Justice of the European Union published yesterday its long-awaited judgment in the Planet49 case, referred by a German Court in proceedings initiated by a non-governmental consumer protection organization representing the participants in an online lottery. It dealt with questions that should have been clarified a long time ago, after Article 5(3) was introduced in Directive 2002/58 (the ‘ePrivacy Directive’) by an amendment from 2009, with Member States transposing and then applying its requirements inconsistently:

  • Is obtaining consent through a pre-ticked box valid when placing cookies on website users’ devices?
  • Must the notice given to the user when obtaining consent include the duration of the operation of the cookies being placed and whether or not third parties may have access to those cookies?
  • Does it matter for the application of the ePrivacy rules whether the data accessed through the cookies being placed is personal or non-personal?

The Court answered all of the above, while at the same time signaling to Member States that a disparate approach in transposing and implementing the ePrivacy Directive is not consistent with EU law, and setting clear guidance on what ‘specific’, ‘unambiguous’ and ‘informed’ consent means.

The core of the Court’s findings is that:

  • pre-ticked boxes do not amount to valid consent,
  • the expiration date of cookies and any third-party sharing should be disclosed to users when obtaining consent,
  • different purposes should not be bundled under the same consent ask,
  • for consent to be valid, ‘an active behaviour with a clear view’ of consenting (which I read as ‘intention’) must be obtained (so claiming in notices that consent is obtained when users continue to use the website very likely does not meet this threshold) and,
  • (quite consequentially) these rules apply to cookies regardless of whether the data accessed is personal or not (see the sketch after this list).
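To illustrate what these findings could mean for an implementation, below is a minimal, hypothetical Python sketch of a validity check on a single cookie-consent ask. The field names are my own invention, not anything prescribed by the Court; note that the check never asks whether the data behind the cookies is personal, mirroring the last finding above.

```python
from dataclasses import dataclass

@dataclass
class CookieConsentAsk:
    purpose: str                        # exactly one purpose per ask (no bundling)
    pre_ticked: bool                    # was the checkbox pre-selected for the user?
    actively_ticked_by_user: bool       # did the user perform a clear affirmative act?
    duration_disclosed: bool            # does the notice state the cookies' lifetime?
    third_party_access_disclosed: bool  # does the notice state who else gets access?

def is_valid_under_planet49(ask: CookieConsentAsk) -> bool:
    """Illustrative check against the Planet49 findings listed above.

    Nothing here depends on whether the data accessed through the cookies
    is personal or not; the ePrivacy consent rules apply either way.
    """
    return (
        not ask.pre_ticked                    # pre-ticked boxes are not valid consent
        and ask.actively_ticked_by_user       # active behaviour with intent to consent
        and ask.duration_disclosed            # expiration date must be disclosed
        and ask.third_party_access_disclosed  # third-party sharing must be disclosed
    )
```

Granularity is modeled by making each ask single-purpose: consent for analytics and consent for advertising would be two separate asks, never bundled under one checkbox.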

Unfortunately, though, the Court did not tackle one other very important issue: what does ‘freely given’ consent mean? In other words, would requiring and obtaining consent for placing cookies for the purpose of online tracking for behavioural advertising, as a condition of access to an online service, such as an online lottery (as in Planet49’s case), be considered ‘freely given’ consent?

An answer to this question would have affected all online publishers and online service providers that condition access to their services on allowing online behavioural tracking cookies to be installed on user devices and rely on ‘cookie walls’ as a source of income for their businesses. What is interesting is that the Court included a paragraph in the judgment specifically stating that it does not give its view on this issue because it was not asked to do so by the referring German Court (paragraph 64). Notably, ‘freely given’ is the only one of the four conditions for valid consent that the Court did not assess in its judgment and that it specifically singled out as being left out in the open.

Finally, one very important point to highlight is that the entirety of the findings was made under the rules for valid consent as they were provided by Directive 95/46. The Court even specified that its finding concerning ‘unambiguous’ consent is made under the old directive. This is relevant because the definition of consent in Article 2(h) of Directive 95/46 only refers to ‘any freely given specific and informed indication’ of agreement. However, Article 7(a) of the directive provides that the data subject’s consent may make a processing lawful if it was given ‘unambiguously’.

With the GDPR, the four scattered conditions have been gathered under Article 4(11) and have been reinforced by clearer recitals. The fact remains that the conditions for valid consent were just as strong under Directive 95/46. The Court pointedly highlights that its interpretation is made on the conditions provided under the old legal regime and that they apply to the GDPR only ‘a fortiori‘ (paragraph 60); (see here for what a fortiori means in legal interpretation).

Consequently, it seems that consent obtained for placing cookies with the help of pre-ticked boxes, or through inaction, or through action without intent to give consent, even prior to the GDPR entering into force, has been unlawfully obtained. It remains to be seen whether any action by supervisory authorities will follow to tackle some of those collections of data built on unlawfully obtained consent, or whether they will take a clean slate approach.

For a deeper dive into the key findings of the Planet49 CJEU judgment, read below:

Discrepancies in applying ePrivacy at Member State level, unjustifiable based on the Directive’s text

Before assessing the questions referred on substance, the Court makes some preliminary findings. Among them, it finds that ‘the need for a uniform application of EU law and the principle of equality require that the wording of a provision of EU law which makes no express reference to the law of the Member States for the purpose of determining its meaning and scope must normally be given an autonomous and uniform interpretation throughout the European Union’ (paragraph 47). Article 5(3) of the ePrivacy Directive does not provide any room for Member State law to determine the scope and meaning of its provisions, by being sufficiently clear and precise in what it asks the Member States to do (see paragraph 46 for the Court’s argument).

In practice, divergent transposition and implementation of the ePrivacy Directive has created different regimes across the Union, which had consequences for the effectiveness of its enforcement.

‘Unambiguous’ means ‘active behavior’ and intent to give consent

The Court starts its assessment from a linguistic interpretation of the wording of Article 5(3) of Directive 2002/58. It notes that the provision doesn’t require a specific way of obtaining consent to the storage of and access to cookies on users’ devices. The Court observes that ‘the wording ‘given his or her consent’ does however lend itself to a literal interpretation according to which action is required on the part of the user in order to give his or her consent. In that regard, it is clear from recital 17 of Directive 2002/58 that, for the purposes of that directive, a user’s consent may be given by any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an internet website‘ (paragraph 49).

The Court highlights that per Article 2(f) of Directive 2002/58 the meaning of a user’s ‘consent’ under the ePrivacy Directive is meant to be the same as that of a data subject’s consent under Directive 95/46 (paragraph 50). By referring to Article 2(h) of the former data protection directive, the Court observes that ‘the requirement of an ‘indication’ of the data subject’s wishes clearly points to active, rather than passive, behaviour’ (paragraph 52). The Court then concludes that ‘consent given in the form of a preselected tick in a checkbox does not imply active behaviour on the part of a website user’ (paragraph 52).

Interestingly, the Court points out that this interpretation of what ‘indication’ means ‘is borne out by Article 7 of Directive 95/46’ (paragraph 53), and in particular by Article 7(a), which ‘provides that the data subject’s consent may make such processing lawful provided that the data subject has given his or her consent ‘unambiguously’’ (paragraph 54). So even if the definition of consent in Directive 95/46 does not refer to this condition in particular, the Court nevertheless anchored its main arguments in it.

The Court then made another important interpretation concerning what ‘unambiguous’ consent means: ‘Only active behaviour on the part of the data subject with a view to giving his or her consent may fulfil that requirement’ (paragraph 54). This wording (‘with a view to’) suggests that there is a condition of willfulness, of intent to give consent in order for the indication of consent to be lawful.

In addition, to be even clearer, the Court finds that ‘it would appear impossible in practice to ascertain objectively whether a website user had actually given his or her consent to the processing of his or her personal data by not deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed. It is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited” (paragraph 55).

A fortiori, it appears impossible in practice to ascertain objectively whether a website user has actually given his or her consent to the processing of his or her personal data by merely continuing with his or her activity on the website visited (continued browsing or scrolling), or whether that consent has been informed, given that the information presented to him or her does not even include a pre-ticked checkbox, which would at least offer the opportunity to uncheck the box. Also, just as the Court points out, it is not inconceivable that a user would not have read the information announcing that continued use of the website signifies consent.

With these two findings in paragraphs 54 and 55 the Court seems to clarify once and for all that informing users that by continuing their activity on a website signifies consent to placing cookies on their device is not sufficient to obtain valid consent under the ePrivacy Directive read in the light of both Directive 95/46 and the GDPR.

‘Specific’ means consent can’t be inferred from bundled purposes

The next condition that the Court analyzes is that of specificity. In particular, the Court finds that ‘specific’ consent means that ‘it must relate specifically to the processing of the data in question and cannot be inferred from an indication of the data subject’s wishes for other purposes’ (paragraph 58). This means that bundled consent will not be considered valid and that consent should be sought granularly for each purpose of processing.

‘Informed’ means being able to determine the consequences of any consent given

One of the questions sent for a preliminary ruling by the German Court concerned specific categories of information that should be disclosed to users in the context of obtaining consent for placing cookies. Article 5(3) of the ePrivacy Directive requires that the user is provided with ‘clear and comprehensive information’ in accordance with Directive 95/46 (now replaced by the GDPR). The question was whether this notice must also include (a) the duration of the operation of cookies and (b) whether or not third parties may have access to those cookies.

The Court clarified that providing ‘clear and comprehensive’ information means ‘that a user is in a position to be able to determine easily the consequences of any consent he or she might give and ensure that the consent given is well informed. It must be clearly comprehensible and sufficiently detailed so as to enable the user to comprehend the functioning of the cookies employed’ (paragraph 74). Therefore, it seems that using language easily comprehensible to the user is important, just as it is important to paint a full picture of the functioning of the cookies for which consent is sought.

The Court found specifically with regard to cookies that ‘aim to collect information for advertising purposes’ that ‘the duration of the operation of the cookies and whether or not third parties may have access to those cookies form part of the clear and comprehensive information‘ which must be provided to the user (paragraph 75).

Moreover, the Court adds that ‘information on the duration of the operation of cookies must be regarded as meeting the requirement of fair data processing‘ (paragraph 78). This is remarkable, since the Court doesn’t usually make findings in its data protection case-law with regard to the fairness of processing. Doubling down on its fairness considerations, the Court goes even further and links fairness of the disclosure of the retention time to the fact that ‘a long, or even unlimited, duration means collecting a large amount of information on users’ surfing behaviour and how often they may visit the websites of the organiser of the promotional lottery’s advertising partners’ (paragraph 78).

It is irrelevant if the data accessed by cookies is personal or anonymous, ePrivacy provisions apply regardless

The Court was specifically asked to clarify whether the cookie consent rules in the ePrivacy Directive apply differently depending on the nature of the data being accessed. In other words, does it matter whether the data being accessed by cookies is personal or anonymized/aggregated/de-identified?

First of all, the Court points out that in the case at hand, ‘the storage of cookies … amounts to a processing of personal data’ (paragraph 67). That being said, the Court nonetheless notes that the provision analyzed merely refers to ‘information’ and does so ‘without characterizing that information or specifying that it must be personal data’ (paragraph 68).

The Court explained that this general framing of the provision ‘aims to protect the user from interference with his or her private sphere, regardless of whether or not that interference involves personal data’ (paragraph 69). This finding is particularly relevant for the current legislative debate over the revamp of the ePrivacy Directive. It is clear that the core difference between the GDPR framework and the ePrivacy regime is what they protect: the GDPR is concerned with ensuring the protection of personal data and fair data processing whenever personal data is being collected and used, while the ePrivacy framework is concerned with shielding the private sphere of an individual from any unwanted interference. That private sphere/private center of interest may include personal data or not.

The Court further refers to recital 24 of the ePrivacy Directive, which mentions that “any information stored in the terminal equipment of users of electronic communications networks are part of the private sphere of the users requiring protection under the European Convention for the Protection of Human Rights and Fundamental Freedoms. That protection applies to any information stored in such terminal equipment, regardless of whether or not it is personal data, and is intended, in particular, as is clear from that recital, to protect users from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge” (paragraph 70).

Conclusion

The judgment of the CJEU in Planet49 provides some much needed certainty about how the ‘cookie banner’ and ‘cookie consent’ provisions in the ePrivacy Directive should be applied, after years of disparate approaches from national transposition laws and supervisory authorities, which led to a lack of effectiveness in enforcement and, hence, compliance. The judgment does leave open one burning question: what does ‘freely given consent’ mean? It is important to note nonetheless that before reaching the ‘freely given’ question, any consent obtained for placing cookies (or similar technologies) on user devices will have to meet all of the other three conditions. If even one of them is not met, that consent is invalid.

***

You can refer to this summary by quoting G. Zanfir-Fortuna, ‘Planet49 CJEU Judgment brings some ‘Cookie Consent’ Certainty to Planet Online Tracking’, http://www.pdpecho.com, published on October 3, 2019.

The CJEU decides lack of access to personal data does not unmake a joint controller: A look at Wirtschaftsakademie

Who is the controller?

The Court of Justice of the EU decided in Case C-210/16 Wirtschaftsakademie that Facebook and the administrator of a fan page created on Facebook are joint controllers under EU data protection law. The decision sent a mini shockwave to organizations that use Facebook Pages, just one week after the GDPR entered into force. What exactly does it mean that they are joint controllers and what exactly do they have to do in order to be compliant? The judgment leaves these questions largely unanswered, but it gives some clues as to finding answers.

Being a joint controller means they have a shared responsibility (with Facebook) to comply with EU data protection law for the processing of personal data occurring through their Facebook Page. As the Court highlighted, they have this responsibility even if they do not have access at all to personal data collected through cookies placed on the devices of visitors of the Facebook page, but just to the aggregated results of the data collection.

The judgment created a great deal of confusion. What has not yet been sufficiently emphasized in the reactions to the Wirtschaftsakademie judgment is that this shared responsibility is not equal: it depends on the stage of the processing the joint controller is involved in and on the actual control it has over the processing. This is, in any case, a better position to be in than that of a “controller” on whose behalf Facebook processes personal data, or a “co-controller” with Facebook. Either of those would have meant full legal liability for complying with data protection obligations for the personal data processed through the page. It is, however, a worse position than being a third party or a recipient that is not involved in any way in establishing the purposes and means of the processing, which would have meant no legal responsibility for the data being processed through the page. Technically, those were the other options the Court probably looked at before taking the “joint controllership” path.

It is important to note that the Court did not specify at all who bears which responsibilities, not even with regard to providing notice. The failure of both Facebook and the page administrator to inform visitors about cookies being placed on their devices was the reason invoked by the DPA in the main national proceedings, but the Court remained silent on who is responsible for this obligation.

This summary looks at what the Court found, explaining why it reached its conclusion, and trying to carve out some of the practical consequences of the judgment (also in relation to the GDPR).

This first part of the commentary on the judgment will only cover the findings related to “joint controllership”. The findings related to the competence of the German DPA will be analyzed in a second part. While the judgment interprets Directive 95/46, most of the findings will remain relevant under the GDPR as well, to the extent they interpret identical or very similar provisions of the two laws.

Facts of the Case

Wirtschaftsakademie is an organization that offers educational services and has a Facebook fan page. The Court noted that administrators of fan pages can obtain anonymous statistical information, available to them free of charge. “That information is collected by means of evidence files (‘cookies’), each containing a unique user code, which are active for two years and are stored by Facebook on the hard disk of the computer or on other media of visitors to fan pages” (#15). The user code “is collected and processed when the fan pages are open” (#15).

The DPA of Schleswig-Holstein ordered Wirtschaftsakademie to close the fan page if it was not brought into compliance, on the ground that “neither Wirtschaftsakademie, nor Facebook, informed visitors to the Fan Page that Facebook, by means of cookies, collected personal data concerning them and then processed the data” (#16).

The decision of the DPA was challenged by Wirtschaftsakademie, which argued that “it was not responsible under data protection law for the processing of the data by Facebook or the cookies which Facebook installed” (#16). After the DPA lost in the lower instances, it appealed those decisions to the Federal Administrative Court, arguing that the main data protection law breach of Wirtschaftsakademie was the fact that it commissioned “an inappropriate supplier”, because the supplier “did not comply with data protection law” (#22).

The Federal Administrative Court sent several questions for a preliminary ruling to the CJEU, aiming to clarify whether Wirtschaftsakademie indeed had any legal responsibility for the cookies placed by Facebook through its Fan Page and whether the Schleswig-Holstein DPA had competence to enforce German data protection law against Facebook, considering that Facebook’s main establishment in the EU is in Ireland and its German presence is linked only to marketing (#24).

“High level of protection” and “effective and complete protection”

The Court starts its analysis by referring again to the aim of the Directive to “ensure a high level of protection of fundamental rights and freedoms, and in particular their right to privacy in respect to processing of personal data” (#26) – and it is to be expected that all analyses under the GDPR would start from the same point. This means that all interpretation of the general data protection law regime will be done in favor of protecting the fundamental rights of data subjects.

Based on the findings in Google Spain, the Court restates that “effective and complete protection of the persons concerned” requires a “broad definition of controller” (#28). Effective and complete protection is another criterion that the Court often takes into account when interpreting data protection law in favor of the individual and his or her rights.

[In fact, one of the afterthoughts of the Court, after establishing that the administrator is a joint controller, was that “the recognition of joint responsibility of the operator of the social network and the administrator of a fan page hosted on that network in relation to the processing of the personal data of visitors to that page contributes to ensuring more complete protection of the rights of persons visiting a fan page” (#42).]

The referring Court did not even consider the possibility that the administrator is a controller

Having set the stage like this, the Court goes on to analyze the definition of “controller”. It should be noted, though, that the referring Court never asked whether the administrator of the fan page is a controller or a joint controller; it asked whether the administrator has any legal responsibility for failing to choose a compliant “operator of its information offering” while being an “entity that does not control the data processing within the meaning of Article 2(d) of Directive 95/46” (#24 question 1).

It seems that the referring Court did not even take into account that the fan page administrator would have any control over the data, but was wondering whether only “controllers” have legal responsibility to comply with data protection law under Directive 95/46, or whether other entities somehow involved in the processing could also have some responsibility.

However, the Court does not exclude the possibility that the administrator may be a controller. First of all, it establishes that processing of personal data is taking place, as described at #15, and that the processing has at least one controller.

Facebook is “primarily” establishing means and purposes of the processing

It recalls the definition of “controller” in Article 2(d) of the Directive and highlights that “the concept does not necessarily refer to a single entity and may concern several actors taking part in that processing, with each of them then being subject to the applicable data protection provisions” (#29). The distribution of responsibilities in the last part of this finding is brought up by the Court even though Article 2(d) contains no such reference[1].

This is important, because the next finding of the Court is that, in the present case, “Facebook Ireland must be regarded as primarily determining the purposes and means of processing the personal data of users of Facebook and persons visiting the fan pages hosted on Facebook” (#30). Reading this paragraph together with #29 means that Facebook will have a bigger share of the obligations in a joint controllership situation with fan pages administrators.

This idea is underlined by the following paragraph which refers to identifying the “extent” to which a fan page administrator “contributes… to determining, jointly with Facebook Ireland and Facebook Inc., the purposes and means of processing” (#31). To answer this question, the Court lays out its arguments in three layers:

1) It describes the processing of personal data at issue, mapping the data flows – pointing to the personal data being processed, data subjects and all entities involved:

  • The data processing at issue (placing of cookies on the Fan Page visitors’ device) is “essentially carried out by Facebook” (#33);
  • Facebook “receives, registers and processes” the information stored in the placed cookies not only when a visitor visits the Fan Page, but also when he or she visits services provided by other Facebook family companies and by “other companies that use the Facebook services” (#33);
  • Facebook partners and “even third parties” may use cookies to provide services to Facebook or to the businesses that advertise on Facebook (#33);
  • The creation of a fan page “involves the definition of parameters by the administrator, depending inter alia on the target audience … , which has an influence on the processing of personal data for the purpose of producing statistics based on visits to the fan page” (#36);
  • The administrator can request the “processing of demographic data relating to its target audience, including trends in terms of age, sex, relationship and occupation”, lifestyle, location, online behavior, which tell the administrator where to make special offers and better target the information it offers (#37);
  • The audience statistics compiled by Facebook are transmitted to the administrator “only in anonymized form” (#38);
  • The production of the anonymous statistics “is based on the prior collection, by means of cookies installed by Facebook …, and the processing of personal data of (the fan page) visitors for such statistical purposes” (#38);

2) It identifies the purposes of this processing:

  • There are two purposes of the processing:
    • “to enable Facebook to improve its system of advertising transmitted via its network” and
    • “to enable the fan page administrator to obtain statistics produced by Facebook from the visits of the page”, which is useful for “managing the promotion of its activity and making it aware of the profiles of the visitors who like its fan page or use its applications, so that it can offer them more relevant content” (#34);

3) It establishes a connection between the two entities that define the two purposes of processing:

  • Creating a fan page “gives Facebook the opportunity to place cookies on the computer or other device of a person visiting its fan page, whether or not that person has a Facebook account” (#35);
  • The administrator may “define the criteria in accordance with which the statistics are to be drawn up and even designate the categories of persons whose personal data is to be made use of by Facebook”, “with the help of filters made available by Facebook” (#36);
  • Therefore, the administrator “contributes to the processing of the personal data of visitors to its page” (#36);

One key point: not all joint controllers must have access to the personal data being processed

In what is the most impactful finding of this judgment, the Court uses one of the old general principles of interpreting and applying the law, ubi lex non distinguit, nec nos distinguere debemus, and states that “Directive 95/46 does not, where several operators are jointly responsible for the same processing, require each of them to have access to the personal data concerned” (#38). Therefore, the fact that administrators have access only to anonymized data has no impact on the existence of their legal responsibility as joint controllers, since what matters is establishing the purposes and means of the processing and that at least one of the entities involved in the processing has access to and is processing personal data. The fact that they only have access to anonymized data should nonetheless matter when establishing the degree of responsibility.

Hence, after describing the involvement of fan page administrators in the processing at issue – and in particular their role in defining parameters for processing depending on their target audience and in the determination of the purposes of the processing, the Court finds that “the administrator must be categorized, in the present case, as a controller responsible for that processing within the European Union, jointly with Facebook Ireland” (#39).

Enhanced responsibility for non-users visiting the page

The Court also made the point that fan pages can be visited by non-users of Facebook, implying that, were it not for the existence of the specific fan page they accessed while looking for information related to its administrator, Facebook would not be able to place cookies on their devices and process personal data relating to them, for its own purposes and for the purposes of the fan page. “In that case, the fan page responsibility for the processing of the personal data of those persons appears to be even greater, as the mere consultation of the home page by visitors automatically starts the processing of their personal data” (#42).

Jointly responsible, not equally responsible

Finally, after establishing that there is joint controllership and joint responsibility, the Court makes the very important point that the responsibility is not equal and it depends on the degree of involvement of the joint controller in the processing activity:

“The existence of joint responsibility does not necessarily imply equal responsibility of the various operators involved in the processing of personal data. On the contrary, those operators may be involved at different stages of that processing of personal data and to different degrees, so that the level of responsibility of each of them must be assessed with regard to all the relevant circumstances of the particular case” (#43).

Comments and conclusions

In the present case, the Court found early in the judgment that Facebook “primarily” establishes the means and purposes of the processing. This means that it is primarily responsible for compliance with data protection obligations. At the same time, the administrator of the fan page has responsibility to comply with some data protection provisions, as joint controller. The Court did not clarify, however, what exactly the administrator of the fan page must do in order to be compliant.

For instance, the Court does not analyze how the administrator complies or not with the Directive in this case; therefore, assuming that the judgment requires administrators to provide a data protection notice is wrong. The lack of notice was a finding of the DPA in the initial proceedings. Moreover, the DPA ordered Wirtschaftsakademie to close its Facebook page because it found that neither Facebook nor the page administrator had informed visitors about the cookies being placed on their devices (#16).

The CJEU merely establishes that the administrator is a joint controller and that it shares responsibility for compliance with Facebook depending on the degree of their involvement in the processing.

The only clear message from the Court with regard to the extent of legal responsibility of the administrator as joint controller is that it has enhanced responsibility towards visitors of the fan page that are not Facebook users. This being said, it is very likely that informing data subjects is one of the obligations of the GDPR that can potentially fall on the shoulders of fan page administrators in the absence of Facebook stepping up and providing notice, since they can edit the interface with visitors to a certain extent.

Another message that is not so clear, but can be extracted from the judgment is that the degree of responsibility of the joint controllers “must be assessed with regard to all the relevant circumstances of the particular case” (#43). This could mean that if the two joint controllers were to enter a joint controllership agreement (as the GDPR now requires), the Courts and DPAs may be called to actually look at the reality of the processing in order to determine the responsibilities each of them has, in order to avoid a situation where the joint controller primarily responsible for establishing means and purposes contractually distributes obligations to the other joint controller that the latter could not possibly comply with.

As for the relevance of these findings under the GDPR, all the “joint controllership” part of the judgment is very likely to remain relevant, considering that the language the Court interpreted from Directive 95/46 is very similar to the language used in the GDPR (see Article 2(d) of the Directive and Article 4(7) GDPR). However, the GDPR does add a level of complexity to the situation of joint controllers, in Article 26. The Court could, eventually, add to this jurisprudence an analysis of the extent to which the joint controllership agreement required by Article 26 is relevant to establish the level of responsibility of a joint controller.

Given that the GDPR requires joint controllers to determine in a transparent manner their respective responsibilities for compliance through an arrangement, one consequence of the judgment is that such an arrangement should be concluded between Facebook and fan page administrators (Article 26(1) GDPR). The essence of the arrangement must then be made available to visitors of fan pages (Article 26(2) GDPR).

However, there is one obligation under the GDPR that, when read together with the findings of the Court, results in a conundrum. Article 26(3) GDPR provides that the data subject may exercise his or her rights “in respect of and against each of the controllers”, regardless of how the responsibility is shared contractually between them. In the case at hand, the Court acknowledges that the administrator only has access to anonymized data. This means that even if data subjects were to make, for example, a request for access to or erasure of data to the administrator, it would not be in a position to handle such requests. A possibility is that any request made to a joint controller that does not have access to the data will be forwarded by the latter to the joint controller that does have access (what is important is that the data subject has a point of contact and someone against whom they can claim their rights). This is yet another reason why a written agreement establishing the responsibility of each joint controller is useful. Practice will solve the conundrum, ultimately, with DPAs and national Courts likely playing their part.


[1] “(d) ‘controller’ shall mean the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by national or Community laws or regulations, the controller or the specific criteria for his nomination may be designated by national or Community law;”

Brief case-law companion for the GDPR professional

This collection of quotes from relevant case-law has been compiled with the purpose of being useful to all those working with EU data protection law. The majority of the selected findings were part of a “Countdown to the GDPR” I conducted on social media, one month before the Regulation became applicable, under #KnowYourCaseLaw. I undertook this exercise for a couple of reasons.

First, data protection in the EU is much older and wider than the General Data Protection Regulation (GDPR) and it has already invited the highest Courts in Europe to weigh in on the protection of this right. Knowing what those Courts have said is essential.

Data protection law in the EU is not only a matter of pure EU law, but also a matter of protecting human rights following the legal framework of the Council of Europe (starting with Article 8 of the European Convention on Human Rights – ‘ECHR’). The interplay between these two legal regimes is very important, given the fact that the EU recognizes fundamental rights protected by the ECHR as general principles of EU law – see Article 6(3) TEU.

Finally, knowing relevant case-law makes the difference between a good privacy professional and a great one.

What to expect

This is not a comprehensive collection of case-law and it does not provide background for the cases it addresses. The Handbook of data protection law (2018 edition) is a great resource if that is what you are looking for.

This is a collection of specific findings of the Court of Justice of the EU (CJEU), the European Court of Human Rights (ECtHR) and one bonus finding of the German Constitutional Court. There are certainly other interesting findings that have not been included here (how about an “Encyclopedia of interesting findings” for the next project?). The ones that have been included provide insight into specific issues, such as the definition of personal data, what constitutes data related to health, what ‘freely given’ consent means, or what type of interference with fundamental rights profiling is. Readers will even find a quote from a concurring opinion of an ECtHR judge that is prescient, to say the least.

Enjoy the read!


Exam scripts are partly personal data and other practical findings of the CJEU in Nowak

The Court of Justice of the European Union (CJEU) gave its judgment in Case C-434/16 Nowak on 20 December 2017, and it is significant from several points of view:

  • It provides a good summarized description of what constitutes “personal data”, referring to both objective and subjective information, regardless of its sensitivity, and it also details what the “related to” criterion from the legal definition of personal data means;
  • It *almost* departs from its YS jurisprudence on the concept of personal data;
  • It applies the interpretation that the Article 29 Working Party gave to the “related to” criterion in its Opinion on personal data from 2007, highlighting thus the weight that the interpretation of data protection law given by the European DPAs might have;
  • It establishes that written answers submitted by a candidate during an exam are personal data of the candidate (this is relevant for all education services providers);
  • It also establishes that the questions of the exam do not fall within the category of “personal data” – hence, the entire exam script is not considered personal data, but only the answers submitted by the candidate;
  • It establishes that the comments reviewers make on the margins of one’s written answers to an exam are personal data of the person being examined, while also being personal data of the reviewer;
  • It establishes that exam scripts should be kept in an identifiable form only for as long as they can be challenged.

This comment looks closer at all of these findings.

Facts of the Case

Mr Nowak was a trainee accountant who requested access to his exam script from the Institute of Chartered Accountants of Ireland (CAI), after failing the examination. He first challenged the results of the exam with no success. He then submitted a subject access request to the CAI, asking to receive a copy of all his personal data held by the CAI. He obtained 17 documents, but the exam script was not among them.

Mr Nowak brought this to the attention of the Irish Data Protection Commissioner (DPC) through an email, arguing that his exam script was also his personal data. The DPC answered by email that exam scripts “would not generally constitute personal data”. Mr Nowak then submitted a formal complaint with the DPC against the CAI. The official response of the DPC was to reject the complaint on the ground that it was “frivolous or vexatious” (the same reason used to reject the first complaint of Max Schrems challenging the EU-US Safe Harbor scheme).

Mr Nowak then challenged this decision of the Irish DPC before the Circuit Court, then the High Court and then the Court of Appeal, all of which decided against him. Finally, he challenged the decision of the Court of Appeal before the Supreme Court, which decided to stay proceedings and refer questions to the CJEU for a preliminary ruling, since the case required the interpretation of EU law – in particular, of how the concept of “personal data” provided for by Directive 95/46 should be interpreted (a small procedural reminder here: courts of last instance are under an obligation to refer questions to the CJEU in all cases that require the interpretation of EU law, under the last paragraph of Article 267 TFEU).

Questions referred

The Supreme Court asked the CJEU two questions (in summary):

  1. Is information recorded in/as answers given by an exam candidate capable of being personal data?
  2. If this is the case, then what factors are relevant in determining whether in a given case such information is personal data?

Pseudonymised data is personal data

First, recalling its Breyer jurisprudence, the Court establishes that, for information to be treated as personal data, it is of no relevance whether all the information enabling the identification of the data subject is in the hands of one person or whether the identifiers are separated (§31). In this particular case, it is not relevant “whether the examiner can or cannot identify the candidate at the time when he/she is correcting and marking the examination script” (§30).
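To make the mechanics of this finding concrete, here is a minimal illustrative sketch in Python. All names and data structures below are hypothetical (they are not taken from the case file); the point is simply that separating the identifying key from the content does not defeat identifiability, because the examination body can combine the two sets.

```python
# Hypothetical sketch: an exam script is stored under a candidate code,
# while a separate registrar table maps codes to names. Under Breyer/Nowak,
# the script is personal data because *someone* (the examination body)
# can combine the two separated sets and identify the candidate.

# What the examiner sees: pseudonymised scripts
scripts = {
    "C-1042": {"answers": "...", "examiner_comments": "solid on accounting, weak on audit"},
}

# What the registrar holds separately: the identification key
registrar_index = {"C-1042": "Peter Nowak"}

def identifiable(candidate_code: str) -> bool:
    """The test is not whether one actor holds everything in one place,
    but whether identification is possible by combining the separated sets."""
    return candidate_code in registrar_index

for code in scripts:
    if identifiable(code):
        print(f"Script {code} is personal data of {registrar_index[code]}")
```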

The Court then looks at the definition of personal data from Directive 95/46, underlining that it has two elements: “any information” and “related to an identified or identifiable natural person”.

“Any information” means literally any information, be it objective or subjective

The Court recalls that the scope of Directive 95/46 is “very wide and the personal data covered … is varied” (§33).

“The use of the expression ‘any information’ in the definition of the concept of ‘personal data’ … reflects the aim of the EU legislature to assign a wide scope to that concept, which is not restricted to information that is sensitive or private, but potentially encompasses all kinds of information, not only objective but also subjective, in the form of opinions and assessments, provided that it ‘relates’ to the data subject.” (§34)

Save this paragraph, as it is a new jurisprudential source for describing what constitutes personal data – it is certainly a good summary, in line with the Court’s previous case-law (see an excellent overview of the Court’s approach to the definition of personal data here, p. 40 – 41). It makes clear that, for instance, comments on social media, reviews of products/companies, ratings and any other subjective assessments are personal data, as long as they relate to an identified or identifiable individual. This is also true for any sort of objective information (think shoe size), regardless of whether it is sensitive or private, as long as it relates to an identified or identifiable individual.

“Related to” must be judged in relation to “content, purpose or effect/consequences”

The condition for any information to be considered personal data is that it relates to a natural person. According to the Court, this means that “by reason of its content, purpose or effect, (it) is linked to a particular person” (§35). The Court thus applies the test developed by the Article 29 Working Party in its 2007 Opinion on the concept of personal data. Ten years ago, the DPAs wrote that “in order to consider that the data ‘relate’ to an individual, a ‘content’ element OR a ‘purpose’ element OR a ‘result’ element should be present” (2007 Opinion, p. 10).

The Court has now adopted this test in its case-law, an indication of how much weight the common interpretation given by data protection authorities in official guidance can carry. However, the Court does not directly refer to the Opinion.

Applying the test to the facts of the case, the Court showed that the content of exam answers “reflects the extent of the candidate’s knowledge and competence in a given field and, in some cases, his intellect, thought processes, and judgment” (§37). Additionally, following AG Kokott’s Opinion, the Court also pointed out that “in the case of a handwritten script, the answers contain, in addition, information as to his handwriting” (§37).

The purpose of the answers is “to evaluate the candidate’s professional abilities and his suitability to practice the profession concerned” (§38) and the consequence of the answers “is liable to have an effect on his or her rights and interests, in that it may determine or influence, for example, the chance of entering the profession aspired to or of obtaining the post sought” (§39).

Comments of reviewers are personal data twice over

The test is then applied to the comments of reviewers on the margin of a candidate’s answers. The Court showed that “The content of those comments reflects the opinion or the assessment of the examiner of the individual performance of the candidate in the examination, particularly of his or her knowledge and competences in the field concerned. The purpose of those comments is, moreover, precisely to record the evaluation by the examiner of the candidate’s performance, and those comments are liable to have effects for the candidate” (§43).

It is important to note here that meeting only one of the three criteria (content, purpose, effects) is enough to qualify information as “relating to” an individual, even if the Court found that in this particular case all of them are met. This is shown by the use of “or” in the enumeration made in §35, as quoted above.
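For readers who like to see the logic spelled out, here is a minimal sketch of the test’s OR-logic in Python (the function name and boolean inputs are mine, purely illustrative – the judgment obviously does not prescribe code):

```python
# Hypothetical sketch of the WP29 "relates to" test as adopted in Nowak (§35):
# information relates to a person if a content OR purpose OR effect element
# is present. Satisfying any single criterion suffices.

def relates_to_person(content_about_person: bool,
                      purpose_to_evaluate_person: bool,
                      effect_on_person: bool) -> bool:
    # Logical OR, not AND: one element is enough (see the "or" in §35)
    return content_about_person or purpose_to_evaluate_person or effect_on_person

# Exam answers in Nowak happened to meet all three criteria (§37-§39)...
print(relates_to_person(True, True, True))    # True
# ...but information meeting only one element, e.g. only "effect",
# still relates to the person.
print(relates_to_person(False, False, True))  # True
```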

The Court also found that “the same information may relate to a number of individuals and may constitute for each of them, provided that those persons are identified or identifiable, personal data” (§45), having regard to the fact that the comments of the examiners are personal data of both the examiners and the “examinee”.

Information is personal data regardless of whether it can be rectified

It was the Irish DPC that argued that qualifying information as “personal data” should be affected by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification (§46). The logic here was that if data cannot be rectified, it cannot be considered personal – just as exam answers cannot be rectified after the exam has finished.

The Court (rightfully so) disagreed with this claim, following the opinion of the Advocate General and contradicting its own findings in Case C-141/12 YS (see a more detailed analysis of the interaction between the two judgments below). It argued that “a number of principles and safeguards, provided for by Directive 95/46, are attached to that classification and follow from that classification” (§47), meaning that protecting personal data goes far beyond the ability to access and rectify your data. This finding is followed by a summary of the fundamental mechanisms encompassed by data protection.

Data protection is a web of safeguards, accountability and individual rights

Starting from recital 25 of Directive 95/46 (yet again, how important recitals are! Think here of Recital 4 of the GDPR and the role it can play in future cases – “The processing of personal data should be designed to serve mankind”), the Court stated that:

“…the principles of protection provided for by that directive are reflected, on the one hand, in the obligations imposed on those responsible for processing data, obligations which concern in particular data quality, technical security, notification to the supervisory authority, and the circumstances under which processing can be carried out, and, on the other hand, in the rights conferred on individuals, the data on whom are the subject of processing, to be informed that processing is taking place, to consult the data, to request corrections and even to object to processing in certain circumstances” (§48).

The Court thus looks at data protection as a web of accountability, safeguards (reflected in technical security measures, data quality, conditions for lawful processing data) and rights conferred to the individuals.

In this case, refusing to consider exam answers personal data just because they cannot be “corrected” after the exam would strip this information of the rest of the web of protections, such as being processed on a legitimate ground or being retained only for the necessary period of time. The Court does not phrase this finding this way, but it states that:

“Accordingly, if information relating to a candidate, contained in his or her answers submitted at a professional examination and in the comments made by the examiner with respect to those answers, were not to be classified as ‘personal data’, that would have the effect of entirely excluding that information from the obligation to comply not only with the principles and safeguards that must be observed in the area of personal data protection, and, in particular, the principles relating to the quality of such data and the criteria for making data processing legitimate, established in Articles 6 and 7 of Directive 95/46, but also with the rights of access, rectification and objection of the data subject, provided for in Articles 12 and 14 of that directive, and with the supervision exercised by the supervisory authority under Article 28 of that directive” (§49).

Furthermore, the Court shows that errors in the answers given to an exam do not constitute “inaccuracy” of personal data, because the level of knowledge of a candidate is revealed precisely by the errors in his or her answers, and revealing the level of knowledge is the purpose of this particular data processing. As the Court explains, “[i]t is apparent from Article 6(1)(d) of Directive 95/46 that the assessment of whether personal data is accurate and complete must be made in the light of the purpose for which that data was collected” (§53).

Exam scripts should only be kept in an identifiable form as long as they can be challenged

The Court further explained that both exam answers and reviewers’ comments can nevertheless be subject to “inaccuracy” in a data protection sense, “for example due to the fact that, by mistake, the examination scripts were mixed up in such a way that the answers of another candidate were ascribed to the candidate concerned, or that some of the cover sheets containing the answers of that candidate are lost, so that those answers are incomplete, or that any comments made by an examiner do not accurately record the examiner’s evaluation of the answers of the candidate concerned” (§54).

The Court also admitted the possibility that “a candidate may, under Article 12(b) of Directive 95/46, have the right to ask the data controller to ensure that his examination answers and the examiner’s comments with respect to them are, after a certain period of time, erased, that is to say, destroyed” (§55).

Another finding of the Court that will be useful to schools, universities and other educational institutions is that keeping exam scripts in a form relating to an identifiable individual is no longer necessary once the examination procedure is definitively closed and can no longer be challenged: “Taking into consideration the purpose of the answers submitted by an examination candidate and of the examiner’s comments with respect to those answers, their retention in a form permitting the identification of the candidate is, a priori, no longer necessary as soon as the examination procedure is finally closed and can no longer be challenged, so that those answers and comments have lost any probative value” (§55).
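As a purely illustrative sketch of this retention logic – the one-year challenge window and the function below are my assumptions for the example, not figures from the judgment, since the actual period is set by the applicable examination rules:

```python
# Hypothetical sketch of the retention rule in §55: scripts may be kept in
# identifiable form only while the examination procedure can still be
# challenged; afterwards the link to the candidate should be removed.
from datetime import date, timedelta

CHALLENGE_WINDOW = timedelta(days=365)  # assumed period, set by national/exam rules

def may_keep_identifiable(procedure_closed_on: date, today: date) -> bool:
    """True only while the exam results can still be challenged."""
    return today <= procedure_closed_on + CHALLENGE_WINDOW

if not may_keep_identifiable(date(2017, 1, 10), date.today()):
    print("Anonymise or destroy the script: it has lost its probative value")
```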

The Court distances itself from the findings in C-141/12 YS, but still wants to keep that jurisprudence alive

One of the biggest questions surrounding the judgment in Nowak was whether the Court would follow the AG’s Opinion and change its jurisprudence from C-141/12 YS. In that judgment, the Court found that the legal analysis used by the Dutch Ministry of Immigration in a specific case of asylum seekers is not personal data, and the main reason invoked was that “[i]n contrast to the data relating to the applicant for a residence permit which is in the minute and which may constitute the factual basis of the legal analysis contained therein, such an analysis … is not in itself liable to be the subject of a check of its accuracy by that applicant and a rectification under Article 12(b) of Directive 95/46” (§45).

The Court further noted: “In those circumstances, extending the right of access of the applicant for a residence permit to that legal analysis would not in fact serve the directive’s purpose of guaranteeing the protection of the applicant’s right to privacy with regard to the processing of data relating to him, but would serve the purpose of guaranteeing him a right of access to administrative documents, which is not however covered by Directive 95/46.” Finally, the finding was that “[i]t follows from all the foregoing considerations … that the data relating to the applicant for a residence permit contained in the minute and, where relevant, the data in the legal analysis contained in the minute are ‘personal data’ within the meaning of that provision, whereas, by contrast, that analysis cannot in itself be so classified” (§48).

Essentially, in YS the Court linked the ability of accessing and correcting personal data with the classification of information as personal data, finding that if the information cannot be corrected, then it cannot be accessed and it cannot be classified as personal data.

By contrast, following AG Kokott’s analysis, in Nowak the Court essentially states that classifying information as personal data must not be affected by the existence of the rights to access and rectification – in the sense that the possibility to effectively invoke them should not play a role in establishing that certain information is or is not personal data: “the question whether written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers should be classified as personal data cannot be affected … by the fact that the consequence of that classification is, in principle, that the candidate has rights of access and rectification, pursuant to Article 12(a) and (b) of Directive 95/46” (§46).

However, the Court is certainly not ready to fully change its jurisprudence established in YS, and even refers to its judgment in YS in a couple of paragraphs. In the last paragraphs of Nowak, the Court links the ability to correct or erase data to the existence of the right of accessing that data (but not to classifying information as personal data).

The Court states that: “In so far as the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers are therefore liable to be checked for, in particular, their accuracy and the need for their retention… and may be subject to rectification or erasure…, the Court must hold that to give a candidate a right of access to those answers and to those comments… serves the purpose of that directive of guaranteeing the protection of that candidate’s right to privacy with regard to the processing of data relating to him (see, a contrario, judgment of 17 July 2014, YS and Others, C‑141/12 and C‑372/12, EU:C:2014:2081, paragraphs 45 and 46), irrespective of whether that candidate does or does not also have such a right of access under the national legislation applicable to the examination procedure”.

Despite showing an ever deeper understanding of data protection elsewhere in its Nowak judgment, the Court sticks to some of its findings from YS, even if this means perpetuating a confusion between the fundamental right to respect for private life and the fundamental right to the protection of personal data: “it must be recalled that the protection of the fundamental right to respect for private life means, inter alia, that any individual may be certain that the personal data relating to him is correct and that it is processed in a lawful manner” (§57 in Nowak and §44 in YS). Lawful processing of personal data and the right to keep personal data accurate are, in fact, enshrined in Article 8 of the EU Charter – the right to the protection of personal data, and not in Article 7 – the right to respect for private life.

Obiter dictum 1: the curious insertion of “exam questions” in the equation

The Court also does something curious in these last paragraphs. It simply states, after the paragraphs referring to the YS judgment, that “the rights of access and rectification, under Article 12(a) and (b) of Directive 95/46, do not extend to the examination questions, which do not as such constitute the candidate’s personal data” (§58). The national court did not ask about this specific point, and AG Kokott does not address the issue at all in her Opinion. It might have been raised during the hearing, but no context is provided for it. The Court simply opens with “Last, it must be said…” and follows with the finding regarding the exam questions.

While it is easy to see that the questions of a specific test, by themselves, are not personal data, as they do not relate, by reason of their content, purpose or effect, to a specific individual, the situation is not as clear when the questions are part of the “solved” exam sheet of a specific candidate. The question is: “Are the answers of the test inextricably linked to the questions?” Imagine a multiple-choice test where the candidate only gains access to his/her answers, without obtaining access to the questions of that test. The answers, accessed on their own, would be unintelligible. For instance, EPSO candidates have been trying for years to access their own exam sheets held by EPSO, the European Personnel Selection Office of the European Union, with no success – precisely because EPSO only provides access to the series of letters chosen as answers in the multiple-choice test. Challenges to this practice have all failed, including those brought before the former Civil Service Tribunal of the CJEU (see this case, for example). This particular finding in Nowak closes the barely opened door for EPSO candidates to finally gain access to their whole test sheet.

Obiter dictum 2: reminding Member States they can restrict the right of access

Without an apparent reason, and referring even to the GDPR, the CJEU recalls, as another obiter dictum under the same “it must be said” (§58 and §59), that both Directive 95/46 and the GDPR “provide for certain restrictions of those rights” (§59) – access, erasure etc.

It also specifically refers to grounds that Member States can invoke when limiting the right of access under the GDPR: when such a restriction constitutes a necessary measure to safeguard the rights and freedoms of others (§60, §61), or when it is done for other objectives of general public interest of the Union or of a Member State (§61).

These findings are not followed by any other considerations, as the Court concludes with a finding that had already been reached around §50: “the answer to the questions referred is that Article 2(a) of Directive 95/46 must be interpreted as meaning that, in circumstances such as those of the main proceedings, the written answers submitted by a candidate at a professional examination and any comments made by an examiner with respect to those answers constitute personal data, within the meaning of that provision” (§62).

If you want to have a look at a summary of AG Kokott’s excellent Conclusions in this case and then compare them to the judgment of the Court, click here. The Court did follow the Conclusions to a great extent.

 

A Conversation with Giovanni Buttarelli about The Future of Data Protection: setting the stage for an EU Digital Regulator

The nature of the digital economy is such that it will force the creation of multi-competent supervisory authorities sooner rather than later. What if the European Data Protection Board were to become, in the next 10 to 15 years, an EU Digital Regulator, looking at matters concerning data protection, consumer protection and competition law, with “personal data” as the common thread? This is the vision Giovanni Buttarelli, the European Data Protection Supervisor, laid out last week in a conversation we had at the IAPP Data Protection Congress in Brussels.

The conversation was a one-hour session in front of an over-crowded room in The Arc, a cozy amphitheater-like venue conducive to bold ideas being expressed in a stimulating exchange.

To begin with, I reminded the Supervisor that at the very beginning of his mandate, in early 2015, he published the 5-year strategy of the EDPS. At that time the GDPR wasn’t adopted yet and the Internet of Things was taking off. Big Data had been a big thing for a while and questions about the feasibility and effectiveness of a legal regime that is centered around each data item that can be traced back to an individual were popping up. The Supervisor wrote in his Strategy that the benefits brought by new technologies should not happen at the expense of the fundamental rights of individuals and their dignity in the digital society.

Big data will need equally big data protection, he wrote then, thus suggesting that the answer to Big Data is not less data protection, but enhanced data protection.

I asked the Supervisor if he thinks that the GDPR is the “big data protection” he was expecting or whether we need something more than what the GDPR provides for. And the answer was that “the GDPR is only one piece of the puzzle”. Another piece of the puzzle will be the ePrivacy reform, and another one will be the reform of the regulation that provides data protection rules for the EU institutions and that creates the legal basis for the functioning of the EDPS. I also understood from our exchange that a big part of the puzzle will be effective enforcement of these rules.

The curious fate of the European Data Protection Board

One centerpiece of enforcement is the future European Data Protection Board, which is currently being set up in Brussels so as to be functional on 25 May 2018, when the GDPR becomes applicable. The European Data Protection Board will be a unique EU body: it will have a European nature, being funded by the EU budget, but it will be composed of commissioners from national data protection authorities, who will adopt its decisions, and it will rely for its day-to-day activity on a European Secretariat. The Secretariat of the Board will be ensured by dedicated staff of the European Data Protection Supervisor.

The Supervisor told the audience that he has either already hired or plans to hire a total of “17 geeks” to add to his staff, most of whom will be part of the European Data Protection Board Secretariat. The EDPB will be functional from Day 1 and, apparently, there are plans for some sort of inauguration of the EDPB to be celebrated at midnight on the night of 24 to 25 May next year.

These are my thoughts here: the nature of the EDPB is as unique as the nature of the EU (those of you who studied EU law certainly remember from law school days how we were told that the EU is a sui generis type of economic and political organisation). In fact, the EDPB may very well serve as a test model for ensuring supervision and enforcement in other EU policy areas. The European Commission could test the waters to see whether such a mixed national/European enforcement mechanism is feasible.

There is a lot of pressure on effective enforcement when it comes to the GDPR. We dwelled on enforcement, and one question that inevitably appeared was about the trend that is starting to take shape in Europe, of having competition authorities and consumer protection authorities engage in investigations together with, or in parallel with, data protection authorities (see here, here and here).

It’s time for a big change, and time for the EU to have a global approach, the Supervisor said. And a change that will require some legislative action. “I’m not saying we will need a European FTC (US Federal Trade Commission – ed.), but we will need a Digital EU Regulator“, he added. This Digital Regulator would have the powers to also look into competition and consumer protection issues raised by the processing of personal data (in addition, that is, to data protection issues). Acknowledging that these days there is legislative fatigue in Brussels surrounding privacy and data protection, the Supervisor said he will not bring this idea to the attention of the EU legislator right now. But he certainly plans to do so, maybe even as soon as next year. The Supervisor thinks that the EDPB could morph into this kind of Digital Regulator sometime in the future.

The interplay among these three fields of law has been on the Supervisor’s mind for some time now. The EDPS has already issued four Opinions that set the stage for this proposal – see the Preliminary Opinion on “Privacy and competitiveness in the age of Big Data: the interplay between data protection, competition law and consumer protection in the digital economy“, Opinion 4/2015 “Towards a new digital ethics“, Opinion 7/2015 “Meeting the Challenges of Big Data“, and finally Opinion 8/2016 on “coherent enforcement of fundamental rights in the age of Big Data“. So this is certainly something the data protection bubble should keep its eyes on.

Enhanced global enforcement initiatives

Another question that had to be asked on enforcement was whether we should expect more concentrated and coordinated action of privacy commissioners on a global scale, in GPEN-like structures. The Supervisor revealed that the privacy commissioners that meet for the annual International Conference are “trying to complete an exercise about our future”. They are currently analyzing the idea of creating an entity with legal personality that will look into global enforcement cases.

Ethics comes on top of legal compliance

Another topic the conversation went to was “ethics”. The EDPS has been at the forefront of bringing the ethics approach into privacy and data protection law debates, by creating the Ethics Advisory Group at the beginning of 2016. I asked the Supervisor whether there is a danger that, by bringing such a volatile concept into the realm of data protection, companies would look at this as an opportunity to circumvent strict compliance and rely on mere self-assessments that their uses of data are ethical.

“Ethics comes on top of data protection law implementation”, the Supervisor explained. According to my understanding, ethics is brought into the data protection realm only after a controller or processor is already compliant with the law: when they have to choose between equally legal decisions, they should rely on ethics to take the right one.

We did discuss other things during this session, including the 2018 International Conference of Privacy Commissioners that will take place in Brussels, and the Supervisor received some interesting questions from the audience at the end, including about the Privacy Shield. But a blog post can only be so long.

 

Note: The Supervisor’s quotes are so short in this blog post because, as the moderator, I did my best to follow the discussion and steer it rather than take notes. So the quotes come from the brief notes I managed to take during this conversation.

Why did Facebook just receive (one of) the biggest data protection fine(s) on record

The Spanish Data Protection Authority announced today that it fined Facebook 1.2 million euro for several breaches of the Spanish Data Protection Law. Here’s a brief note in English from Politico.eu and the full press release of the Spanish DPA (in ES).

To my knowledge, this is the biggest fine issued by a Data Protection Authority in Europe for breaches of data protection law (as always, please correct me in the comments below and I will make the changes. UPDATE: It’s worth noting that the Italian Garante, in an investigation conducted in conjunction with the Guardia di Finanza – a specialised body investigating financial criminal conduct – issued in February this year a total fine of 5.8 million euro to a company that was transferring money from Italy to China on behalf of persons without their knowledge, which also meant it was processing personal data without consent. The total was reached by adding up the fines for unlawfully processing the data of every person affected).

According to the press release, the Spanish DPA found two “serious breaches” and one “very serious breach” of the Spanish Data Protection Law. This investigation is part of a joint initiative of a Contact Group composed of the DPAs of Belgium, France, Hamburg and the Netherlands.

So what prompted this record fine?

According to the press release (Please note that all quotes are unofficial translation, made by me, so they must not be relied on for legal advice. UPDATE: An official press release is now available in English):

  • Personal data on political views, religious beliefs, sex, personal preferences or location are collected directly, through the mere interaction of the data subject with Facebook services or with third-party webpages, without clearly informing the user about the use and purposes of this data collection.
  • Facebook does not obtain unequivocal, specific and informed consent from users to process their data, because it does not properly inform data subjects.

Each of the serious breaches was fined with 300.000 EUR and the very serious breach with 600.000 EUR – two fines of 300.000 plus one of 600.000, adding up to the 1.2 million euro total.

The very serious breach was that “the social network processes special categories of data for marketing purposes, among others, without obtaining explicit consent of users, as requested by the data protection law”.

“The investigation allowed to prove that Facebook does not inform users in an exhaustive and clear manner about the data that they are going to collect and the processing operations they are going to engage in with that data, limiting themselves to only giving some examples. In particular, the social network collects other data derived from the interaction carried out by users, both on the platform itself and on third-party websites, without them being able to clearly perceive the data that Facebook collects about them, or the purposes for which the data is collected”, according to the press release.

The DPA also took into account that “users are not informed on how their data are processed through the use of cookies – some of them used exclusively for marketing purposes and some of them used for a purpose that the company categorised as “secret” – when they access web pages that do not belong to the company but that contain the “Like” button”. The DPA also mentions the situation of users who are not registered with the social platform but at some point visit one of the platform’s pages – their data is also retained by the social network.

The DPA also found that “the privacy policy contains general formulations that are not clear, and it obliges the user to access a multitude of links to be able to read it”. On the one hand, the DPA notes, a Facebook user with an average knowledge of how new technology works is not able to grasp the full extent of the collection of data, how it is subsequently used, or why it is used. On the other hand, non-users are not at all able to be aware of how their data is used.

Finally, the DPA also referred to the fact that it was able to prove that Facebook does not delete the data it collects on the basis of the online browsing habits of users, retaining it and reusing it in association with the same user. “Concerning data retention, when a user deletes their account and asks for the deletion of their data, Facebook retains and processes the data for another 17 months through a cookie. This is why the DPA considers that the personal data of users is not completely deleted either when it stops being necessary for the purposes for which it was collected, or when the user explicitly requests its deletion“.

This decision goes to show, yet again, how important transparency towards the data subject is! As you will also see soon in my commentary on the Barbulescu v Romania judgment of the ECtHR Grand Chamber from last week, correctly and fully informing the data subject is key to data protection compliance.

 

Exam scripts and examiner’s corrections are personal data of the exam candidate (AG Kokott Opinion in Nowak)

AG Kokott delivered her Opinion on 20 July in Case C-434/16 Nowak v Data Protection Commissioner, concluding that “a handwritten examination script capable of being ascribed to an examination candidate, including any corrections made by examiners that it may contain, constitutes personal data within the meaning of Article 2(a) of Directive 95/46/EC” (Note: all highlights in this post are mine).
This is a really exciting Opinion because it provides insight into:

  • the definition of personal data,
  • the purpose and the functionality of the rights of the data subject,
  • the idea of abusing data protection related rights for non-data protection purposes,
  • how the same ‘data item’ can be personal data of two distinct data subjects (examiners and examinees),
  • what constitutes a “filing system” of personal data processed otherwise than by automated means.

But also because it technically (even if not literally) invites the Court to change its case-law on the definition of personal data, and specifically the finding that information consisting in a legal assessment of facts related to an individual does not qualify as personal data (see C-141/12 and C-372/12 YS and Others).

The proceedings were initially brought before the Irish courts by Mr Nowak, who, after failing an exam organised by a professional association of accountants (CAI) four times, asked to see his exam sheet on the basis of the right to access his own personal data. Mr Nowak submitted a request to access all his personal data held by CAI and received 17 items, none of which was the exam sheet. He then submitted a complaint to the Irish Data Protection Commissioner, who decided not to investigate it, arguing that an exam sheet is not personal data. The decision not to investigate on this ground was challenged in front of a Court. Once the case reached the Irish Supreme Court, it was referred to the Court of Justice of the EU to clarify whether an exam sheet falls under the definition of “personal data” (§9 to §14).

Analysis relevant both for Directive 95/46 and for the GDPR

Yet again, AG Kokott refers to the GDPR in her Conclusions, clarifying that “although the Data Protection Directive will shortly be repealed by the General Data Protection Regulation, which is not yet applicable, the latter will not affect the concept of personal data. Therefore, this request for a preliminary ruling is also of importance for the future application of the EU’s data protection legislation” (m.h.).

The nature of an exam paper is “strictly personal and individual”

First, the AG observes that “the scope of the Data Protection Directive is very wide and the personal data covered by the Directive is varied” (§18).

The Irish DPC argued that an exam script is not personal data because “examination exercises are normally formulated in abstract terms or relate to hypothetical situations”, which means that “answers to them are not liable to contain any information relating to an identified or identifiable individual” (§19).

This view was not followed by the AG, who explained that it is incongruent with the purpose of an exam. “In every case“, she wrote, “the aim of an examination — as opposed, for example, to a representative survey — is not to obtain information that is independent of an individual. Rather, it is intended to identify and record the performance of a particular individual, i.e. the examination candidate”  (§24; m.h.). Therefore, “every examination aims to determine the strictly personal and individual performance of an examination candidate. There is a good reason why the unjustified use in examinations of work that is not one’s own is severely punished as attempted deception” (§24; m.h.).

What about exam papers identified by codes?

In a clear indication that pseudonymized data are personal data, the AG further noted that an exam script is personal data also in those cases where instead of bearing the examination candidate’s name, the script has an identification number or bar code: “Under Article 2(a) of the Data Protection Directive, it is sufficient for the existence of personal information that the data subject may at least be indirectly identified. Thus, at least where the examination candidate asks for the script from the organisation that held the examination, that organisation can identify him by means of the identification number” (§28).

Characteristics of handwriting, personal data themselves 

The AG accepted the argument of Mr Nowak that answers to an exam that are handwritten “contain additional information about the examination candidate, namely about his handwriting” (§29). Therefore, the characteristics of the handwriting are personal data themselves. The AG explains that “a script that is handwritten is thus, in practice, a handwriting sample that could at least potentially be used at a later date as evidence to determine whether another text was also written in the examination candidate’s writing. It may thus provide indications of the identity of the author of the script” (§29). According to the AG, it’s not relevant whether such a handwriting sample is a suitable means of identifying the writer beyond doubt: “Many other items of personal data are equally incapable, in isolation, of allowing the identification of individuals beyond doubt” (§30).

Classifying information as ‘personal data’ is a stand-alone exercise (it does not depend on whether rights can be exercised)

The Irish DPC argued that one of the reasons why exam scripts are not personal data in this case is that the “purpose” of the right of access and the right to rectification of personal data precludes them from being “personal data” (§31). The DPC relied on Recital 41 of Directive 95/46, which specifies that any person must be able to exercise the right of access to data relating to him which is being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing. “The examination candidate will seek the correction of incorrect examination answers”, the argument goes (§31).

AG Kokott rebuts this argument by acknowledging that the classification of information as personal data “cannot be dependent on whether there are specific provisions about access to this information” or on eventual problems with rectification of data (§34). “If those factors were regarded as determinative, certain personal data could be excluded from the entire protective system of the Data Protection Directive, even though the rules applicable in their place do not ensure equivalent protection but fragmentary protection at best” (§34).

Even if the classification of information as “personal data” did depend in some way on the purpose of the right of access, the AG makes it clear that this purpose is not strictly linked to rectification, blocking or erasure: “data subjects generally have a legitimate interest in finding out what information about them is processed by the controller” (§39). This finding is backed up by the use of “in particular” in Recital 41 of the Directive (§39).

The purpose of processing and… the passage of time, both relevant for obtaining access, rectification

After clarifying that it’s irrelevant what an individual wants to do with their data once accessed (see also the summary below on the ‘abuse of rights’), AG Kokott explains that a legitimate interest in correcting exam-script-related data is conceivable.

She starts from the premise that “the accuracy and completeness of personal data pursuant to Article 6(1)(d) must be judged by reference to the purpose for which the data was collected and processed” (§35). The AG further identifies the purpose of an exam script as determining  “the knowledge and skills of the examination candidate at the time of the examination, which is revealed precisely by his examination performance and particularly by the errors in the examination” (§35). “The existence of errors in the solution does not therefore mean that the personal data incorporated in the script is inaccurate”, the AG concludes (§35).

Rectification could be achieved if, for instance, “the script of another examination candidate had been ascribed to the data subject, which could be shown by means of, inter alia, the handwriting, or if parts of the script had been lost” (§36).

The AG also found that the legitimate interest of the individual to have access to their own data is strengthened by the passage of time, to the extent that their recollection of the contents of their answer is likely to be considerably weaker a few years after the exam. This makes it possible that “a genuine need for information, for whatever reasons, will be reflected in a possible request for access. In addition, there is greater uncertainty with the passing of time — in particular, once any time limits for complaints and checks have expired — about whether the script is still being retained. In such circumstances the examination candidate must at least be able to find out whether his script is still being retained” (§41).

Is Mr Nowak abusing his right of access under data protection law?

AG Kokott recalls the CJEU’s case-law on the “abuse of rights” and the double test required by the Court to identify whether there has been any abuse of rights in a particular case (C-423/15 Kratzer and the case-law cited there at §38 to §40), which can be summed up as (§44):

i) has the purpose of the EU legislation in question been misused?

ii)  is the essential aim of the transaction to obtain an undue advantage?

The DPC submitted during the procedure that if exam scripts were considered personal data, “a misuse of the aim of the Data Protection Directive would arise in so far as a right of access under data protection legislation would allow circumvention of the rules governing the examination procedure and objections to examination decisions” (§45).

The AG considers that “any alleged circumvention of the procedure for the examination and objections to the examination results via the right of access laid down by data protection legislation would have to be dealt with using the provisions of the Data Protection Directive” and she specifically refers to the restrictions to the right of access laid down in Article 13 of the Directive with the aim “to protect certain interests specified therein” (§46). She also points out that if restricting access to exam scripts cannot be brought within those exceptions, then “it must be recognised that the legislature has given precedence to the data protection requirements which are anchored in fundamental rights over any other interests affected in a specific instance” (§47).

The AG also looks at the exceptions to the right of access under the GDPR and finds that it is more nuanced than the Directive in this regard. “First, under Article 15(4) of the regulation, the right to obtain a copy of personal data is not to adversely affect the rights and freedoms of others. Second, Article 23 of the regulation sets out the grounds for a restriction of data protection guarantees in slightly broader terms than Article 13 of the Directive, since, in particular, protection of other important objectives of general public interest of the Union or of a Member State pursuant to Article 23(1)(e) of the regulation may justify restrictions” (§48).

However, she does not seem to regard this slight broadening of the scope of exemptions in the GDPR as supporting a finding of abuse of rights in this particular case.

The AG also argues that “on the other hand, the mere existence of other national legislation that also deals with access to examination scripts is not sufficient to allow the assumption that the purpose of the Directive is being misused” (§49). She concludes that even if such misuse would be conceivable, the second limb of the “abuse of rights” test would not be satisfied: “it is still not apparent where the undue advantage lies if an examination candidate were to obtain access to his script via his right of access. In particular, no abuse can be identified in the fact that someone obtains information via the right of access which he could not otherwise have obtained” (§50).

The examiner’s corrections on the exam script are the examinee’s personal data and the examiner’s own personal data at the same time

The AG looks into whether any corrections made by the examiner on the examination script are also personal data with respect to the examination candidate (a question raised by some of the parties), even though she considers that the answer will not impact the result of the main proceedings (§52, §53).

It is apparent that the facts of this case resemble the facts of YS and Others, where the Court refused extension of the right of access to the draft legal analysis of an asylum application on the grounds that that did not serve the purpose of the Data Protection Directive but would establish a right of access to administrative documents. The Court argued in YS that such an analysis “is not information relating to the applicant for a residence permit, but at most information about the assessment and application by the competent authority of the law to the applicant’s situation” (§59; see YS and Others, §40). The AG considers that only “at first glance” the cases are similar. But she doesn’t convincingly differentiate between the two cases in the arguments that follow.

However, she is convincing when explaining why the examiner’s corrections are “personal data”. AG Kokott explains that the purpose of the comments made by examiners on an exam script is “the evaluation of the examination performance and thus they relate indirectly to the examination candidate” (§61). It does not matter that the examiners don’t know the identity of the examination candidate who produced the script, as long as the candidate can be easily identified by the organisation holding the examination (§60 and §61).

The AG further adds that “comments on an examination script are typically inseparable from the script itself … because they would not have any informative value without it” (§62). And it is “precisely because of that close link between the examination script and any corrections made on it”, that “the latter also are personal data of the examination candidate pursuant to Article 2(a) of the Data Protection Directive” (§63).

In an important statement, the AG considers that “the possibility of circumventing the examination complaint procedure is not, by contrast, a reason for excluding the application of data protection legislation” (§64). “The fact that there may, at the same time, be additional legislation governing access to certain information is not capable of superseding data protection legislation. At most it would be admissible for the individuals concerned to be directed to the simultaneously existing rights of information, provided that these could be effectively claimed” (§64).

Finally, the AG points out “for the sake of completeness” that “corrections made by the examiner are, at the same time, his personal data”. AG Kokott sees the potential conflict between the right of the candidate to access their personal data and the right of the examiners to protect their personal data and underlines that the examiner’s rights “are an appropriate basis in principle for justifying restrictions to the right of access pursuant to Article 13(1)(g) of the Data Protection Directive if they outweigh the legitimate interests of the examination candidate” (§65).

The AG considers that “the definitive resolution to this potential conflict of interests is likely to be the destruction of the corrected script once it is no longer possible to carry out a subsequent check of the examination procedure because of the lapse of time” (§65).

An exam script forms part of a filing system

One last consideration made by AG Kokott is whether processing of an exam script would possibly fall outside the scope of Directive 95/46, considering that it does not seem to be processed using automated means (§66, §67).

The AG points out that the Directive also applies to personal data processed otherwise than by automated means as long as they form part of a “filing system”, even if this “filing system” is not electronically saved (§69).

“This concept covers any structured set of personal data which is accessible according to specific criteria. A physical set of examination scripts in paper form ordered alphabetically or according to other criteria meets those requirements” (§69), concludes the AG.
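A minimal sketch of this criterion, assuming a hypothetical alphabetical archive of paper scripts (the names and the lookup function are mine, purely illustrative): structure plus retrievability by a specific criterion is what turns a pile of paper into a “filing system”.

```python
# Hypothetical sketch of the "filing system" criterion (§69): a set of paper
# scripts qualifies once it is structured and accessible according to a
# specific criterion - here, alphabetical ordering by candidate name.
paper_scripts = sorted(
    ["Murphy, A.", "Nowak, P.", "O'Brien, C."]  # the alphabetical ordering is the structure
)

def retrieve(candidate: str) -> bool:
    """Accessible according to a specific criterion (the candidate's name)."""
    return candidate in paper_scripts

print(retrieve("Nowak, P."))  # True - the archive behaves as a filing system
```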

Conclusion. What will the Court say?

The Conclusions of AG Kokott in Nowak contain a thorough analysis, which brings several dimensions to the data protection debate that have been rarely considered by Courts – the self-standing importance of the right of access to one’s own data (beyond any ‘utilitarianism’ of needing it to obtain something else), the relevance of passage of time for the effectiveness of data protection rights, the limits of the critique that data protection rights may be used to achieve other purposes than data protection per se, the complexity of one data item being personal data of two different individuals (and the competing interests of those two individuals).

The Court will probably closely follow the Conclusions of the AG for most of the points she raised.

The only contentious point will be the classification of an examiner’s corrections as personal data of the examined candidate, because following the AG would mean that the Court reverses its case-law from YS and Others.

If we apply the criteria developed by AG Kokott in this Opinion, it is quite clear that the analysis concerning YS and their request for asylum is personal data: the legal analysis is closely linked to the facts concerning YS and the other asylum applicants, and the fact that there may be additional legislation governing access to certain information (administrative procedures in the case of YS) is not capable of superseding data protection legislation. Moreover, if we add the argument that access to one’s own personal data is valuable in itself and does not need to serve any other purpose, reversing this case-law is even more likely.

The only arguable difference between this case and YS and Others is that, unlike what the AG found in §62 (“comments on an examination script are typically inseparable from the script itself… because they would not have any informative value without it”), it is conceivable that a legal analysis in general may have value by itself. However, a legal analysis of particular facts is void of value when applied to different individual facts. In this sense, a legal analysis can also be considered inseparable from the particular facts it assesses. What would be relevant in classifying it as personal data would then remain the identifiability of the person that the particularities refer to…

I was never convinced by the argumentation of the Court (or of AG Sharpston, for that matter) in YS and Others and I would welcome either reversing this case-law (which would be compatible with what I was expecting the outcome of YS to be) or having a more convincing argumentation as to why such an analysis/assessment of an identified person’s specific situation is not personal data. However, I am not getting my hopes up. As AG Kokott observed, the issue in the main proceedings can be solved without getting into this particular detail. In any case, I will be looking forward to this judgment.

(Summary and analysis by dr. Gabriela Zanfir-Fortuna)

 

Highlights of the draft LIBE report on the ePrivacy Reg

The draft Report prepared by MEP Marju Lauristin for the LIBE Committee containing amendments to the ePrivacy Regulation was published last week on the website of the European Parliament.

The MEP announced she will be presenting the Report to her colleagues in the LIBE Committee on 21 June. The draft Report will need to be adopted first by the LIBE Committee and at a later stage by the Plenary of the European Parliament. The Parliament will then sit in the trilogue together with the European Commission and the Council (once the latter also adopts an amended text), finding the compromise among the three versions of the text.

Overall, the proposed amendments strengthen privacy protections for individuals. The big debate of whether there should be an additional exemption to confidentiality of communications based on the legitimate interest of service providers and other parties to have access to electronic communications data was solved in the sense that no such exemption was proposed (following calls in this sense by the Article 29 Working Party, the European Data Protection Supervisor and a team of independent academics). The draft report also contains strong wording to support end-to-end encryption, as well as support for Do-Not-Track technology and a new definition of the principle of confidentiality of communications in the age of the Internet of Things.

Without pretending this is a comprehensive analysis, here are 20 points that caught my eye after a first reading of the amendments (added text is bolded and italicised):

1) Clarity regarding what legitimate grounds for processing prevail if both the GDPR and the ePrivacy Reg could apply to a processing operation: those of the ePrivacy Reg. (“Processing of electronic communications data by providers of electronic communications services should only be permitted in accordance with, and on a legal ground specifically provided for under, this Regulation” – Recital 5). The amendment to Recital 5 further clarifies the relationship between the GDPR and the ePrivacy Reg, specifying that the ePrivacy Reg “aims to provide additional and complementary safeguards taking into account the need for additional protection as regards the confidentiality of communications”.

2) The regulation should be applicable not only to information “related to” the terminal equipment of end-users, but also to information “processed by” it. (“…and to information related to or processed by the terminal equipment of end-users” – Article 2; see also the text proposed for Article 3(1)(c)). This clarifies the material scope of the Regulation, leaving less room for interpretation of what “information related to” means.

3) The link to the definitions of the Electronic Communications Code is removed. References to those definitions are replaced by self-standing definitions for “electronic communications network”, “electronic communications service”, “interpersonal communications service”, “number-based interpersonal communications service”, “number-independent interpersonal communications service”, “end-user”. For instance, the new definition proposed for “electronic communications service” is “a service provided via electronic communications networks, whether for remuneration or not, which encompasses one or more of the following: an ‘internet access service’ as defined in Article 2(2) of Regulation (EU) 2015/2120; an interpersonal communications service; a service consisting wholly or mainly in the conveyance of the signals, such as a transmission service used for the provision of a machine-to-machine service and for broadcasting, but excludes information conveyed as part of a broadcasting service to the public over an electronic communications network or service except to the extent that the information can be related to the identifiable subscriber or user receiving the information” (Amendment 49; my underline).

4) Limitation of the personal scope of key provisions of the Regulation to natural persons. The draft report proposes two definitions to delineate the personal scope of the Regulation – “end-users” and “users”. While an “end-user” is defined as “a legal entity or a natural person using or requesting a publicly available electronic communications service“, a “user” is defined as “any natural person using a publicly available electronic communications service (…)“. Key provisions of the Regulation are only applicable to users, and especially the proposed principle of confidentiality of communications. (See Amendments 58 and 59). This proposal may unnecessarily limit the scope of application of the right to respect for private life, which, as opposed to the right to the protection of personal data, is theoretically (the CJEU did not yet explicitly state this) recognised as also protecting the privacy and confidentiality of communications of legal persons (through correspondence with Article 8 ECHR and how it has been interpreted by the European Court of Human Rights; for an analysis, see p. 17 and following HERE). The current ePrivacy Directive equally protects the confidentiality of communications of both natural and legal persons.

5) Enhanced definition of “electronic communications metadata”, to also include “data broadcasted or emitted by the terminal equipment to identify users’ communications and/or the terminal equipment or its location and enable it to connect to a network or to another device“.

6) Enhanced definition of “direct marketing”, to also include advertising in video format, in addition to the written and oral formats, and advertising served or presented to persons, not only “sent”. Could this mean that the definition of direct marketing will cover street advertising panels reacting to the passer-by? Possibly.

7) Extension of the principle of confidentiality of communications to machine-to-machine communications. A new paragraph is added to Article 5 (Amendment 59): “Confidentiality of electronic communications shall also include terminal equipment and machine-to-machine communications when related to a user”.

8) “Permitted” processing of electronic communications data is replaced by “lawful” processing. This change of wording de-emphasises the character of the permitted processing operations as exemptions to the general principle of confidentiality, which may have consequences when courts come to interpret the law.

9) While the proposed wording for the existing lawful grounds for processing is stricter (processing is allowed “only if”; necessity is replaced with “technically strict necessity”), additional grounds for processing are added (see Amendments 64 to 66, to Article 6; see also Amendments 77, 79, 80 to Article 8).

10) A “household exemption” is introduced, similar to the one provided for by the GDPR, enhanced with a “work purposes exemption”: “For the provision of a service explicitly requested by a user of an electronic communications service for their purely individual or individual work related usage (…)“. In such circumstances, electronic communications data may be processed “without the consent of all users”, but “only where such requested processing produces effects solely in relation to the user who requested the service” and “does not adversely affect the fundamental rights of another user or users”. This exemption raises some questions, the first being: does anyone use an electronic communications service for purposes other than “purely individual” or “work related” purposes? If you think so, leave a comment with examples. Another is what “without the consent of all users” means (see Amendment 71, to Article 6).

11) An exception for tracking employees is included in the proposal. The collection of information from a user’s terminal equipment (for instance, via cookies) would be permitted “if it is necessary in the context of employment relationships“, but only to the extent the employee is using equipment made available by the employer and to the extent this monitoring “is strictly necessary for the functioning of the equipment by the employee” (see Amendment 82). It remains to be seen what “functioning of the equipment by the employee” means. This exemption seems to have the same effect as the one in Article 8(1)(a), which allows such collection of information if “it is strictly technically necessary for the sole purpose of carrying out the transmission of an electronic communication over an electronic communications network”. On the other hand, it should be kept in mind that the ePrivacy rules are not intended to apply to closed groups of end-users, such as corporate intranet networks, access to which is limited to members of an organisation (see Recital 13 and Amendment 11).

12) Consent for collecting information from terminal equipment “shall not be mandatory to access the service”. This means, for instance, that even if users do not consent to the placing of cookies tracking their activity online, they should still be allowed to access the service they are requesting. While this would already follow from the requirement that consent be “freely given”, enshrining this wording in a legal provision certainly leaves no room for interpretation (see Amendment 78 to Article 8). Moreover, this rule is strengthened by a separate paragraph introduced in Article 8, according to which “No user shall be denied access to any information society service or functionality, regardless of whether this service is remunerated or not, on grounds that he or she has not given his or her consent under Article 8(1)(b) to the processing of personal information and/or the use of storage capabilities of his or her terminal equipment that is not necessary for the provision of that service or functionality.” (see Amendment 83). Such wording would probably put to rest concerns that personal data could be treated as “counter-performance” (equivalent to money) for services.

13) All further use of electronic communications data collected under ePrivacy rules is prohibited. A new paragraph inserted in Article 6 simply states that “Neither providers of electronic communications services, nor any other party, shall further process electronic communications data collected on the basis of this Regulation” (see Amendment 72).

14) Wi-Fi tracking and similar practices involving the collection of information emitted by terminal equipment would only be possible with the informed consent of the user or if the data are anonymised and the risks adequately mitigated (a third exception being, of course, when such data are accessed for the purpose of establishing a connection). This is a significant change compared to the Commission’s text, which allowed such tracking in principle, provided the user is informed and given the possibility to opt out (“stop or minimise the collection”) (see Amendments 85, 86). The draft report also proposes a new paragraph to Article 8 containing measures to mitigate risks, including only collecting data for the purpose of “statistical counting”, anonymisation or deletion of data “immediately after the purpose is fulfilled”, and effective opt-out possibilities. To make the “statistical counting” mitigation concrete, here is a minimal, purely illustrative sketch (in Python, using only the standard library) of how a venue could count devices detected via Wi-Fi probe requests without retaining identifiable MAC addresses: each address is hashed with a salt that is discarded at the end of the counting window. Note that whether salted hashing amounts to anonymisation in the legal sense is itself debated; this only shows the engineering idea.
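```python
# Illustrative sketch only: "statistical counting" of Wi-Fi devices
# without storing identifiable MAC addresses.
import hashlib
import secrets

# A fresh random salt per counting window; discarding it afterwards makes
# the stored hashes practically impossible to link back to a device.
window_salt = secrets.token_bytes(16)
seen = set()

def count_device(mac_address: str) -> None:
    """Record a device for footfall counting without keeping its MAC."""
    digest = hashlib.sha256(window_salt + mac_address.encode()).digest()
    seen.add(digest)

count_device("aa:bb:cc:dd:ee:ff")
count_device("aa:bb:cc:dd:ee:ff")  # same device seen twice, counted once
print(f"Unique devices this window: {len(seen)}")
# At the end of the window: report len(seen), then delete both the salt and
# the set, so no per-device data survives the purpose.
```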

15) Significantly stronger obligations for privacy by default are proposed, with a clear preference for a Do-Not-Track (DNT) mechanism. Article 10 is enhanced so that all software placed on the market must “by default, offer privacy protective settings to prevent other parties from storing information on the terminal equipment of a user and from processing information already stored on that equipment” (see Amendment 95). Opt-outs shall be available upon installation (Amendment 96). What is remarkable is a new obligation that “the settings shall include a signal which is sent to the other parties to inform them about the user’s privacy settings. These settings shall be binding on, and enforceable against, any other party” (see Amendment 99). The rapporteur explains at the end of the report that “the settings should allow for granulation of consent by the user, taking into account the functionality of cookies and tracking techniques and DNTs should send signals to the other parties informing them of the user’s privacy settings. Compliance with these settings should be legally binding and enforceable against all other parties”.
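For readers less familiar with Do-Not-Track: in today’s practice the signal is simply an HTTP request header (“DNT: 1”) sent by the browser; what the draft report would change is its legal force, not its technical shape. A minimal, hypothetical sketch of a web server honouring the signal (using the Flask framework purely for illustration):

```python
# Illustrative sketch only: reading the browser's Do-Not-Track header.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Under the draft report's logic, this signal would be legally binding:
    # a "1" means no tracking cookies and no fingerprinting for this user.
    if request.headers.get("DNT") == "1":
        return "Welcome. Your Do-Not-Track preference is being honoured."
    return "Welcome. No Do-Not-Track signal received."

if __name__ == "__main__":
    app.run()
```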

16) A national Do Not Call register is proposed for opting out of unsolicited voice-to-voice marketing calls (see Amendment 111).

17) End-to-end encryption is proposed as a default security measure for ensuring the confidentiality of communications. Additionally, strong wording is included to prevent Member States from introducing measures amounting to backdoors. Under the title of “integrity of the communications and information about security risks”, Article 17 is amended to include a newly introduced paragraph that states: “The providers of electronic communications services shall ensure that there is sufficient protection in place against unauthorised access or alterations to the electronic communications data, and that the confidentiality and safety of the transmission are also guaranteed by the nature of the means of transmission used or by state-of-the-art end-to-end encryption of the electronic communications data. Furthermore, when encryption of electronic communications data is used, decryption, reverse engineering or monitoring of such communications shall be prohibited. Member States shall not impose any obligations on electronic communications service providers that would result in the weakening of the security and encryption of their networks and services” (my highlight) (see Amendment 116).
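The point of end-to-end encryption, for the non-technical reader, is that only the communicating endpoints hold the keys, so the provider merely relays ciphertext it cannot read. A minimal, purely illustrative sketch using the PyNaCl library (real messaging systems layer much more on top, such as key verification and forward secrecy):

```python
# Illustrative sketch only: end-to-end encryption between two endpoints.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"confidential communication")

# The provider only ever sees `ciphertext`. Bob decrypts on his own device.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'confidential communication'
```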

18) The possibility of class actions for infringement of the ePrivacy Reg is introduced. End-users would have “the right to mandate a not-for-profit body, organisation or association” to lodge complaints or to seek judicial remedies on their behalf (see Amendments 125 and 126).

19) Infringement of the obligations covered by Article 8 (cookies, Wi-Fi tracking) would be sanctioned with the first tier of fines (the highest one: up to 20 million EUR or 4% of global annual turnover), which is not the case in the Commission’s proposal (see Amendment 131).
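To illustrate the stakes with a hypothetical figure: for a provider with a global annual turnover of EUR 2 billion, the 4% ceiling would permit a fine of up to EUR 80 million, four times the EUR 20 million floor.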

As a bonus, here’s the 20th highlight:

20) Echoing the debate over data analytics using psychographic measurements to influence elections, the report amends an important recital (Recital 20) to refer to the fact that information on terminal equipment may reveal “very sensitive data”, including “details of the behaviour, psychological features, emotional condition and political and social preferences of an individual“. Among other reasons, this justifies the principle that any interference with the user’s terminal equipment should be allowed only with the user’s consent and for specific and transparent purposes (see Amendment 20; also, watch this video from last week’s Digital Assembly in Malta, where around minute 6 the Rapporteur talks about this and points out that “without privacy there will be no democracy”).


Read more:

Other analyses of the LIBE draft report: HERE and HERE.

Overview of the ePrivacy initial proposal by the Commission: HERE.

***
Enjoy what you are reading? Consider supporting pdpEcho