Expert data protection consultants, GDPRXpert, examine the recent Google Right to be Forgotten ruling (Case C-507/17).
The case stemmed from a request for a preliminary ruling made by the French Conseil d’État in proceedings between Google and the French data protection regulator. (Request for a preliminary ruling from the Conseil d’État (France) lodged on 21 August 2017 — Google Inc. v Commission nationale de l’informatique et des libertés (CNIL))
The implications of the decision in the recent ‘Right To Be Forgotten’ case are likely to be far-reaching and controversial. Before any understanding of these implications can be grasped or a sober and objective assessment made, some knowledge of the context and background is necessary. What EU legislation, and in particular the GDPR, sets out about the right will act as an additional tool in assessing the rationality of the conclusions reached in the case. In the light of those conclusions, where does the Right To Be Forgotten (RTBF) now stand? A more insightful question is where should the right now stand? Not everyone will agree on this. Some views may mirror sentiments surrounding the GDPR itself that qualified data protection consultants, such as GDPRXpert, have commented on previously.
Background and Context to the Case.
It has long been recognised that the RTBF exists under EU law. This has been evident since the 1995 Data Protection Directive (‘The Directive’) and from previous case law. More recently, Art. 17 GDPR has set it out clearly. What is also established is that the right is a qualified right and not an absolute right. A normal consequence is that the right must be balanced against other rights competing in the same sphere. The Court of Justice of the European Union (CJEU), in a seminal 2014 case widely referenced as Google Spain, held that Google was a data controller in its processing of personal data in the operation of a search engine.
Google Spain Case C-131/12 (13 May 2014).
In Google Spain a lawyer (the applicant) was objecting to the fact that anyone who searched his name on the Google search engine would obtain links to an article in a newspaper. That article reported the details of a court attachment order against the applicant for the recovery of social security debts. What is noteworthy is that the case pre-dates the GDPR. It was a case that initially fell for consideration within the ambit of ‘The Directive’, and specifically Articles 12(b) and 14(a). Mr González, the lawyer applicant, was seeking to enforce his right of objection. He felt that the material reported in the newspaper article was creating negative publicity and reflected badly on him in his professional capacity. Some events reported in the article concerning Mr González had taken place 16 years previously.
Google had no control over the material in the newspaper report, yet it was determining the purposes and means of the indexing. Anything that showed up when the applicant’s name was entered in the search box was the result of Google’s indexing. Material on third party websites is not controlled by Google. In this case, the information on Mr González is still available in the newspaper publication and can be accessed without the help of Google. Nevertheless, Google was ordered by the Court to comply with the request for erasure.
The Court held that where a person’s name was used in the search, the search engine is obliged to remove from the list of results any links to web pages published by third parties and containing information concerning that person. This stands even when the publication of the information on those pages is lawful. On the facts of the case, the Court held that individuals may request search engines to remove links to inadequate, irrelevant or excessive content relating to them online. In this particular case, the interference with a person’s right to data protection could not be justified merely by the economic interest of the search engine.
After Google Spain
Defining the exact parameters and contours of the judgment has stoked uncertainty and fostered controversy for years. As soon as the ruling was announced, Google introduced new internal procedures. These procedures were to facilitate the changes that the ruling demanded and enable it to assess requests for erasure. Every request had to be assessed on its own merits, applying the criteria mentioned in EU law and in the European Court’s judgment. These criteria relate to the accuracy, adequacy, relevance – including time passed – and proportionality of the links, in relation to the purposes of the data processing (paragraph 93 of the ruling).
Following a successful request, the principal new procedure, known as ‘geo-blocking’, comes to the fore. Geo-blocking, as the word suggests, operates to block access to the information based on the searcher’s location (more on this later). Between the Google Spain case and late 2018, Google received over 700,000 requests for erasure. Over 40% of these were categorised as well-founded and, consequently, the related search results were de-listed. One prerequisite is that the search is based on the person’s name. Other searches, not based on the person’s name, can still lead to the information in the third party link, or the link can be accessed directly. A person would have to submit a request to the data controller of the third party website in order to secure erasure of personal data on that website. We emphasise again the nature of the right: qualified and limited.
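To make the mechanics concrete, the sketch below is a purely illustrative model (not Google’s actual implementation) of the two points just described: de-listing bites only on searches based on the data subject’s name, and the underlying third party page itself is left untouched. Every name, query and URL in it is invented for the example.

```python
# Illustrative sketch only - not Google's actual system. It models the idea
# that de-listing applies to name-based searches, while other queries and
# the source page itself are unaffected. All data below is invented.

# Hypothetical register of successful erasure requests:
# name-based query -> links de-listed for that name
DELISTED = {
    "mario gonzalez": {"https://example-news.es/1998-attachment-order"},
}


def search_results(query: str, results: list[str]) -> list[str]:
    """Return the result list, dropping any links that have been
    de-listed for this query when the query is the person's name."""
    blocked = DELISTED.get(query.strip().lower(), set())
    return [url for url in results if url not in blocked]


if __name__ == "__main__":
    hits = [
        "https://example-news.es/1998-attachment-order",
        "https://example.com/law-firm-profile",
    ]
    # Name-based search: the de-listed link is removed from the results.
    print(search_results("Mario Gonzalez", hits))
    # A search not based on the name still surfaces the article, and the
    # article itself remains on the newspaper's website in any event.
    print(search_results("1998 attachment order Spain", hits))
```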
Google and the French Regulator
Google commenced the process of de-listing results. However, the structure and methodology of the de-listing did not meet with the full approval of the French regulator. There was a reason for this. When Google initiated the new de-listing procedure, it only de-listed in relation to EU domains such as google.es, google.fr, google.de, and so on. Domains outside the EU, such as google.com, were unaffected, resulting in the information remaining conveniently available. In 2016 Google introduced a geo-blocking feature that prevented European users from viewing the de-listed results, but it resisted censoring results for people in other parts of the world. From the viewpoint of the French data protection regulator, the Commission Nationale de l’Informatique et des Libertés (‘CNIL’), this was unsatisfactory.
What CNIL Wanted
CNIL argued that by only de-listing on the EU domains, Google was not giving data subjects’ personal data the protection that the judgment in the case had envisaged. It followed from this that, to ensure full protection of the personal data of data subjects, erasure of the personal data should happen worldwide. If it did not, the certain consequence was continued access to the personal data via other domains. Other methods of circumvention, such as the use of a Virtual Private Network (VPN), could also be employed.
For Google, de-listing worldwide was a disproportionate measure and placed an unduly onerous burden on the operation of its search engine. (GDPRXpert recently looked at disproportionate measures in the context of the visitor books at OPW sites). Applying the RTBF ruling in jurisdictions with strong constitutional protection for freedom of expression and free speech, such as the U.S., was seen as problematic. Google appealed the decision. Principles of territorial jurisdiction and global data flows, which seem incompatible with each other, would now have to undergo further judicial scrutiny.
Article 17 GDPR
Google v CNIL was always going to be a complicated case, as the array of issues involved was open to differing interpretations. To further complicate matters, the introduction of the GDPR in May 2018 repealed the old Directive. Google Spain was decided under Articles 12(b) and 14(a) of Directive 95/46, but Article 17 GDPR now broadens the circumstances in which the right to erasure will apply. Consequently, there was an inevitable focus on interpreting its application and relevance to the facts of this particular case.
This ‘new right’ to erasure (‘right to be forgotten’) is set out under Art. 17 of the GDPR. The grounds for erasure (Art. 17(1)) are enumerated, and the controller is obliged to erase personal data without undue delay where those grounds apply. Primary grounds for erasure include (but are not limited to): the data are no longer needed for their original purpose; consent has been withdrawn and there are no other legal grounds for the processing; the processing was unlawful in the first place; and erasure is required under EU or Member State law. Grounds for refusing to erase the personal data (Art. 17(3)) are also set out, but these are very limited and will only apply ‘where the processing is necessary’ under those stated grounds.
That word ‘necessary’ crops up again and is open to interpretation. Certified GDPR and data protection advisers, GDPRXpert, have explained in previous blogs how the word ‘necessary’, in the context of the GDPR, means more than ‘useful’ or ‘convenient’. We saw previously how much of the debate surrounding the Public Services Card shifted and began to examine specific aspects of the card. For example, when exactly was processing deemed ‘necessary’ in relation to a stated particular purpose?
The RTBF is simultaneously more ambiguous and more ambitious than other rights and is likely to be the subject of more legal challenges. Competing rights, which require balancing against one another, will lead to most of the confrontations. The most likely battleground will be the intersection of the RTBF with the right to freedom of expression and information. Strategists of the opposing factions may be forced to look to the degree of erasure, or to whether any item of data can ever be truly and permanently erased. One thing is certain: nowhere in Art. 17 GDPR is there any mention of de-listing information on a worldwide basis. None of us needs to be a courtroom advocate, but the foregoing should provide us with sharper interpretive tools to assist in our own analysis of the final decision in Google v CNIL.
Google v CNIL
At the core of the case there are two differing perspectives. Google is focused on broader economic and societal implications; CNIL is looking through the prism of individual data protection rights. Four questions were submitted to the Court for a preliminary ruling by the French Conseil d’État:
First, whether the de-referencing following a successful request for erasure must be deployed in relation to all domain names irrespective of the location from where the search based on the requester’s name is initiated, even if that occurs outside of the EU;
Second, if the first question is answered negatively, whether the RTBF must only be implemented in relation to the domain name of the Member State from which the search is deemed to have been operated or, third, whether this must be done in relation to the domain names corresponding to all Member States;
Fourth, whether the RTBF implies an obligation for search engine operators to use geo-blocking so that a search made on a non-EU domain does not display the de-listed results to a user based in (i) the Member State from which the request for erasure emanated or (ii) the territory of the EU more generally.
The Opinion in Google v CNIL
A clearer hint of where the case was going came with the opinion of the Advocate General of the CJEU, delivered on 10 January 2019. The opinion restated the ordering of the rights at stake. What was emphasised once more was that the RTBF involved a balancing exercise against other rights, most especially against the right to freedom of expression. The Advocate General concluded that where a claim for de-referencing has been successful, the search engine operator should only be required to effect de-referencing within the EU. The opinion is non-binding, but in most cases the full court sitting as the Grand Chamber follows the opinion of the Advocate General.
The Grand Chamber Decision in Case C-507/17
The Court held that “The operator of a search engine is not required to carry out a de-referencing on all versions of its search engine. It is, however, required to carry out that de-referencing on the versions corresponding to all the Member States and to put in place measures discouraging internet users from gaining access, from one of the Member States, to the links in question which appear on versions of that search engine outside the EU.”
It went on to cite Google Spain and stated that the Court had already held “that the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful”.
Under the old Directive, and more recently under the GDPR, Google Inc.’s operations fell within the scope of EU legislation on data protection. Global de-referencing would meet the objective of protection under EU law in full, but there were other considerations. Numerous third States do not recognise the right to de-referencing, or have a different approach to that right. The Court added that the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.
Any balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world. There was no evidence, in legal texts or anywhere else, that the EU legislature had struck such a balance. Neither was there any evidence that it had chosen to confer a scope on the rights of individuals going beyond the territory of the Member States. In addition, there was no evidence that it had intended to impose on an operator, such as Google, a de-listing obligation extending to the versions of its search engine that do not correspond to the Member States.
EU law does not provide for cooperation instruments and mechanisms as regards the scope of a de-referencing outside the EU. “Thus, the Court concludes that, currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject, as the case may be, following an injunction from a supervisory or judicial authority of a Member State, to carry out such a de-referencing on all the versions of its search engine.” Nevertheless, EU law does require a search engine operator to carry out such a de-referencing on the versions of its search engine corresponding to all the Member States.
A search engine operator must take sufficiently effective measures to ensure the effective protection of the data subject’s fundamental rights. What this means in practice is that any de-listing or de-referencing “must, if necessary, be accompanied by measures which effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, through a version of that search engine outside the EU, to the links which are the subject of the request for de-referencing”.
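By way of illustration only, the sketch below shows one way such a measure might work in principle: the de-referencing is keyed to the searcher’s apparent location rather than to the domain version being used, so an EU-based user who switches to a non-EU version still does not see the de-listed links. Everything in it (the country list, the register, the function names) is an assumption made for the example, not a description of Google’s actual systems.

```python
# Illustrative sketch only - one possible way to "prevent or seriously
# discourage" access from a Member State via a non-EU version of the
# search engine. All names, data and rules here are assumptions.

EU_COUNTRIES = {"FR", "DE", "ES", "IE", "IT"}  # abbreviated list for the example

# Hypothetical register of de-referenced links per name-based query
DELISTED = {
    "mario gonzalez": {"https://example-news.es/1998-attachment-order"},
}


def serve_results(query: str, results: list[str],
                  searcher_country: str, domain_version: str) -> list[str]:
    """Apply EU-wide de-referencing based on where the searcher appears to
    be located, regardless of which domain version (.fr, .com, ...) is used."""
    blocked = DELISTED.get(query.strip().lower(), set())
    if searcher_country in EU_COUNTRIES:
        # The filter follows the searcher, not the domain: switching from an
        # EU version to a non-EU version changes nothing for an EU-based user.
        return [url for url in results if url not in blocked]
    return results


if __name__ == "__main__":
    hits = [
        "https://example-news.es/1998-attachment-order",
        "https://example.com/law-firm-profile",
    ]
    print(serve_results("Mario Gonzalez", hits, "FR", "example-search.com"))  # filtered
    print(serve_results("Mario Gonzalez", hits, "US", "example-search.com"))  # unfiltered
```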
It will be for the national court to ascertain whether the measures put in place by Google Inc. meet those requirements. Lastly, the Court points out that, while EU law does not currently require a de-referencing to be carried out on all versions of the search engine, it also does not prohibit such a practice. Just as in Google Spain, it was acknowledged that removing irrelevant and outdated links is not tantamount to deleting content. The data will still be accessible, but no longer ubiquitous.
Patrick Rowland, GDPRXpert.
We are GDPR and Data Protection Consultants, with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.
For more details visit www.gdprxpert.ie