The DPC is not infallible.

The DPC is not infallible, and so it is wise to remember that data controllers have legal rights.

There is no doubt that much time has been spent in the media and on this forum debating aspects of the Public Services Card. Data protection consultants GDPRXpert first reported on this in a blog post back on 15 February 2019, where we rightly predicted the main conclusions of the recent investigation by the office of the DPC into the legitimacy of the Public Services Card, and highlighted some of the concerns that the DPC was likely to focus on in the continuing contentious debate.

At that time many feared the PSC represented the introduction of a national identity card by stealth. GDPRXpert wrote at the time that “The government vehemently denied this, and different Ministers for Social Protection (Burton, Varadkar, and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card with no other purposes than to cut down on benefit fraud and streamline operations. Everything now should work more cost-effectively and taxpayer money would be saved.” There is still little impediment standing in the way of its use as a de facto national identity card (See Adrian Weckler, “National ID Card Isn’t Dead”, SINDO, Aug. 18, 2019).

There was a follow-up on the PSC and biometric data on 21 February. On 22 August, data protection consultants GDPRXpert discussed the DPC's findings from its investigation of the PSC. A report was issued and recommendations were made to the Government.

Three central issues were to the fore in the report:

The lack of a lawful basis for processing personal data, apart from processing by the DEASP;

Lack of transparency (in terms of what personal data are processed in the context of SAFE 2/PSC and, for example, how those data are updated and shared with other public sector bodies for the purposes of decision-making); and

Retention of data beyond what is necessary (in particular, the retention of supporting documentation demanded in support of an application was excessive).

Data protection consultants GDPRXpert now have the DEASP link to the report available.

Minister Regina Doherty: ‘We don’t agree with any of the eight findings and we have written to the commission to confirm that.’ Photograph: Dara Mac Donaill / The Irish Times

 

In total, the DPC made eight adverse findings in relation to the card’s introduction and operation. According to Minister Doherty, the Government disagrees with each of these findings. When publishing the findings of the report, the Data Protection Commissioner, Helen Dixon, said the Department had 21 days to provide an update on how it was implementing the finding that it was no longer lawful to require a PSC for services other than welfare. By September 5 the 21 days had expired.

Minister for Employment Affairs and Social Protection Regina Doherty has said her department will not comply with any of the directions from the Data Protection Commissioner (DPC) on its Public Services Card project. “We won’t be complying with any of the instructions with regard to the findings or the instructions in the letter,” the Minister told RTÉ Radio. (https://www.irishtimes.com/news/ireland/irish-news/government-will-not-comply-with-findings-on-public-services-card-1.4021397)

The Government believes that it would be potentially unlawful to withdraw or modify the PSC. A statement confirmed that its intention is to continue to operate the PSC and the SAFE 2 identity authentication process on which it is based. Despite the controversies, the PSC remains popular, with 96 percent of those surveyed saying they were either very satisfied or fairly satisfied with the process (Irish Times, Sept. 17, 2019).

The reactions to the Government ‘daring’ to challenge the findings of the DPC have been surprising. Most data protection consultants would agree that the PSC has a lawful basis, but only in relation to its use for welfare-related services through the DEASP. We have previously highlighted as unlawful any demand for the card in relation to other services unrelated to the DEASP, such as passports, driving licences and more. Agree or disagree, the Government, just like a private citizen, has the right to appeal findings or a decision of the DPC. To deny or question this is to deny or question a basic tenet of the rule of law: access to justice and judicial review. The GDPR will always be interpreted in light of the EU Charter of Fundamental Rights, and in this instance Art. 47 is the most applicable. The independence of the DPC does not mean it “cannot be subject to control or monitoring mechanisms…or to judicial review” (Recital 118, GDPR).

Some of the groups foremost in the criticism, such as the Irish Council for Civil Liberties (ICCL), are groups whose mission embodies supporting the rule of law. Yet the ICCL has been opposed to the PSC from the start, and its opposition has been based more on ideology than on law. “This card unfairly targets economically marginalised people who depend on the State for their welfare payments. It also works in a gendered way, being a requirement for mothers collecting child benefit. Though the DPC report did not focus on these issues, ICCL believes that the structural inequality inherent in the card may well render it illegal”. (See ICCL website)

The DPC did not focus on ‘these issues’ for good reason: they are completely tangential. Opposition to the proposed body cameras to be used by the Gardaí has also been voiced by the council. Again, this seems more ideologically driven than legally focused. In a recent TheJournal.ie poll, over 90% of respondents had no privacy or data protection concerns about the use of body cameras by Gardaí.

Here come the legal bills!

 

The DPC has never claimed to be infallible. Previous cases, such as the Shatter case and the original Schrems case, prove it is not. Indeed, no court has claimed to be infallible either. A superior court overturning a lower court decision is not out of the ordinary. It is simplistic to say the lower court ‘got it wrong’ (though courts do ‘err in law’). In the majority of cases, there is at least some substantive legal validity in differing court opinions. Higher courts may overrule lower courts, but when appeals are exhausted it comes down to the decision of that final court. Ideally, the final decision is one that meets the highest threshold of justice and equity. Justice must be done and be seen to be done.

In the context of the Government appealing the findings of the DPC, there may have been a rush to comment. At this stage, the DPC has not yet made an Enforcement Order. The chief civil servant in charge of the controversial Public Services Card project has said that his department would not be challenging findings of illegality against the card unless it was “absolutely sure that a challenge was not only appropriate but necessary”. Appearing before the Dáil Public Accounts Committee – ostensibly to discuss his department’s most recently published accounts – secretary-general of the Department of Employment Affairs and Social Protection John McKeon wouldn’t be drawn on whether or not that challenge would serve to “undermine” the office of the Data Protection Commissioner, which operates as an independent state regulator.

Again, just because the DPC operates as an independent state regulator does not mean its decisions are above legal challenge by the Government. We can question the basis of any appeal if and when it arises, but we cannot question the right to appeal itself. Graham Doyle, the DPC’s Head of Communications, told TheJournal.ie that the Commission has declined the Department of Employment Affairs and Social Protection’s (DEASP) request for a meeting and plans to proceed with enforcement action. “I can confirm that we have this evening responded to the Department and have declined their request for a meeting.” (https://www.thejournal.ie/data-protection-commission-psc-4797429-Sep2019/)

We await a decision on any enforcement action to be taken by the DPC. In an upcoming blog, we will look at the architecture of enforcement actions under the GDPR and Data Protection Act 2018, with an in-depth look at the appeal processes available.

Patrick Rowland, GDPRXpert.

We are GDPR and Data Protection Consultants, with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie

Long-Awaited Ruling on the Right to be Forgotten.

 

Expert data protection consultants, GDPRXpert, examine the recent Google Right to be Forgotten ruling (Case C-507/17).

The case stemmed from a request for a preliminary ruling lodged by the French Conseil d’État in proceedings involving the French data protection regulator. (Request for a preliminary ruling from the Conseil d’État (France) lodged on 21 August 2017 — Google Inc v Commission nationale de l’informatique et des libertés (CNIL))

 

The implications of the decision in the recent ‘Right To Be Forgotten’ case are likely to be far-reaching and controversial. Before any understanding of these implications can be grasped, or a sober and objective assessment made, some knowledge of the context and background is necessary. What EU legislation, and in particular the GDPR, sets out about the right will act as an additional tool in assessing the rationality of the conclusions reached in the case. In the light of those conclusions, where does the Right To Be Forgotten (RTBF) now stand? A more insightful question is where should the right now stand? Not everyone will agree on this. Some views may mirror sentiments surrounding the GDPR itself that qualified data protection consultants, such as GDPRXpert, have commented on previously.

Background and Context to the Case.

It has long been recognised that the RTBF exists under EU law. This has been evident since the 1995 Data Protection Directive (‘The Directive’) and from previous case law. More recently, Art. 17 GDPR has set it out clearly. What is also established is that the right is a qualified right and not an absolute right. A normal consequence is the balancing of the right against other rights that may be competing in the same sphere. The Court of Justice of the European Union (CJEU), in a seminal 2014 case widely referenced as Google Spain, held that Google was a data controller in its processing of personal data relating to the operation of a search engine.

 

 

Google Spain Case C-131/12 (13 May 2014).

In Google Spain a lawyer (the applicant) was objecting to the fact that anyone who searched his name on the Google search engine would obtain links to an article in a newspaper. That article reported the details of a court attachment order against the applicant for the recovery of social security debts. What is noteworthy is that the case pre-dates the GDPR. It was a case that initially fell for consideration within the ambit of ‘The Directive’, and specifically Articles 12(b) and 14(a). Mr. Gonzalez, the lawyer applicant, was seeking to enforce his right of objection. He felt that the material reported in the newspaper article was creating negative publicity and reflected badly on him in his professional capacity. Some events reported in the article concerning Mr. Gonzalez had taken place 16 years previously.

Google had no control over the material in the newspaper report, yet it was determining the purposes and means of the indexing. Anything that showed up when the applicant’s name was entered in the search box was the result of Google’s indexing. Material on third-party websites is not controlled by Google; indeed, the information on Mr. Gonzalez is still available in the newspaper publication and can be accessed without the help of Google. Nevertheless, Google was ordered by the Court to comply with the request for erasure.

Data protection rights v Freedom of expression and information

The Court held that where a person’s name was used in the search, the search engine is obliged to remove from the list of results any links to web pages published by third parties and containing information concerning that person. This stands even when the publication of the information on those pages is lawful. On the facts of the case, the Court held that individuals may request search engines to remove links to inadequate, irrelevant or excessive content relating to them online. In this particular case, the interference with a person’s right to data protection could not be justified merely by the economic interest of the search engine.

After Google Spain

Defining the exact parameters and contours of the judgment has stoked uncertainty and fostered controversy for years. As soon as the ruling was announced, Google introduced new internal procedures. These procedures were to facilitate the changes that the ruling demanded and enable it to assess requests for erasure. Every request had to be assessed on its own merits to apply the criteria mentioned in EU law and the European Court’s judgment. These criteria relate to the accuracy, adequacy, relevance – including time passed – and proportionality of the links, in relation to the purposes of the data processing (paragraph 93 of the ruling).

Where is that information?

 

Following a successful request, the principal new procedure, known as ‘geo-blocking’, comes to the fore. Geo-blocking, as the word suggests, operates to block access to the information from a searcher’s domain (more on this later). After the Google Spain case and up to late 2018, Google had received over 700,000 requests for erasure. Over 40% of these were categorised as well-founded, and consequently the related search results were de-listed. One prerequisite is that the search is based on the person’s name. Other searches, not based on the person’s name, can still lead to the information in the third-party link, or the link can be accessed directly. A person would have to put in a request with the data controller for the third-party website in order to secure erasure of personal data on that website. We emphasise again the nature of the right: qualified and limited.
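To make the mechanics concrete, here is a minimal Python sketch of the kind of logic that name-based, region-scoped de-listing implies. It is purely illustrative: the region codes, names and URLs are invented, and it is not a description of Google's actual systems.

```python
# Hypothetical sketch of geo-blocked de-listing; all data here is invented.
EU_REGIONS = {"FR", "DE", "ES", "IE"}  # simplified stand-in for EU Member States

# Register of successful erasure requests: person's name -> links to suppress
# from searches based on that name.
DELISTED = {
    "jane doe": {"https://example.com/old-article"},
}

def filter_results(query: str, searcher_region: str, results: list[str]) -> list[str]:
    """Suppress de-listed links only for name-based searches from blocked regions."""
    suppressed = DELISTED.get(query.strip().lower())
    if suppressed is None or searcher_region not in EU_REGIONS:
        # Queries not based on the name, or searches from outside the blocked
        # regions, still return every link -- which is why CNIL regarded
        # EU-only de-listing as incomplete protection.
        return results
    return [url for url in results if url not in suppressed]

# A name-based search from France hides the link...
print(filter_results("Jane Doe", "FR", ["https://example.com/old-article"]))  # []
# ...but the same search from outside the EU does not.
print(filter_results("Jane Doe", "US", ["https://example.com/old-article"]))
```

The sketch also shows why the right remains qualified in practice: the underlying page is untouched, and only the name-keyed route to it through the search engine is cut off.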

Google and the French Regulator

Google commenced the process of de-listing results. However, the structure and methodology of the de-listing did not meet with the full approval of the French regulator. There was a reason for this. When Google initiated the new de-listing procedure it only de-listed in relation to EU domains such as google.es, google.fr, google.de, and so on. Domains outside the EU, such as google.com, were unaffected, resulting in the information remaining conveniently available. In 2016 Google introduced the geo-blocking feature that prevented European users from viewing the de-listed results, but it resisted censoring results for people in other parts of the world. From the viewpoint of the French data protection regulator, the Commission Nationale de l’Informatique et des Libertés (‘CNIL’), this was unsatisfactory.

 

What CNIL Wanted

CNIL argued that by only de-listing the EU domains, Google was not giving data subjects’ personal data the protection that the judgment in the case had envisaged. It followed that, to ensure full protection of the personal data of data subjects, erasure of the personal data should happen worldwide. Otherwise, the certain consequence was continued access to the personal data via other domains. Other methods of circumvention, such as the use of a Virtual Private Network (VPN), could also be used.

For Google, de-listing worldwide was a disproportionate measure that placed an overly onerous burden on the operation of its search engine. (GDPRXpert recently looked at disproportionate measures in the context of the visitor books at OPW sites.) Applying the RTBF ruling in jurisdictions with strong constitutional protection for freedom of expression and free speech, such as the U.S., was judged problematic. Google appealed the decision. Principles of territorial jurisdiction and global data flows that seem incompatible with each other must now undergo more judicial scrutiny.

Article 17 GDPR

Google v CNIL was always going to be a complicated case, as the array of issues involved was open to differing interpretations. To further complicate matters, the introduction of the GDPR in May 2018 repealed the old Directive. Google Spain considered Articles 12(b) and 14(a) of Directive 95/46, but Article 17 GDPR now broadens the circumstances in which the right to erasure will apply. Consequently, there was an inevitable focus on interpreting its application and relevance to the facts in this particular case.

This ‘new right’ to erasure (‘right to be forgotten’) is set out under Art. 17 of the GDPR. The grounds for erasure (Art. 17(1)) are enumerated, and the controller is obliged to erase personal data without undue delay where those grounds apply. Primary grounds for erasure include (but are not limited to): the data are no longer needed for their original purpose; consent has been withdrawn and there is no other legal ground for the processing; the processing was unlawful in the first place; and erasure is required under EU or Member State law. Grounds for refusing to erase the personal data (Art. 17(3)) are also set out, but these are very limited and will only apply ‘where the processing is necessary’ under those stated grounds.

That word ‘necessary’ crops up again and is open to interpretation. Certified GDPR and data protection advisers, GDPRXpert, have explained in previous blogs how the word ‘necessary’, in the context of the GDPR, means more than ‘useful’ or ‘convenient’.  We saw previously how much of the debate surrounding the Public Services Card shifted and began to examine specific aspects of the card. For example, when exactly was processing deemed ‘necessary’ in relation to a stated particular purpose?

The RTBF is simultaneously more ambiguous and more ambitious than other rights and is likely to be the subject of more legal challenges. Different competing rights, ones that require balancing against one another, will lead to most of the confrontations. The most likely battleground will be the intersection of the RTBF with the right to freedom of expression and information. Strategists of the opposing factions may be forced to look to the degree of erasure, or whether any item of data can ever be truly and permanently erased. One thing is certain: nowhere in Art. 17 GDPR is de-listing information on a worldwide basis mentioned. None of us needs to be a courtroom advocate, but the foregoing should provide us with sharper interpretive tools to assist in our own analysis of the final decision in Google v CNIL.

 

Google v CNIL

At the core of the case there are two differing perspectives. Google is focused on broader economic and societal implications. CNIL is looking through the prism of individual data protection rights. Four questions were submitted to the Court for a preliminary ruling by the French Conseil d’État:

First, whether the de-referencing following a successful request for erasure must be deployed in relation to all domain names irrespective of the location from where the search based on the requester’s name is initiated, even if that occurs outside of the EU;

Second, if the first question is answered negatively, whether the RTBF must only be implemented in relation to the domain name of the Member State from which the search is deemed to have been operated or, third, whether this must be done in relation to the domain names corresponding to all Member States;

Fourth, whether the RTBF implies an obligation for search engine operators to use geo-blocking where a search is made (i) from the Member State from which the request for erasure emanated, or (ii) from anywhere in the territory of the EU, including where the user searches non-EU domains.

Expert data protection consultants GDPRXpert have accessed some quality articles on the RTBF for this blog, such as ‘Google v CNIL: Defining the Territorial Scope of European Data Protection Law’.

The Opinion in Google v CNIL

A hint of where the case was going became clearer with the preliminary opinion of the Advocate General of the Court (CJEU) on 10 January 2019. With the opinion there came a re-statement of the order of rights. What was emphasised once more was that the RTBF involved a balancing exercise against other rights, most especially against the right to freedom of expression. The Advocate General concluded that where a claim for de-referencing has been successful, the search engine operator should only be required to effect de-referencing within the EU. This was a non-binding opinion, but in most cases the full court at the Grand Chamber follows the opinion of the Advocate General.

 

The Grand Chamber Decision in Case C-507/17

The Court held that “The operator of a search engine is not required to carry out a de-referencing on all versions of its search engine. It is, however, required to carry out that de-referencing on the versions corresponding to all the Member States and to put in place measures discouraging internet users from gaining access, from one of the Member States, to the links in question which appear on versions of that search engine outside the EU.”

It went on to cite Google Spain, stating that the Court had already held “that the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful”.

Under the old Directive, and more recently under the GDPR, Google Inc.’s operations fell within the scope of EU legislation on data protection. Global de-referencing would meet the objective of protection of EU law in full, but there were other considerations. Numerous third States do not recognise the right to de-referencing or have a different approach to that right. The Court added that the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.

Any balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world. There was no evidence, in legal texts or anywhere else, that the EU legislature had struck such a balance. Neither was there any evidence that it had chosen to confer a scope on the rights of individuals going beyond the territory of the Member States. In addition, there was no evidence it would have intended to place a de-listing burden on an operator, such as Google, in respect of versions of its search engine that do not correspond to the Member States.

EU law does not provide for cooperation instruments and mechanisms as regards the scope of a de-referencing outside the EU. “Thus, the Court concludes that, currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject, as the case may be, following an injunction from a supervisory or judicial authority of a Member State, to carry out such a de-referencing on all the versions of its search engine.”  Nevertheless, EU law does require a search engine operator to carry out such a de-referencing on the versions of its search engine corresponding to all the Member States.

A search engine must take sufficiently effective measures to ensure the effective protection of the data subject’s fundamental rights. What this means in practice is that any de-listing or de-referencing, “must, if necessary, be accompanied by measures which effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, through a version of that search engine outside the EU, to the links which are the subject of the request for de-referencing”.

It will be for the national court to ascertain whether the measures put in place by Google Inc. meet those requirements. Lastly, the Court points out that, while EU law does not currently require a de-referencing to be carried out on all versions of the search engine, it also does not prohibit such a practice. Just as in Google Spain, it was acknowledged that removing irrelevant and outdated links is not tantamount to deleting content. The data will still be accessible, but no longer ubiquitous.

Patrick Rowland, GDPRXpert.

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie

The GDPR Gets the Blame Again.

 

GDPR has wrongly been blamed for many things since its introduction. It has been scapegoated by sceptics, and some illogical interpretations of the regulation have led to disproportionate responses. Various interpretations propounded by some have no basis in data protection law and are simply wrong. Nevertheless, the GDPR continues to get the blame.

Some Examples

Our No. 1 is the hairdresser who cited GDPR as the reason she could not tell a customer what particular dye colour she was using in the customer’s hair! At the time, the same customer was trying to get an appointment with another hairdresser, as her usual hairdresser could not fit her into her schedule. The customer wanted to be sure the correct dye would be used by the new temporary (perhaps to be the new permanent?) hairdresser. GDPR gets the blame! Very inventive, but nonsense, of course.

‘Over the top’.

On the disproportionate scale is the caller who claimed to Joe Duffy that, at the time of the last election, voting cards should have been shredded in front of voters once they had been presented to the election officials. One could make an exaggerated technical argument to try to support this, but there has to be a commonsense approach. A ‘verify and return’ approach is more practical and effective than a ‘verify and destroy’ (shred) approach. How many shredding machines would have been needed in each polling station? Just think of the general layout in most polling stations. Certainly, in the larger ones, there are a lot of different sections and rooms.

Here is a case of taking a sledgehammer to crack a nut. Shredding the cards in front of voters is an example of an action that is disproportionate to the risk to voters. The sensible thing to do, which was done by officials, was to simply hand the voting card back to the voters. This is completely in line with the storage limitation principle in Art. 5(1)(e): personal data shall be “…kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed…” Therefore, ‘verify and return’ was the most logical and commonsense action.

Visitor Books at heritage sites

This example leads us to the story of the visitor books at certain heritage sites. Attention was first drawn to this story by an article in the Irish Times. Data protection consultants GDPRXpert are providing this link to you now. The general theme is that GDPR concerns led to the decision by the OPW to remove the visitor books from certain heritage sites. In most cases, visitors were signing their names and giving partial addresses. Some visitors included very short comments.

“The Office of Public Works observed that visitors were recording personal data, including names, addresses, etc, in visitor books at our sites which were out of view of the staff and completely unsecured,” an OPW spokesman said. A view was taken by someone at the OPW that the personal data in the books were insecure. For example, someone could take a photograph of some page or pages of the book. We don’t know who would want to do that or why, but the possibility certainly exists. But removing the visitor books from the sites altogether? It is best to examine some aspects of this in more detail.

Issue 1…Personal Data in the Books.

GDPR and data protection consultants GDPRXpert have set out the definition of ‘personal data’ from Art. 4(1) on their homepage. The GDPR has a wider definition of ‘personal data’ than the old data protection acts. There is no doubt that, in accordance with the newer definition, a name or an address or both constitute personal data.

 

Issue 2…Are personal data being processed?

Art. 4(2) defines processing as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”. It is clear that the personal data from the visitor books are being processed under several of the categories of processing outlined above, e.g. collection, use, recording, storage, etc. GDPR Art. 4(2) expressly states the processing does not need to be by automated means, and so the means can be manual.

 

 

 

Issue 3…Are the data part of a filing system?

The next question is whether the manual entries of the visitors (names, addresses, etc.), which immediately become manual records, form part of a filing system. This is a requirement under GDPR Art. 2(1), and if this criterion is not met then the GDPR does not apply. In this context, personal data must “form part of a filing system or are intended to form part of a filing system”. The Regulation defines a filing system as “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis” (Art. 4(6) GDPR). In all likelihood, the details in the visitor books would fail to meet the criteria of the ‘filing system’ definition.

One aspect that does not seem to have been considered or mentioned is whether the OPW viewed the personal data as data “intended to form part of a filing system”. If so, that intent would bring the personal data under the filing-system umbrella. At any time in the future the personal data in the books could be transferred into electronic form, and would then constitute “part of a filing system”.

Only the OPW can say what the exact purposes of the visitor books were, and whether there ever were plans to transfer data to electronic form. Even at the busiest heritage sites, regularly transferring personal data from the books into electronic form would not be a taxing duty on staff. However, there is no diktat for entries in the visitor books, and many visitors simply put something brief like, ‘John, Idaho, U.S.’ Many visitors seem to concentrate on comments around their personal appraisal of the experience itself.
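As an illustration of the distinction in data terms (our own example, not drawn from anything the OPW has said), the short Python sketch below contrasts free-text visitor-book entries with the same data keyed for retrieval; the Art. 4(6) ‘filing system’ test turns on whether personal data are accessible according to specific criteria.

```python
# Hypothetical contrast between unstructured entries and a 'filing system'.

# As written in the book: free-text lines, retrievable only by reading every
# page -- unlikely to satisfy the Art. 4(6) GDPR definition.
book_pages = [
    "John, Idaho, U.S. -- lovely tour!",
    "Maire, Cork -- the guide was excellent",
]

# The same personal data transferred to electronic form and indexed by name:
# now accessible according to a specific criterion, which is what would bring
# them within the 'filing system' definition.
indexed_records = {
    "John":  {"origin": "Idaho, U.S.", "comment": "lovely tour!"},
    "Maire": {"origin": "Cork", "comment": "the guide was excellent"},
}

print(indexed_records["John"]["origin"])  # direct retrieval by name
```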

 

Issue 4…Lawful basis for processing personal data

We have stated time and time again that just because you can process personal data does not mean you should. You must have a lawful basis. So if we conclude from the foregoing that personal data are being processed, then we must look for a lawful basis. It is likely that every visitor is aware that the act of writing a name or an address or leaving some comments is entirely voluntary. In other words, they are consenting. The OPW could use the consent of the visitors as a lawful basis for the processing of personal data.

Under the GDPR it is not quite as simple. People whose personal data are being processed (data subjects) need to be aware of the context of the consent. Consent to what? Consent is defined as an ‘…unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’. As part of a normal personal data processing operation, information on the purposes of the processing, and a whole host of other information, has to be given to the data subject at the time the data are collected. Data controllers need to know if the GDPR applies to the processing operation in the first place.

 

It is not possible to use ‘legitimate interests’ as a lawful basis; under Art. 6(1) GDPR, legitimate interests cannot be relied on by public authorities for processing carried out in the performance of their tasks. For example, it is a legitimate interest of the OPW to conduct market research to make the visitor experience more enjoyable, and comments in visitor books would be helpful in this regard. A further problem is that such processing may be helpful or useful to the OPW, but not ‘necessary for the purposes of the legitimate interests pursued’, as required under the GDPR. In this instance, the OPW had stated that the office had no purpose or use for the visitor books at all. This raises the question: why have them at all? The OPW surely has some use for them. What does it do with them when they are full? Bear in mind, ‘storage’ qualifies as a processing operation. If there is an intention to use the personal data as part of a filing system, then the OPW should be transparent about it. Where the policy is to wait until the books are full and then put them in storage, the OPW should say this.

 

This relates directly to the purpose of any processing. So, if the OPW does intend to do something with the personal data at a later stage, it should let visitors know as soon as it knows itself. It is inconceivable that no one later goes through the books to see what visitors had to say. These books offer a tool for valuable market research on visitor experiences. There are many reasons to carefully examine the visitor books. Does the OPW want statistical data on visitors’ countries of origin? On what they did and did not like? Comments left in the books could positively influence management decisions around operational practices at the sites. Somewhat strangely, in the opinion of data protection specialists GDPRXpert, the OPW told the Irish Times that it didn’t really have a purpose for processing the personal data. Therefore, as it did not have a purpose, and a purpose is required under the GDPR, it discontinued the practice of placing visitor books at heritage sites.

On balance, it is unlikely that the visitor books would fall under the GDPR, because of the strict requirements of the ‘filing system’ definition. It is clear visitors are giving their personal data freely. Perhaps visitors do it unthinkingly or instinctively, but in the belief that the entries will be useful in some way. They are volunteering helpful feedback for the OPW.

At the least, even if the GDPR is not applicable, the OPW should display a short notice beside the visitor books. This should inform visitors that they may, if they wish, leave entries in the book, but advise them to keep personal details to a minimum. After all, the comments are potentially more valuable to the OPW than personal details. At this point visitors should be made aware of the uses, if any, the OPW has in mind for the data. Who is going to make any entries if the notice says ‘we destroy the books every Friday at 5’?

A recommended policy is to be transparent and say something on a notice such as, ‘we go through the comments for feedback to help improve the visitor experience’. If that is the plan, it can further state that when this is done the books are archived. If the OPW is worried about people taking photographs of entries in the books, it should place a sign beside the first notice stating ‘NO PHOTOS HERE’. Ideally, the books could be placed at an exit point where there is a normal security or staff presence.

Visitors do presume that by making an entry in the books it will be of some value to the management. They also presume that someone will, in some way, extract this value. Removing the books over data protection concerns was a complete overreaction to any potential risks. GDPR Art. 32 itself makes clear that, in ensuring “a level of security appropriate to the risk”, account must be taken of “the nature, scope, context and purposes of processing as well as the risk”. Proportionality is a central concept embedded in the GDPR. GDPRXpert, along with many data protection consultants, agreed with the DPC’s view that the removal was disproportionate. The whole affair was an unnecessary storm in a teacup. Thankfully, reason prevailed and the books were later restored.

Patrick Rowland, GDPRXpert.ie.

We are GDPR & Data Protection Consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

PSC Investigation Findings by the DPC.

Data protection consultants welcome the findings from the investigation by the DPC into the Public Services Card. In a blog post back in February, expert data protection consultants GDPRXpert rightly predicted the main conclusions of the recent investigation by the office of the DPC into the legitimacy of the Public Services Card. At the time we highlighted some of the concerns that the DPC was likely to focus on in the continuing contentious debate. The full report has not yet been made available by the Dept. of Employment Affairs and Social Protection (DEASP). However, the DPC has published some initial findings.

 

Some Backdrop

As we stated in the earlier blog post, “Most of you will remember some controversy about this card at the time it was introduced, and it initially focused on one theory in relation to its introduction. For many, it represented no more than the introduction of an identity card by stealth. The government vehemently denied this, and different Ministers for Social Protection (Burton, Varadkar, and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card with no other purposes than to cut down on benefit fraud and streamline operations. Everything now should work more cost-effectively and taxpayer money would be saved.” See the GDPRXpert blog post, “Public Services Card Debate Resumes”, at www.gdprxpert.ie/public-services-card-debate-resumes-2/.

 

Main Finding

 

Our earliest key finding was that the introduction of the card did have a solid lawful basis. It was underpinned by legislation. (We detail the sections under the Social Welfare Consolidation Act 2005 in our earlier blog.)  This concurs with the DPC finding. The introduction and use of the card in relation to accessing social services from the Dept of Social Protection was legitimate. That is where its lawful basis ended. What must be borne in mind is that the report was compiled in the context of events prior to the introduction of the GDPR. From a practical perspective, and because GDPR cannot be applied retrospectively, the report was based on data protection laws in force at the time. Here we refer to the Data Protection Acts 1988 and 2003 (‘the acts’). There is much in common between ‘the acts’ and the GDPR, but the GDPR has higher standards of transparency, accountability, and enforcement.

Transparency

It was partly against these lower general standards, and particularly the lower standard of transparency (than under the GDPR), that systemic illegitimacy was revealed. Retention of supporting documentation that was demanded in support of an application was excessive. Central to this criticism was the general lack of any definitive retention policy; instead there was a ‘blanket and indefinite retention of underlying documents and information provided by persons applying for a PSC’. This contravened Section 2(1)(c)(iv) of the Data Protection Acts 1988 and 2003, because such data were being retained for periods longer than necessary for the purposes for which they were collected. The information provided by the Department to the public about the processing of their personal data in connection with the issuing of PSCs was also inadequate. One has only to look at the information now required under Arts. 12, 13 and 14 GDPR to see the depth of the lower standards under ‘the acts’.

Other Bodies

While the Dept of Employment Affairs and Social Protection (DEASP) had at least a lawful basis for the card, other departments and public bodies did not. They just began asking for it in the normal course of business; it is more accurate to say they demanded it. They had absolutely no lawful basis for this type of demand. Both the Passport Office and the National Driving Licence Service demanded the PSC before allowing any applications through their offices. It is those other bodies and departments that lack a lawful basis entirely, and they must now cease the practice of demanding the PSC. There will be much discussion, especially in government circles, over the next few weeks regarding the future of the PSC. Many data protection professionals, GDPRXpert.ie included, have formed an initial consensus that the card is likely to continue in use, but only in connection with services from the DEASP.

Some Immediate Measures.

 

The DEASP, “will be required to complete the implementation of two specific measures within a period of 21 days:

  •  It will be required to stop all processing of personal data carried out in connection with the issuing of PSCs, where a PSC is being issued solely for the purpose of a transaction between a member of the public and a specified public body (i.e. a public body other than the Department itself). The corollary of this finding is that bodies other than DEASP cannot insist that a person who does not already hold a PSC must obtain one as a pre-condition of accessing public services provided by that body.
  • The Department will be required to contact those public bodies who require the production of a PSC as a pre-condition of entering into transactions with individual members of the public, to notify them that, going forward, the Department will not be in a position to issue PSCs to any member of the public who wishes to enter a transaction with (or obtain a public service from) any such public body”. (From DPC statement)

 

We will return to the topic as things develop and add to this (shorter than normal) blog post very soon. Prompt publication of the entire report would be beneficial to all parties.

Patrick Rowland, GDPRXpert.ie

GDPRXpert, GDPR & data protection consultants, with bases in Carlow/Kilkenny and Mayo, offer a nationwide service.

P.S. 3 Sept. 2019. The deadline passed for the Department and no report was forthcoming. Indeed, things have altered to the extent that it is unlikely the Dept. will release the report in the foreseeable future. Most data protection consultants, such as GDPRXpert, agree with the findings of the DPC. However, it seems the Government is to challenge the findings of the DPC in court, having taken legal advice from the Attorney General and externally. See I.T. article on the latest. So the saga continues. As they say, ‘watch this space’.

P.S. No. 2: Somewhat surprisingly, just a couple of days after this postscript the Government did publish the report of the DPC. See the Irish Times article, “The Irish Times view on the Government defiance of the DPC”, Sept. 19, 2019. The text following is from that article.

Key findings include a decision that the card cannot be required to obtain services from other departments because no lawful basis exists for such use. It cites numerous examples of the “mission creep” by which the card transformed from its original intention as a chip-and-pin verification device for social welfare services, into a required form of identity for seemingly random purposes, such as sitting a driving test, obtaining a passport, or appealing school transport decisions.

The report states that such examples illustrate “obvious and significant deficits in terms of logic and consistency” for when the card is required.

While such findings had been released earlier in summary form by the DPC, the full report adds significant heft and leaves little legal wriggle room for the Department. Yet the Government intends to defend the card, in direct defiance of a national regulator, with both the Minister and Taoiseach Leo Varadkar suggesting that the DPC should have met with the Department to “discuss” the findings.

 

 

Schrems case drawing to a close?

 

So when is it permissible to transfer personal data to a third country or an international organisation? This is a question that has taken on new relevance. The long-running litigation by Austrian lawyer Max Schrems has moved another step towards a final resolution, following a decision in the Supreme Court on May 31st. It has once again brought the legality of transfers of personal data to third countries or international organisations to the forefront of data protection discourse. (Link to Irish Times article here.) Although the Schrems litigation commenced under the old Directive rules, the GDPR is now in effect and has represented the law in the area since May 2018.

A brief overview will place the most recent litigation within its relevant context. That relevant context is the transfer of personal data outside of the EU/EEA and to international organisations. A more specific context means it has to be viewed in the light of the Safe Harbour agreement and Standard Contractual Clauses (SCCs). Back in Oct. 2017, Ms. Justice Caroline Costello gave judgment in the High Court, and in May 2018 made a referral to the Court of Justice of the European Union (CJEU) of issues to be determined by that Court. These issues related to transfers using SCCs as the transfer channel. Facebook did not want the referral to reach the CJEU and initiated an appeal on procedural legal grounds. Facebook’s strategy was to question the process rather than the principles involved.

 

At its core was whether there was, or is, an actual right to appeal a referral to the CJEU. In his judgment on Facebook’s appeal the Chief Justice, Mr. Frank Clarke, held that it is for the referring court, and that court alone, to decide to make a reference and whether to amend or withdraw that reference. He was satisfied it was only in limited circumstances, such as where the facts themselves were not sustainable on the evidence before the High Court in accordance with Irish procedural law, that any aspect of the High Court judgment could be overturned. Facebook was criticising the ‘proper characterisation of the underlying facts’, not the facts themselves, he said.

Ms. Justice Costello had sought clarification on issues that spoke to the validity of the data transfer channels known as Standard Contractual Clauses (SCCs). She had 11 questions that she needed the CJEU to answer concerning the European Commission decision to approve the SCCs in the first place. Whether or not the measures provided for under Privacy Shield were comparable to the remedy available to EU citizens under Art. 47 of the EU Charter for breach of data protection rights was one point raised by the DPC in the High Court case. Privacy Shield replaced the Safe Harbour Privacy Principles, elements of which formed the basis of complaint for Max Schrems in some of his litigation. For more information on Privacy Shield click here.

We have referred in previous blogs to the notion of the balancing of the data subjects’ rights where their data is being processed. In the context of rights and personal data processing, all rights are taken into account, not just data protection rights. GDPR was not in effect at the time of the litigation commenced by Schrems, hence the reference to the EU Charter and, in particular, Arts. 7, 8 and 47. (Article 7 provides that “everyone has the right to respect for his or her private and family life, home and communications.” Article 8 states “everyone has the right to the protection of personal data concerning him or her,” and mandates that such data must be “processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law.”

Article 8 also provides that “everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified,” and further authorizes enforcement of the rules via an independent authority. Article 47 guarantees a “right to an effective remedy before a tribunal” to “[e]veryone whose rights and freedoms [are] guaranteed by the law of the Union.” It also requires a “fair and public hearing within a reasonable time by an independent and impartial tribunal previously established by law.”)

The revelations by Edward Snowden in 2013 gave insights into the massive extent of the interception and surveillance of internet and telecommunications systems by the US National Security Agency. It was not just that these actions were disproportionate, but that they infringed upon the very right to privacy. At the time of the Snowden revelations, data transfers to the US were being governed by the so-called ‘Safe Harbour’ agreement. Despite this agreement, Schrems had concerns about both Facebook’s transfer of his personal data to the US and the processing of those data by American authorities.

A position taken by the DPC was that once an adequacy decision (here, the Safe Harbour agreement) had been issued, the office had no part in investigating a complaint. Safe Harbour itself stood as testament to the adequacy of the protection of transfers of personal data to the US. Mr. Justice Hogan in the High Court thought Schrems was objecting more ‘to the terms of the Safe Harbour regime itself’ than to the DPC’s application of it (Schrems v DPC [2014] IEHC 310 (18 June 2014), para. 69). This is often referred to as Schrems No. 1.

Another position taken by the DPC was that the complaint was essentially speculative and hypothetical in nature. Mr. Justice Hogan took the view that there was no need to establish that the applicant even had grounds to suspect such a breach had occurred. It was enough to believe the mere absence of controls might lead to a breach of the applicant’s rights. If the matter were solely governed by Irish law, significant issues would have arisen under the constitutional right to privacy. Mr. Justice Hogan referred the case to the CJEU partly on the basis that, ‘in reality, on that key issue Irish law has been pre-empted by general EU law in the area…’ (Schrems, as above, at paras. 78-80). In hindsight, this reference to the CJEU was the beginning of the end for the Safe Harbour agreement.

CJEU Case C-362/14 (6 Oct. 2015)

It has to be borne in mind that the case before the Court dates back to the days of Directive 95/46, pre-GDPR that is. One definitive finding by the Court was that the DPC (or any national supervisory authority), when examining a claim concerning the compatibility of a Commission decision with the protection of the privacy rights and fundamental rights of an individual, cannot declare the decision invalid itself (of course, neither can the national courts). Where a national supervisory authority, such as the DPC, comes to the conclusion that the complaint is unfounded, the complainant must have, in accordance with Art. 47 of the EU Charter, access to judicial remedies enabling a challenge to be made before the national courts. The court must stay proceedings and make a reference to the CJEU for a preliminary ruling on validity where the court is of the opinion that some grounds for invalidity are well founded. In addition, the national courts themselves can raise issues of their own motion.

In the converse situation, where the Supervisory Authority (SA) is of the opinion that the objections of a person lodging a complaint are well founded, the SA must put forward those objections in order for a national court to adjudicate upon them. A reference to the CJEU for a preliminary ruling can be made where a national court shares the doubts as to the validity of a decision. The Court ultimately found the Safe Harbour agreement invalid, mainly because the Commission had not made ‘any finding regarding the existence, in the United States, of rules adopted by the State intended to limit any interference with those rights’ and did not refer ‘to the existence of effective legal protection against interference of that kind’. United States authorities were ‘able to process the personal data transferred…and process the data in a way incompatible, in particular, with the purposes for which they were transferred…data subjects had no administrative or judicial means of redress…’ (at paragraph 90). Without appropriate safeguards in place that mirror or match safeguards under EU law, there can be no adequacy.

 

Later, on 20 Oct. 2015, the proceedings were returned before the High Court and the decision of the CJEU was implemented by the making of an order setting aside the decision of the DPC not to investigate the original complaint of June 2013. The High Court then remitted the original complaint back to the DPC for investigation. Immediately following the High Court order, Mr. Schrems reformulated and resubmitted his complaint to take into account the fact that Safe Harbour had been struck down. Having considered the matter, the DPC decided to proceed on the basis of the new formulation. During its investigation, the DPC established that Facebook, and many internet companies, continued to transfer personal data to the U.S. in large part by means of Standard Contractual Clauses (SCCs). These are pro forma agreements which have been approved by way of certain EU Commission decisions as providing adequate data protection for the purpose of transferring personal data to third countries.

On 24 May 2016, the DPC issued a draft decision to Schrems and Facebook informing both that its preliminary view was that the complaint was well-founded; further submissions were invited from both parties. Three reasons were given by the DPC:

(a) A legal remedy compatible with Article 47 of the Charter is not available in the US to EU citizens whose data is transferred to the US, where it may be at risk of being accessed and processed by US State agencies for national security purposes in a manner incompatible with Articles 7 and 8 of the Charter;

(b) The SCCs do not address the CJEU’s objections concerning the absence of an effective remedy compatible with the requirements of Article 47 of the Charter as outlined in its judgment of 6 October 2015, nor could they; and,

(c) The SCCs themselves are therefore considered likely to offend against Article 47 insofar as they purport to legitimise the transfer of the personal data of EU citizens to the US.

The DPC, therefore, commenced legal proceedings in the Irish High Court seeking a declaration as to the validity of the EU Commission decisions concerning SCCs and a preliminary reference to the CJEU on this issue. Both Facebook and Mr. Schrems were named, as the joining of these parties affords them an opportunity (but not an obligation) to fully participate if they so wish and to make submissions in the case. All of this brings us back to the High Court and the decision by Ms. Justice Costello to make a reference to the CJEU. She had also refused to put a stay on the reference to the CJEU, but Facebook then took things to the Supreme Court. As detailed earlier, Facebook’s appeal against the reference was dismissed in the Supreme Court.

Soon it will be back to the CJEU. As it stands, it will be some time before we know whether the Standard Contractual Clauses at issue will hold up as legally sound channels of personal data transfer, in particular, to the United States. One can hypothesise about the interpretation the CJEU will favour, but whatever it is will have a bearing on future interpretation of the channels of transfer under the new GDPR regime.

In an upcoming blog, we will look through the lens of the GDPR to focus on the means by which personal data can now be legally transferred to third countries and international organisations. Future interpretations will be informed by the final decision of the CJEU on the Standard Contractual Clauses reference that is soon to be in that court.

Patrick Rowland, GDPRXpert.ie

 

Right to Rectification and Principle of Accuracy

The right to rectification and the right of access were (and still are) guaranteed under the Charter of Fundamental Rights of the European Union. Art. 8(2): “Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.” The Charter has applied to the EU since the entry into force of the Lisbon Treaty on 1 December 2009, and so it predates the GDPR. It was Art. 16 of the Treaty on the Functioning of the European Union (TFEU) which imposed the specific obligation on the EU legislature to actually make data protection rules, and it was this that eventually led to the GDPR.

Art. 16 GDPR sets out in stronger and clearer language the right to rectification. It is a right best read in conjunction with the principle of accuracy under Art. 5(1)(d) of the GDPR. The principle of accuracy stands alone only in the text itself; in practice it is intertwined with all the other principles to form a greater whole. Article 5(1)(d) states that personal data shall be “accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’)”. Let us remember that Art. 15, the right of access, is often the starting point for other requests. It is this article that facilitates other rights, because it gives the data subject the right to obtain confirmation of whether his or her data are being processed in the first place. If personal data are being processed, the data subject can then have inaccurate data rectified or incomplete data made complete. Sometimes this is best achieved, and facilitated, by means of a short supplementary statement. The right to rectification therefore works in two ways: 1) by completing incomplete data; and 2) by rectifying inaccurate data. In the case of Max Schrems (Case C-362/14, 6 Oct. 2015), one of the defects identified by the CJEU was that there “was no means of enabling the data concerning the data subject to be accessed and, as the case may be, rectified or erased” (at para. 90). The starting point is, again, the knowledge that personal data are being processed in the first place.

Under the old Data Protection Acts (‘the Acts’) many complaints were received and processed concerning the right to rectification and the right to erasure of inaccurate data. Not many cases have come up for scrutiny since the introduction of the GDPR, but future case types will likely mirror some from the pre-GDPR days. Sometimes looking back can be an accurate guide to what may occur in the future. Below are some interesting cases containing scenarios and circumstances that could resurface. They will give a taste of the substance of the right.

EMI Records v The Data Protection Commissioner [2012] IEHC 264

This was a leading case on the processing of inaccurate personal data and went to court under the old Acts.

Brief facts: Eircom, a telecommunications provider, had been operating a scheme whereby recording companies detected those who were uploading their copyrighted music and video on the internet. The recording company passed on the information, consisting of copyright title, time and temporary IP address, to Eircom. Eircom then wrote to its subscribers reminding them that downloading copyrighted material was in breach of their subscriber contract. Those who continued illegally downloading would have to find a new telecommunications provider, as Eircom would no longer provide them with internet service.

In October 2010 Eircom failed to adjust its clocks for wintertime. As a result, it wrongfully identified some people as illegally downloading when they were not. The DPC issued an enforcement notice at the time directing that the activity cease. The case gives a sense of what might be viewed as inaccurate processing of personal data. In this instance the practice ceased, despite the fact that the judge in the High Court found the enforcement notice from the DPC contained ‘no reasons whatsoever’ and ruled it invalid.

Smeaton v Equifax plc [2013] EWCA Civ 108 (20 Feb. 2013)

This case throws up some interesting issues and perspectives on the concept of accuracy. The defendant was a UK credit registry whose database indicated the plaintiff was subject to a bankruptcy order. In fact, the order had been made at first instance, but stayed upon appeal and then rescinded. The plaintiff claimed for losses and damages resulting from the inaccuracy. Initially the claim succeeded, but it was overturned on appeal.

What makes the case unusual is that the plaintiff had acted as a lay litigant in challenging the bankruptcy order. Generally, in cases such as this, a solicitor would represent the litigant and inform the credit registry that the client had been discharged. Smeaton's argument was that Equifax should have been aware of the discharge, notwithstanding his self-representation. Again, it has to be stressed that this was an unusual case, decided on its own particular facts. The Court recognised that the old UK Data Protection Act 1998 did “not impose an absolute and unqualified obligation on Credit Reporting Agencies to ensure the entire accuracy of the data they maintain. Questions of reasonableness arise”.

An important consideration when questioning certain rights, if not all rights, under the GDPR is that the extent of a right, and the degree to which it may be vindicated, may in the more contentious case go all the way to the final arbiter, the CJEU. Proportionality and the balancing of rights are paramount under EU law. It is only when a case reaches this forum that jurisprudential reasoning truly asserts itself, for it is the CJEU that defines the parameters and specific meanings of words in the legislative text. Even though Smeaton v Equifax goes back to 2013, it is still good authority for the proposition that controllers are not under an absolute duty to ensure the accuracy of their data.

Thirty cases in relation to the right to rectification are cited in the first annual report of the DPC since the introduction of the GDPR, available here.

Case Study 3/2018.

Again, this is one of the recent cases from the DPC Annual Report 2018 that highlights the close relationship between the accuracy principle and the right to rectification. The DPC received a complaint from a Ryanair customer whose webchat details were erroneously sent to another Ryanair webchat user. Of course, issues of integrity and confidentiality come into play also. On the date in question, the data processor received requests from four Ryanair customers for transcripts of their webchats, all of which were processed by the same agent. However, the agent did not correctly change the recipient email address when sending each transcript, so they were sent to the wrong recipients. Among the recommendations were that recipient email addresses be checked to ensure accuracy and that the autofill function in the software be used with extreme caution. Ryanair subsequently informed the DPC that the autofill function in its live webchat system had been disabled by its data processor.

Perhaps it is due to the nature of the business, and a strong desire for expediency, that credit reference agencies have historically been disproportionately involved in breaches compared to other businesses. We can look at a few of the more interesting cases.

Case Study 2/1997

This complaint concerned the merging of data about two different people in the database of a credit reference agency. Human error was at fault, as the two individuals lived in the same area and had the same name. At the time, the credit reference agency had a policy of matching up similar data. A particular financial institution was supplying personal data to the agency, and in the process the two records became intermingled. The DPC upheld the complaint.

Case Study 6/1999.

A principle seldom becomes obsolete unless legislative action deems it so. At issue here was a problem that remains live in the context of personal data processing. The complainant had repaid a loan, but the credit reference agency's files showed the loan as a default. For clarity, we are still talking here about provisions under the ‘old Acts’, but, as was found in this case, not keeping records “up to date” is equally a breach under the GDPR.

Case Study 8/1997

A credit reference agency's records showed that the complainant had had a loan written off. That was correct. The records also stated that litigation was pending for non-payment of the loan. This part of the record was incorrect: no action was pending. As a result of the investigation, the DPC found the record held “was inaccurate in stating that litigation was pending”. This case shows that even where an agency holds some factually correct personal data, and few would advance monies to the complainant on the basis of the default, an inaccuracy in its records is still a breach.

Case Study 6/1999

Inaccurate credit rating assessments of a complainant gave rise to this case. Three loans had been taken out by the complainant and all three had been fully paid off. However, the agency wrongly recorded one as still outstanding. What was stated by the DPC remains true: there is a “clear and active obligation on data controllers to ensure that data is kept accurate and up to date”. The concept of ‘reasonableness’, referred to above in Smeaton v Equifax, is an abiding one.

Case Study 12/2009.

Here the results of a paternity test, a very sensitive matter, were sent to the wrong address. They were read by the complainant's neighbour, who now knew that his neighbour was not the father of child X.

 

Case Study 18/2009

Here a court summons was incorrectly served on the wrong person. If memory serves, this was another that ended up at a neighbour's house. Something most of us would naturally prefer did not happen.

Recently (30 April 2019), the DPC issued an examination of right to rectification complaints, accessible here. At its core is an attempt to clarify aspects of the right to rectification. As we mentioned above, there is a strong relationship between the right to rectification and the principle of accuracy. What the DPC notes is that “Individuals have a right to rectification of their personal data under data protection legislation. What the right to rectification means in practice will depend on the circumstances of each case and the Data Protection Commission (DPC) examines each case that comes before it on its individual merits.” In practice, this means that all data controllers are required to take all reasonable steps to ensure the accuracy of personal data, taking account of the circumstances of the processing, the nature of the personal data and, in particular, the purposes for which they are processed.

“In respect of complaints received by the DPC in relation to the recording of a name without diacritical marks, e.g. the síneadh fada in the Irish language, consideration has to be given, in light of Article 5(1)(d) and Article 16 GDPR, to whether the recording of a name without diacritical marks is deemed to be inaccurate, having regard to the purposes for which the data (in this case, a data subject's name) are processed”. This is a reference to the Ciarán Ó Cofaigh case reported in the Irish Times here. What if a John Coyle (with an excellent credit rating) had credit record details that identified him as a John Boyle with a poor credit rating? Is there really a difference between a mistaken letter in a person's name and a missing fada, especially where the omission or mistake can result in a detriment to the data subject? (Or, in this case, is it discrimination against a Gaeilgeoir?) A name is either correct or not correct, and this is not a hair-splitting exercise. Simple mistakes happen, but they must be rectified before there is a detriment to the data subject.
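For the technically minded, the fada question is easy to demonstrate at the data level. Below is a minimal sketch in Python (purely illustrative, and not drawn from any DPC guidance): two names a human might treat as “the same” are distinct records to a computer unless the system deliberately strips diacritics, which is precisely the kind of processing choice the accuracy principle calls into question.

```python
import unicodedata

def strip_diacritics(name: str) -> str:
    # Decompose characters (NFD), then drop the combining marks,
    # e.g. the síneadh fada in "Ciarán" -> "Ciaran".
    decomposed = unicodedata.normalize("NFD", name)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print("Ciarán" == "Ciaran")                    # False: the two records differ
print(strip_diacritics("Ciarán") == "Ciaran")  # True: 'equal' only once the fada is discarded
```

In other words, a system that silently discards the fada is not recording the name the data subject actually has; whether that amounts to ‘inaccuracy’ under Article 5(1)(d) is the question the DPC poses.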

“In a related context, the European Court of Human Rights has concluded that the omission of diacritical marks from a person’s name in certain official documents did not entail a breach of the right to private and family life guaranteed under Article 8 of the European Convention on Human Rights: see, for example, Šiškins and Šiškina v Latvia (Application no. 59727/00, 8 November 2001).” Expect more related cases, but under the GDPR these will be going to the CJEU.

Patrick Rowland, GDPRXpert.ie

Data Protection Consultants, GDPRXpert,  based in Carlow/Kilkenny and Mayo, provide a nationwide service.

Visit www.gdprxpert.ie to learn more. 

 

DPC and Facebook Square off Again.

In our blog of March 21st we gave a general overview of some of the problems facing Facebook, most notably in the U.S., involving various regulatory bodies. At that time we alluded to pending trouble on this side of the pond. In the infamous and immortal words of the legendary American baseball player Yogi Berra, it seems very much like “déjà vu all over again”. Reports from the office of the DPC concerning developments in its investigations would seem to bear this out.

In the same blog, and in reference to the first annual report of the new DPC, we pointed out the substantial number of data breaches reported by multinationals. Facebook was one of those multinationals, and the Facebook token breach became subject to a statutory inquiry in September last year. Now a report confirms that Facebook, or one of its subsidiaries, has had 11 statutory inquiries initiated against it by the office of the DPC over varying periods. (See the full article by Adrian Weckler, Technology Editor, Irish Independent.) It is a confrontation that seems endless.

In the Left Corner, Weighing in at...

Facebook now expects to pay between $3bn and $5bn to settle the Federal Trade Commission's investigation into possible violations of its 2011 consent agreement. Separately, the Justice Department's securities fraud unit is investigating the Cambridge Analytica affair, in which the political consulting company improperly obtained the Facebook data of 87 million users and used the data to build tools that helped Trump's campaign in 2016. (For more details, refer to our previous blog on the American investigations here.) At the centre of the current DPC probe is the admission by Facebook, in its notification to the DPC, that hundreds of millions of passwords were stored in totally unsecure ‘plain text’ format. Facebook had discovered “that hundreds of millions of user passwords, relating to users of Facebook, Facebook Lite and Instagram, were stored by Facebook in plain text format in its internal servers,” said a statement from the Irish DPC.

Dangerous Tactics

Storage of passwords in this manner leaves them especially exposed to anyone with access to certain internal services. It is always recommended, and it is good practice, to store passwords in hashed form, allowing a website to confirm what you are entering without ever storing or reading the password itself. Normal practice is for a password to be ‘hashed’ and ‘salted’, which in Facebook's case includes using a function called “scrypt” as well as a cryptographic key.

In cryptography, a ‘salt’ is random data used as an additional input to a one-way function that ‘hashes’ data such as a password or passphrase. This allows the data security team to irreversibly replace a user's actual password with a derived set of characters. With this procedure, a user logging in is validated as having the correct password, without any need to store the password in plain text. Hardly something to be considered ultra high-tech or ‘rocket science’ for the average IT and data security team!
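To make the point concrete, here is a minimal sketch in Python of the salt-and-hash approach described above, using the scrypt function from the standard library. The parameter values (n, r, p and the 16-byte salt) are illustrative choices on our part, not Facebook's actual configuration:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple:
    """Derive a one-way scrypt hash of the password using a fresh random salt."""
    salt = os.urandom(16)  # random salt: identical passwords yield different hashes
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest    # store both; the plain-text password is never kept

def verify_password(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Re-derive the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Because the function is one-way, even someone with full access to the stored salts and digests cannot recover the original passwords, which is exactly the protection absent when passwords sit in plain text.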

The Bell for the End of Round 11.

Somewhat surprisingly, Facebook’s ‘bottom line’ does not seem to be suffering as badly as analysts had been predicting. Sales were up 26% for the first quarter of 2019 to close to $16bn. User numbers also increased, but at a lower rate of just 8%. Market analysts also got Facebook share price expectations wrong. In the year to date, Facebook shares have risen 40%, outperforming much of the wider market. It still has 2.38 billion account holders. Ultimately, much could change when the results of all the investigations become public knowledge. What will the public’s perception of Facebook’s privacy and data protection policy be when all investigations conclude?  Negative public sentiments have so far not affected Facebook’s bottom line. People are creatures of habit and change can sometimes be excessively challenging and inconvenient.

These investigations are warning signs for the company and investors alike. Rick Ackerman's insight may prove more prophetic than speculative: “Even the rabid weasels who drive the company's shares wildly up and down for fun and profit must be sensing by now that Facebook is no longer cool (think AOL) and that the company has seriously depleted its store of goodwill”. (https://www.fxstreet.com/analysis/more-bad-news-cools-facebooks-rampage-201904032337) Strong words, but a UK parliamentary report found Facebook had behaved like ‘digital gangsters’. https://www.npr.org/2019/02/18/695729829/facebook-has-been-behaving-like-digital-gangsters-u-k-parliament-report-says?t=1556562850116

 

Winners and Losers?

In a post back in March, Mark Zuckerberg stated, “I believe the future of communication will increasingly shift to private, encrypted services, where people can be confident what they say to each other stays secure, and their messages and content won't stick around forever. This is the future I hope we will help bring about.” Facebook's business practices will have to go through at least an ethical overhaul. At present, the company relies on a $55bn advertising revenue stream that comes from products and services that do not have end-to-end encryption; they are not private in any substantive, measurable sense. Yet if the business model is also under increased pressure from data protection and privacy regulators in different jurisdictions, then in theory at least it must change sooner rather than later.

If this model is to be replaced, the question is what form its replacement will take. Most analysts suggest that WhatsApp and Messenger are the future, because Facebook's data show that is where people are increasingly spending their time. If people move to private messaging apps with high levels of encryption, as Zuckerberg stated is the future and company policy, Facebook will still need data with which to target people with ads. What will be the source of that data? More relevant is whether or not the targeting will be done in accordance with the GDPR and data protection legislation. Will the business model stay substantially the same, just delivered by different vehicles? Will it be ‘free’? A subscription model is unlikely, because how many people will actually pay for ‘likes’, and to interact with their ‘friends’? (No ads, but…)

Future Re-Match

It is conceivable that for many years to come, as one inquiry ends, another will start. Maybe it will be a case of “the more things change, the more they stay the same”. Cost-benefit analyses done by Facebook may be adjusted once the fines begin to mount up. One certainty is that Facebook will not be allowed to disregard the GDPR and privacy legislation in numerous jurisdictions. Thankfully for data protection and privacy advocates, the office of the DPC is committed to its mission; it is seriously ‘punching above its weight’. Facebook will find, like many before it, that its financial resources do not afford it special treatment or confer special status in the eyes of the law.

P.S. For another angle on the subscription ideas see https://techcrunch.com/2018/02/17/facebook-subscription/?guccounter=1&guce_referrer_us=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_cs=Rhxc73OHhIIwVotsx2PW0w

Patrick Rowland, GDPRxpert.ie.

We are Data Protection consultants, based in Carlow/Kilkenny and Mayo, offering a nationwide service.

Visit www.gdprxpert.ie to learn more

 

The GDPR and Elections on the Horizon

What with the never-ending Brexit saga and the continuing toxic political stalemate, it may not be popular to delve into any topic with political associations. Nevertheless, that is the intention here, one primarily inspired by the upcoming European and Irish local elections. Both sets of elections have a novel element: they will be the first European and local elections to take place since the introduction of the GDPR and the Data Protection Act 2018.

Key actors on this changed stage include several that may play the role of data controller. In previous blogs and on our website, we have seen how the accountability of controllers and joint controllers is a central feature of the GDPR. Individual election candidates, political parties, data analytics companies and public authorities responsible for the electoral process can all act as controllers. It is not within the scope of this blog to discuss all facets, so the less ambitious plan is to look at election candidates in the light of canvassing-related activities. Data protection issues arise whenever personal data are being collected, and at election time they are collected in different forms. Canvassing door to door, direct mail marketing and electronic direct marketing may all raise concerns. More data protection issues surface in relation to requests for representation. Inevitably, organisations that receive these requests must also come under the same scrutiny.

 

Getting Started.

The focus of this article lies within the confined context of elections and electoral activities, and the application of Union and national law within this defined landscape. Micro-targeting of voters through unlawful processing of personal data is still fresh in people's minds following Cambridge Analytica and other similar disclosures. A starting point is to acknowledge, as stated by the UK's Information Commissioner's Office (ICO), that “engaging voters is important in a healthy democracy, and in order to do that, political parties, referendum campaigners and candidates will campaign using a variety of communication methods. However, they must comply with the law when doing so; this includes the handling of the personal data that they collect and hold”. (ICO, Guidance on Political Campaigning; more details here.) This is especially true since the inception of the GDPR, the Data Protection Act 2018 and the ePrivacy Regulations.

The Very Basic Ground Rules.

Individuals now have enhanced rights, and these are strengthened and particularly relevant in the electoral context. They place onerous responsibilities on candidates seeking office. Primary responsibility falls on the candidates and, where that is the relationship, their affiliated parties. Public authorities also have responsibilities under the GDPR and the various Electoral Acts. Whoever processes personal data previously viewed as purely mundane, such as names and addresses, must now pay more attention: simple names and addresses constitute ‘personal data’ under the GDPR. Processing of such data must be done lawfully, fairly and in a transparent manner, and for a defined, specified purpose. Another limitation (purpose limitation) means the personal data are now less likely to be strategically stored with ulterior motives in mind: data cannot be further processed in a manner incompatible with the purposes for which they were initially collected (note: there are a few strict exceptions to this rule). There must be a lawful basis for the processing of personal data, and all data protection principles must be followed without any ‘cherry-picking’.

Pre-election days.

Most of us are well used to the barrage of literature that ends up on our hall floors in the run-up to elections. S.39 DPA 2018 specifically allows the use of personal data for the purpose of communication in writing (including newsletter or circular) with the data subject. It is qualified and limited to ‘specified persons’, namely: a political party; a member of either House of the Oireachtas, the European Parliament or a local authority; or a candidate for election to the office of President of Ireland or for membership of any of the bodies mentioned. Here the DPA 2018 provides a lawful basis for processing, and a useful guidance booklet is available on the DPC website. Section 59 DPA 2018 expressly modifies Art. 21 GDPR, so that there is no right to object to electoral direct marketing by post. When communicating by text, email, phone or fax, however, a candidate must have the prior consent of the constituent. If contact is then lawfully made, it must be clear about its origin and must incorporate an easy opt-out.

Frantic times precede elections, and normal rules should apply, but it is probably unrealistic for the office of the DPC to expect candidates to include the amount of information described in this pamphlet with their canvassing materials. Even more unrealistic is to expect the same information to be given by a candidate going door to door; time constraints make this impractical. It will be interesting to hear from candidates after the elections about how the new regulations and the DPA 2018 affected their campaigns. Just as interesting will be any feedback from constituents concerning the information they were given by candidates regarding data protection rights. Under Art. 9 GDPR there is a general prohibition on the processing of special categories of personal data, but there are exceptions to the rule. Section 48 DPA 2018 expressly permits one of these special categories (personal data revealing political opinions) to be processed. Provided safeguards are taken to protect the data subject's fundamental rights and freedoms, such data can be processed in the course of electoral activities by a political party, or by a candidate for election to, or a holder of, elective political office in the State (note: this applies to the Referendum Commission also). The specific purpose is the compiling of data on people's political opinions. Section 48 DPA 2018 provides the lawful basis.

 

Requests for Representation

In the course of canvassing at election time, candidates receive numerous requests for representation regarding access to services or the provision of services. While such requests are genuine, they also represent a test, as perceived by the voter, of a candidate's knowledge and ability in the run-up to the election. If a voter gets a favourable and swift response from the candidate, the chances are the candidate will also get a vote. It is only proper that no shortcuts are taken to get this information quickly before election day. When we speak of ‘candidates’, it is important to distinguish between candidates who are current officeholders seeking re-election and those running for election who do not currently hold any office. All elected representatives should be aware of, or quickly become familiar with, S.40 Data Protection Act (DPA) 2018. In fact, they should be extra vigilant in their responsibilities, because the number of requests increases exponentially at election times. Being knowledgeable on S.40 potentially benefits the representative's reputation, and passing on the relevant specifics of that knowledge to the constituent will deliver benefits to both parties. After the introduction of the GDPR, the office of the DPC issued interpretive guidelines on S.40 DPA 2018. Data protection consultants GDPRXpert have the document available here. We will now look at some of the main points from the guidelines.

 

Some Guidelines on Requests for Representation.

Sections 40(1) and (2) DPA 2018 give elected representatives the legislative basis for processing the personal data of constituents. This includes the special categories of personal data under Art. 9 GDPR. Processing is allowed where the elected representative either receives a request for representation directly from the data subject, or receives a request for representation from another person on behalf of the data subject. In all cases, the elected representative must be able to demonstrate compliance with the principles of data protection. At a minimum, representatives are obliged to meet the transparency responsibilities set out in the GDPR (especially Arts. 12, 13 & 14). An elected representative has an obligation to be certain at all times that they are acting upon a request from the voter.

 

There will be many situations where “the permission can be implied from the relevant action or request. For example, the raising of the matter by an individual will create an expectation that their personal data will be further processed by the elected representative and other relevant organisations”. (DPC Guidelines 2018, p.4) A normal expectation is that personal data will be processed. As part of the representative's request for information, the local council, for example, will disclose personal details necessary to satisfy the request. However, best practice is to be sure that the constituent is aware of the likely processing, and we recommend a signed consent form. In many instances a formal, signed consent form may not be practical; contemporaneous detailed notes should then be taken by the representative, and the DPC suggests this as a good record to demonstrate compliance with S.40 DPA 2018.

If any unexpected processing becomes necessary, it is advisable to revert to the constituent. An elected representative must be careful not to go beyond the specified purposes for which the consent was given. One recommendation from the DPC is that elected representatives should use privacy notices when they collect personal details from people and have a privacy notice on their website. All notices should meet the transparency requirements and “satisfactorily address the requirements set out in Articles 12, 13, 14 & 30 (where relevant) of the GDPR and also should be clear, accessible and informative to help people understand what will be done with their personal information”. (Office of the DPC, 2018) Simple, best advice: be straight with people on all aspects. Following the data protection principles will operate to safeguard both the constituent and the elected representative.

 

Requests From Someone on Behalf of Someone Else

In this scenario, all parties should be extra cautious, because a request is being made by one person on behalf of another. Common situations include a son or daughter acting on behalf of one or both parents; one family member on behalf of another; a relative on behalf of another relative; or a neighbour or friend on behalf of another neighbour or friend. It is no longer sufficient to take the word of one party and accept their bona fides. The elected representative will therefore have to ensure that the individual making the request has authority from the person whose personal data will be processed on foot of the request. This is a potential minefield, and the onus lies on the representative to “demonstrate that the data subject has consented to processing of his or her personal data” (Art. 7(1) GDPR).

Trust is no longer a reliable basis on which to proceed with such a request. Other aspects that merit detailed attention include the competency of the data subject and the legal standing of the person making the request. For example, is there an enduring power of attorney to manage the affairs of the data subject? Any prudent representative should strive to have a signed consent form provided. Failing that, it will be a decision for the representative whether or not to make a representation. Where the representative has not been able to fully ascertain the wishes of the individual prior to the processing of personal data, he or she should set out and record the specific steps taken to ascertain those wishes. Such records will stand as evidence of reasonable efforts having been made, and this will be crucial at the time the representation is made to the appropriate organisation.

 

Disclosure by an Organisation following a Request under S.40 DPA 2018.

Written Requests

As noted earlier, there is an exponential increase in the number of requests for representation approaching election time. Section 40(4) DPA 2018 gives an organisation the legal basis to respond to a representation from an elected representative and to process the personal data on foot of it. In doing so, the organisation must demonstrate compliance with all the data protection principles under Art. 5 GDPR. A precondition is that the disclosure is necessary and proportionate to enable the representative to deal with the request, and that the safeguards referred to in S.36 DPA 2018 are taken. Special categories of personal data may be processed by the organisation under S.40(4). Where the organisation receives a written representation on foot of S.40, it can assume the constituent has given permission. In other words, it can accept the bona fides of the representative, while at the same time satisfying itself that it is reasonable to assume the individual would have no objection to the release of the personal data.

Verbal Requests

With verbal representations from an elected representative to an organisation, it is advisable that a staff member of the organisation logs the appropriate details. Where the elected representative is present when the representation is made, it is good practice to have him or her sign a short form confirming the details. Best practice is for the organisation to have policies and privacy notices in place that outline how it deals with requests. Ultimately, the organisation decides whether to accede to the requests made. In particular, the organisation must ensure it meets its responsibilities under Arts. 12, 13, 14 and 30 GDPR. Any disclosure must be only what is necessary and proportionate in its impact on the fundamental rights of the individual. An organisation must consider the potential impact and negative implications of any representation and take safeguards to mitigate any risks.

Mitigating Risks

Risk mitigation must reach a higher level in the context of special categories of personal data, which are by their nature sensitive. Extra safeguards are advisable where the representation has been made on behalf of the data subject by another individual due to incapacity or age; where the personal data fall under the special category class, any safeguards must be strengthened. It must always be borne in mind that reliance on S.40(4) DPA 2018 as a legal basis to disclose data on foot of a representation depends on certain conditions being met in advance: the processing must be necessary and proportionate, and suitable measures must be taken to protect the individual's rights and freedoms. If an organisation acting on foot of a representation has any concern about the level of awareness on the part of the representative or the individual of the sensitive nature of the personal data, it would be prudent to refer back to both. It is only proper that the individual is fully aware of the implications that will follow the processing of their personal data as a consequence of the request. Both the nature and purpose of the request will influence the actions taken. For example, some requests may be time-sensitive, and getting explicit consent may not be practical. The DPC advises that a common-sense approach be taken.

 

Personal Data of Third Parties

As a general rule, it is not permissible to process the personal data of third parties under S.40 DPA 2018; it is allowed only in very limited circumstances. If a third party has not been involved in a request for representation, processing of the personal data of that third party will not be permissible unless one of the following applies: the third party cannot give explicit consent; the processing is necessary in somebody else's interest and explicit consent has been “unreasonably withheld” by the third party; the balance favours disclosure in the public interest; the elected representative “cannot reasonably be expected to obtain” the third party's explicit consent; or seeking the third party's explicit consent would “prejudice the action taken by the elected representative”.

Other Considerations Re Special Category Data

Earlier we noted how S.48 DPA 2018 allows for the processing of one category (personal data revealing political opinions) within the ‘special categories’ grouping. S.40(1) DPA 2018, however, allows the general processing of personal data within these special categories. The elected representative, in processing such categories, must “impose limitations on access to that data to prevent unauthorised consultation, alteration, disclosure or erasure of that data” (S.40(3) DPA 2018). In conjunction with these limitations, suitable and specific measures that take on board the provisions of the Data Protection Health Regulations (S.I. No. 82/1989 and S.I. No. 83/1989) should be considered, as these remain in force under S.58 DPA 2018. Both equally apply to the elected representative and to the organisation receiving the representation. These regulations provide that health data relating to an individual should not be made available to that individual, in response to an access request, if it would be likely to cause serious harm to his or her physical or mental health.

A person who is not a healthcare professional should not disclose health data to an individual without first consulting that individual's own doctor or some other suitably qualified healthcare professional. Where it has been deemed appropriate to disclose such information to an elected representative, it should include a warning regarding the sensitive nature of the data, and the elected representative will need to apply the safeguards outlined in S.40(3) DPA 2018. Finally, in relation to the processing of personal data involving criminal convictions or offences (Art. 10 data), any disclosure on foot of a representation will necessitate an assurance from the representative that explicit consent has been obtained for the request.

Much of the foregoing is evidence of the complicated nature of data protection in the context of electoral activities. A high level of awareness is expected from elected representatives and from the organisations that receive representations from them. Once the relevant information is provided by the elected representative, a decision should be based on common sense. The office of the DPC believes any refusal by an organisation should be easily explained by reference to S.40 DPA 2018, without citing data protection requirements as a general ground for refusal. Where the organisation has followed S.40(4) DPA 2018, the GDPR and the data protection principles, and has implemented suitable and specific safeguards, it should be confident it has acted in compliance with the DPA 2018.

Patrick Rowland, GDPRXpert.ie

Data Protection Consultants GDPRXpert.ie, with bases in Carlow/Kilkenny and Mayo, offer their expert service nationwide.

Visit www.gdprxpert.ie to learn more.

 

More Problems for Facebook

 

Facebook is no stranger to controversy, especially over the last year. In our most recent blog, on the first annual report of the new DPC, we pointed out the substantial number of data breaches reported by multinationals. Facebook was one of those multinationals, and the Facebook token breach became subject to a statutory inquiry by the office of the DPC in September last year. Now, in the US, federal prosecutors are conducting an investigation into data deals Facebook struck with some of the world's largest technology companies. (NY Times, March 13, 2019: https://www.nytimes.com/2019/03/13/technology/facebook-data-deals-investigation.html)

Grand Jury Investigation.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices. Partnerships with Facebook gave these makers very broad access to the personal information of possibly hundreds of millions of Facebook users. This had been going on for years, and operated to allow the makers, along with companies such as Microsoft, Apple, Sony and Amazon, to see users' friends' contact information and other data, most often without any consent. These agreements were previously reported in The New York Times (link to the original article here). Most of the partnerships have now been phased out, but while they were in operation they effectively gave the partner companies a blanket exemption from the usual privacy rules.

Hundreds of pages of Facebook documents were obtained by The New York Times. These records, generated as far back as 2017 by the company's internal system for tracking partnerships, provided the most complete picture yet of the social network's data-sharing practices. The exchange was intended to benefit everyone: Facebook got more users, boosting its advertising revenue, and partner companies acquired features that made their products more attractive. For example, the records show that Facebook allowed Microsoft's Bing search engine to see the names of virtually all Facebook users' friends without consent, and gave Netflix and Spotify the ability to read Facebook users' private messages. Facebook users connected with friends across different devices and websites, reaping benefits for Facebook, which had engineered extraordinary power over the personal data of more than 2.2 billion users. Prior to the GDPR, even in Europe, this power was exercised with a shameless lack of transparency and a dearth of substantive oversight.

Other investigations.

The latest grand jury inquiry comes against the backdrop of the Cambridge Analytica scandal, in which the political consulting company improperly obtained the Facebook data of 87 million users and used the data to build tools that helped Trump's campaign in 2016. This is part of an ongoing investigation by the Justice Department's securities fraud unit. All along, Facebook's position was that it had been misled by Cambridge Analytica and had believed the data were only being used for academic purposes. “In the furore that followed, Facebook's leaders said that the kind of access exploited by Cambridge in 2014 was cut off by the next year, when Facebook prohibited developers from collecting information from users' friends. But the company officials did not disclose that Facebook had exempted the makers of cell phones, tablets and other hardware from such restrictions” (NY Times, June 3, 2018: https://www.nytimes.com/interactive/2018/06/03/technology/facebook-device-partners-users-friends-data.html?module=inline). Nevertheless, some of the fine print on a quiz app that collected the data, which Facebook deleted back in 2015, was evidence that the company knew about the potential for the data to be used commercially.

Facebook’s Wheeling and Dealing.

The pervasive nature of some of the deals that Facebook initiated becomes clearer when, for example, the evidence shows that one deal empowered Microsoft's Bing search engine to map out the friends of virtually all Facebook users without their explicit consent, and allowed Amazon to obtain users' names and contact information through their friends. Apple was able to conceal from Facebook users any indicators that the company's devices were even asking for data (NY Times, March 13, 2019; see link at top of blog). This demonstrates the covert level involved. An investigation still in progress gives an insight into the corporate psyche behind the business model that Facebook is proud to espouse. Facebook entered a data-sharing consent agreement with the Federal Trade Commission in 2011, under which it was barred from sharing user data without explicit consent. However, the agreements Facebook concluded benefited more than 150 companies: most of them tech businesses, including online retailers and entertainment sites, but also automakers and media organizations. Their applications sought the data of hundreds of millions of people a month. The deals, the oldest of which date to 2010, were all active in 2017, and some were still in effect in late 2018 (NY Times, Dec. 18, 2018).

The Spin.

Facebook's spin was that the companies it entered into agreements with were ‘extensions of itself’ and so not subject to the specific data-sharing rules. After all, one can't really share a secret with oneself! The service providers were just partners that allowed users to interact with their Facebook friends. Facebook dismissed the notion that it stood to gain substantially from the arrangements, despite admitting that it had not really policed the activities of its partners. Data privacy experts are rightly sceptical that a regulator as thorough as the Federal Trade Commission would view businesses as varied as device makers, retailers and search companies as sufficiently ‘alike’ to be exempt from the regulation, though it seems this was Facebook's opinion. Ashkan Soltani, former chief technologist at the Federal Trade Commission, saw it as nothing more than a ruse, stating, “The only common theme is that they are partnerships that would benefit the company [Facebook] in terms of development or growth into an area that they otherwise could not get access to”.

Concluding…

In summary, Facebook has trouble on quite a few fronts: the original Cambridge Analytica investigation now involves Facebook being investigated by both the FBI and the securities fraud unit of the Justice Department; the Federal Trade Commission is close to finalising its investigation into possible violation of the consent agreement (multi-billion-dollar fines are anticipated); the Justice Department and the Securities and Exchange Commission are investigating Facebook; and the U.S. Attorney's Office for the Eastern District of New York is heading a criminal investigation. (Remember, at the moment we are not even talking about Europe and the GDPR!) The signs are ominous; expect to hear more from us, and others, on Facebook's problems in the near future.

On March 19, Rep. David Cicilline (D-RI), chairman of the House Judiciary Committee's antitrust subcommittee, called for the FTC to investigate Facebook on anti-monopoly grounds. https://www.theverge.com/2019/3/19/18272605/facebook-ftc-investigation-congress-republican-david-cicilline

Patrick Rowland, GDPRXpert.ie

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie

 

 

 

DPC Issues Annual Report

The DPC's first annual report since the GDPR has just been released. It is not surprising to observers of developments in the data protection field that at the outset the report remarks, “it is the rise in the number of complaints and queries to data protection authorities across the EU since 25 May 2018 that demonstrates a new level of mobilisation to action on the part of individuals to tackle what they see as misuse or failure to adequately explain what is being done with their data”. (DPC Report, 2018) It is fair to say that pre-GDPR there was much hype and alarm, and this amplified the closer it came to D-Day, 25 May 2018. Things have changed somewhat since then, and if “we understand something about the GDPR, it is this: it will be a process of dialogue that lasts many years and the dialogue will need to shift and change with technology, context, learning from evidence (including emerging case law) and evolving societal norms.” (DPC Report, 2018)

We spoke in an earlier blog, and we allude to it on this website, about misinformation and disinformation that unfortunately increased the sense of alarm and panic pre-GDPR. After May 25th there was more. It seems the hairdresser who cited the GDPR as the reason she could not give her customer details of the hair dye used in the customer's hair is the favourite GDPR myth within the office of the DPC. By the way, the customer was leaving for another hairdresser and wanted to be able to tell the new one what colour went in her hair, but we can be sure that this had nothing to do with the hairdresser's response!

Some Facts From the Report.

  • 2,864 complaints were received; the largest single category was ‘Access Rights’, with 977 complaints, a little over 34% of the total.
  • 1,928 complaints were under the GDPR, and of these 868 had been concluded.
  • A total of 3,452 data breaches was recorded, the largest single category being ‘Unauthorised Disclosures’; 38 breaches related to 11 multinational technology companies.
  • Almost 31,000 contacts were made to the Information and Assessment Unit within the DPC.
  • 15 statutory inquiries (investigations) were opened in relation to the compliance of multinational companies with the GDPR.
  • 16 requests (formal and voluntary) for mutual assistance were received from other EU data protection authorities.
  • 31 own-volition inquiries were opened under the Data Protection Act 2018 into the surveillance of citizens by the state sector, for law enforcement purposes, through the use of technologies such as CCTV, body-worn cameras, automatic number plate recognition, drones and other technologies. These inquiries are conducted by the Special Investigation Unit, the same unit that continued its work on the special investigation into the Public Services Card featured recently on our website.
  • 950 general consultations were received, excluding consultations with multinational technology companies.
  • 900 data protection officer notifications were received.

In late 2018, the DPC established an advanced technology evaluation and assessment unit (the Technology Leadership Unit – TLU) with the objective of supporting and maximising the effectiveness of the DPC’s supervision and enforcement teams in assessing risks relating to the dynamics of complex systems and technology.

So it has been a busy and productive time for the office of the DPC, which even found time to speak at over 110 events including conferences, seminars and presentations. Late last year the DPC commenced a significant project to develop a new five-year regulatory strategy that will include extensive external consultation during 2019. It has to be remembered that the DPC received complaints under two substantive parallel legal frameworks during this period:

  • complaints and potential infringements that related to, or occurred, before 25 May 2018 must be handled by the DPC under the framework of the Data Protection Acts 1988 and 2003;
  • separately, complaints received by the DPC relating to the period from 25 May 2018 must be dealt with under the new EU legal framework of the GDPR and the Law Enforcement Directive, and the provisions of the Data Protection Act 2018, which give further effect to, or transpose, those laws into the laws of Ireland as a Member State of the EU.

The DPC took an active part in the sixth annual privacy sweep of the Global Privacy Enforcement Network (GPEN). Data protection authorities from around the world participated, and the theme in 2018 was privacy accountability. Accountability is a central element of the GDPR. It is a concept that “requires organisations to take necessary steps to implement applicable data protection rules and regulations, and to be able to demonstrate how these have been incorporated into their own internal privacy programs” (DPC Report 2018). In the last sweep, GPEN aimed to assess how well organisations had built accountability into their own internal privacy programmes and policies. One goal was to establish a baseline of organisations' compliance with data protection. This was the brief for the DPC, whose input targeted randomly selected organisations in Ireland: 30 organisations across a range of sectors completed a suite of pre-set questions relating to privacy accountability. Because the sweep was done in the last quarter of 2018, only preliminary or provisional results were available at the date of the report. They include the following:

  • 86% of organisations have a contact listed for a DPO on their website.
  • 75% appear to have adequate data breach policies in place.
  • All organisations seem to have some kind of data protection training for staff. However, only 38% could provide evidence of training for all staff, including new entrants and refresher training.
  • In most cases organisations appear to undertake data protection monitoring/self-assessment, but not to a sufficiently high level. In this category, 3 out of 29 scored ‘poor’, while 13 could only reach a ‘satisfactory’ level.
  • One-third of organisations were unable to show any documented processes in place to assess risks associated with new technology and products.
  • 30% of organisations failed to show they had an adequate inventory of personal data, while close to 50% failed to keep a record of data flows.

These results are preliminary, and the full results will be more instructive; it must also be emphasised that 30 organisations is a small sample size. Nevertheless, there appear to be large deficiencies in staff training and in data protection monitoring/self-assessment. Many issues will be more fully addressed in the coming months when the full results of the ‘sweep’ become available.

 

 

 
