PSC Investigation Findings by the DPC

Data protection consultants welcome the findings of the DPC’s investigation into the Public Services Card. In a blog post back in February, expert data protection consultants GDPRxpert rightly predicted the main conclusions of the recent investigation by the office of the DPC into the legitimacy of the Public Services Card. At the time, we highlighted some of the concerns the DPC was likely to focus on in the continuing contentious debate. The full report has not yet been made available by the Dept. of Employment Affairs and Social Protection (DEASP). However, the DPC has published some initial findings.


Some Backdrop

As we stated in the earlier blog post, “Most of you will remember some controversy about this card at the time it was introduced, and it initially focused on one theory in relation to its introduction. For many, it represented no more than the introduction of an identity card by stealth. The government vehemently denied this, and different Ministers for Social Protection (Burton, Varadkar, and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card with no other purposes than to cut down on benefit fraud and streamline operations. Everything now should work more cost-effectively and taxpayer money would be saved.” See the GDPRxpert blog post, “Public Services Card Debate Resumes”.


Main Finding


Our earlier key prediction was that the introduction of the card did have a solid lawful basis: it was underpinned by legislation. (We detail the relevant sections of the Social Welfare Consolidation Act 2005 in our earlier blog post.) This concurs with the DPC finding: the introduction and use of the card in relation to accessing social services from the Dept. of Social Protection was legitimate. That, however, is where its lawful basis ended. What must be borne in mind is that the report was compiled in the context of events prior to the introduction of the GDPR. From a practical perspective, and because the GDPR cannot be applied retrospectively, the report was based on the data protection laws in force at the time, namely the Data Protection Acts 1988 and 2003 (‘the acts’). There is much in common between ‘the acts’ and the GDPR, but the GDPR sets higher standards of transparency, accountability, and enforcement.


It was partly these generally lower standards, but particularly the lower standard of transparency (compared with the GDPR), that the investigation exposed. Retention of supporting documentation demanded in support of an application was excessive. Central to this criticism was the general lack of any definitive retention policy; instead there was a ‘blanket and indefinite retention of underlying documents and information provided by persons applying for a PSC’. This contravened Section 2(1)(c)(iv) of the Data Protection Acts 1988 and 2003, because such data was being retained for longer than necessary for the purposes for which it was collected. The information provided by the Department to the public about the processing of their personal data in connection with the issuing of PSCs was also inadequate. One has only to look at the information now required under Arts. 12, 13 and 14 GDPR to see how much lower the standards under ‘the acts’ were.

Other Bodies

While the Dept. of Employment Affairs and Social Protection (DEASP) had at least a lawful basis for the card, other departments and public bodies did not. They simply began asking for it in the normal course of business; it is more accurate to say they demanded it. They had absolutely no lawful basis for this type of demand. Both the Passport Office and the National Driving Licence Service demanded the PSC before allowing any applications through their offices. It is these other bodies and departments that lack a lawful basis entirely, and they must now cease the practice of demanding the PSC. There will be much discussion, especially in government circles, over the next few weeks regarding the future of the PSC. Many data protection professionals have formed an initial consensus that the card is likely to continue in use, but only in connection with services from DEASP.

Some Immediate Measures


The DEASP, “will be required to complete the implementation of two specific measures within a period of 21 days:

  •  It will be required to stop all processing of personal data carried out in connection with the issuing of PSCs, where a PSC is being issued solely for the purpose of a transaction between a member of the public and a specified public body (i.e. a public body other than the Department itself). The corollary of this finding is that bodies other than DEASP cannot insist that a person who does not already hold a PSC must obtain one as a pre-condition of accessing public services provided by that body.
  • The Department will be required to contact those public bodies who require the production of a PSC as a pre-condition of entering into transactions with individual members of the public, to notify them that, going forward, the Department will not be in a position to issue PSCs to any member of the public who wishes to enter a transaction with (or obtain a public service from) any such public body”. (From DPC statement)


We will return to the topic as things develop and add to this (shorter than normal) blog post very soon.  Prompt publication of the entire report would be beneficial to all parties.

Patrick Rowland,

GDPRXpert, GDPR & data protection consultants, with bases in Carlow/Kilkenny and Mayo, offer a nationwide service.

P.S. 3 Sept. 2019. The deadline passed for the Department and no report was forthcoming. Indeed, things have altered to the extent that it is unlikely the Dept. will release the report in the foreseeable future. Most data protection consultants, such as GDPRXpert, agree with the findings of the DPC. However, it seems the Government is to challenge the findings of the DPC in court, having taken legal advice from the Attorney General and externally. See the Irish Times article on the latest developments. So the saga continues. As they say, ‘watch this space’.

P.S. No. 2. Somewhat surprisingly, just a couple of days after this postscript the Govt. did publish the report of the DPC. See the Irish Times article, “The Irish Times view on the Government defiance of the DPC”, Sept. 19, 2019. The text following is from that article.

Key findings include a decision that the card cannot be required to obtain services from other departments, because no lawful basis exists for such use. The report cites numerous examples of the “mission creep” by which the card was transformed from its original purpose as a chip-and-pin verification device for social welfare services into a required form of identity for seemingly random purposes, such as sitting a driving test, obtaining a passport, or appealing school transport decisions.

The report states that such examples illustrate “obvious and significant deficits in terms of logic and consistency” for when the card is required.

While such findings had been released earlier in summary form by the DPC, the full report adds significant heft and leaves little legal wriggle room for the Department. Yet the Government intends to defend the card, in direct defiance of a national regulator, with both the Minister and Taoiseach Leo Varadkar suggesting that the DPC should have met with the Department to “discuss” the findings.

Thoughts on GDPR a Year On

The GDPR is now over 14 months in operation, and this blog post offers some thoughts on the GDPR a year on. It is still a little early for any truly substantive analysis of the Regulation’s effectiveness to date. A difficulty that immediately surfaces is how to quantify its effectiveness: what is an appropriate measure or barometer? Fines speak to enforcement, but without the specific details little can be extrapolated, even in a general context. If the level of fines is taken as a metric, does it mean that as fines rise the Regulation is simply being enforced more? Do increasing fines mean that the overall level of compliance is dropping? It could be that fines are increasing because more organisations are, and in some cases are choosing to be, non-compliant.

In any sphere of regulation there will always be a non-compliant percentage. It therefore does not follow that there is a direct relationship between fines and non-compliance; in other words, it does not mean that as fines go up, the level of non-compliance is also going up. Fines will always be imposed as a deterrent, even where compliance rates are high. Some organisations being fined may be repeatedly and stubbornly non-compliant. Any increase in reported data breaches that leads to the opening of an investigation may conclude with the imposition of fines.

Are regulators to go on ‘fishing expeditions’ to fine some high-profile organisations? There was always talk of the DPC and other regulators planning to go for ‘low-hanging fruit’ in the early days of the GDPR, with most experts putting financial services in this category. No evidence of this has emerged so far. In the case of undertakings, fines can reach 2% or 4% of global turnover, depending on the infringement. So far, companies have been spared the harshest penalties that can be meted out, though this is likely to change, according to regulators across the EU.

In the lead-up to the Regulation’s introduction, the general strategy of Supervisory Authorities (SAs) was to educate the public and organisations. The consensus amongst SAs was that education was the best mechanism for regulation preparedness. Ignorance of the law is never an excuse, so there was an EU-wide focus on promoting education on the Regulation. ‘GDPR awareness-building’ was the process chosen to direct this education. The ultimate goal was to foster and nurture compliance with the new Regulation and, over time, develop a culture of compliance. The general public was to be made aware of rights, and organisations were to be made aware of obligations and responsibilities. A thorough understanding of the core principles of transparency and accountability was highlighted as a mandatory requirement for competent data controllers and processors.

What has been happening since the introduction of GDPR?

In a previous blog post, we examined some stats from the DPC’s first annual report post-GDPR. Most noteworthy was the number of data breaches, totalling 3,452. Perhaps this should not be surprising. What was surprising was that, out of this total, the largest category of breach was ‘Unauthorised Disclosures’, yet the fines did not seem to follow.

Mathias Moulin, of the French regulator CNIL, emphasised that the first year of GDPR should be considered a ‘transition year’. Transition year or not, early numbers make clear that the GDPR has been a success as a breach notification law, but largely a failure when it comes to imposing fines on companies that fail to adequately protect their customers’ data. Stephen Eckersley, head of enforcement at the U.K. Information Commissioner’s Office, said the U.K. had seen a “massive increase” in reports of data breaches since the GDPR’s implementation. In June 2018, companies self-reported 1,700 data breaches, and Eckersley estimated the total would be around 36,000 breaches reported in 2019, a significant increase from the previous annual reporting rate of between 18,000 and 20,000 breaches.

Some Stats

Other reports give more information on the GDPR to date. There were 89,000 data breaches recorded, of which 37% are still pending investigation or penalties; 65,000 data breaches were reported to the European Data Protection Board, and in the first 8 months of the GDPR nearly 60,000 breaches were reported across Europe (law firm DLA Piper). Google was hit with a €50 million fine for not making it clear to users how it was harvesting data from its own search engine, YouTube and Google Maps. This penalty, the largest so far, was issued against Google by the French Data Protection Authority (CNIL) in January. It related to a lack of transparency, inadequate information and a lack of valid consent regarding the personalisation of ads.

French authorities had received complaints about Google’s handling of personal data. CNIL, the relevant authority, found that the structure of Google’s privacy policy and terms and conditions was too complicated for users, and that the use of pre-ticked boxes as a consent mechanism did not establish a legal basis for data processing to deliver targeted advertising. For a better understanding of the fines regime, it is helpful to look at the broader context of the Google fine more closely.


Some Fines

The French regulator cited Google’s failure to centralise essential information on one page, and its process requiring users to go through “up to five or six actions”. Google’s penalty accounts for nearly 90% of the total value of fines levied to date, but it had the potential to be much larger. In 2018 Google reported nearly $136.2 billion in revenues, so the 50 million euro fine represented approximately 0.04% of revenue, far from the potential 4% penalty. Compared to the Google fine, other fines levied by European national data protection authorities (DPAs) have been considerably smaller. For example, in March 2019 the Polish DPA announced that it had fined a company approximately 219,000 euros for failure to inform six million individuals that their personal data were being processed. Also in March 2019, the Danish DPA fined a company approximately 161,000 euros for holding on to personal data longer than allowed under the GDPR (from the same source directly above).
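To put the scale of that penalty in context, the gap between the actual fine and the 4% upper tier can be shown with some quick, purely illustrative arithmetic. For simplicity this sketch treats the euro fine and the dollar revenue figure as directly comparable, ignoring currency conversion:

```python
# Illustrative arithmetic only: compares the CNIL's 50m euro fine
# against Google's reported 2018 revenue, ignoring EUR/USD conversion.
fine = 50_000_000            # CNIL fine, January 2019 (euros)
revenue = 136_200_000_000    # Google's reported 2018 revenue (dollars)

share_of_revenue = fine / revenue * 100   # actual fine as % of revenue
theoretical_max = revenue * 0.04          # the 4%-of-turnover upper tier

print(f"Fine as a share of revenue: {share_of_revenue:.3f}%")
print(f"Theoretical 4% maximum: {theoretical_max:,.0f}")
```

Run as written, this puts the fine at roughly 0.037% of revenue, against a theoretical 4% ceiling of about 5.4 billion, which is the gap described above.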

Outside of the Google fine, the penalties thus far have been so small that many are anxiously awaiting the next whopper of a fine. Irish and UK authorities have hinted that a large fine is coming (Todd Ehret, Thomson Reuters, May 22, 2019). GDPRXpert has previously reported on Facebook’s difficulties with the office of the DPC here in Ireland, and ongoing investigations seem likely to conclude with large fines being meted out. See also GDPRXpert’s earlier blog posts.

European Data Protection Board Survey

It is somewhat surprising that, despite the awareness-building campaign by all SAs, a European Data Protection Board survey in May 2019 found that:

  • only 67% of people across Europe had heard of GDPR;
  • 36% claimed to be ‘well aware’ of what GDPR entails;
  • 57% of EU citizens polled indicated they are aware of the existence of a public authority in their country responsible for their data protection rights.


This result represents an increase of about 20% on a 2015 Eurobarometer survey. It is a disappointment that it is not higher, considering the cost, scale and scope of the campaigns in all Member States to educate citizens prior to GDPR. Many people question whether the message of GDPR was communicated in a properly measured manner before the introduction of the legislation.

‘Privacy Sweep’

Some stats from the 6th annual Privacy Sweep conducted by the Global Privacy Enforcement Network (GPEN) reaffirm the results above. Data protection consultants have included elements of this in a published blog post. Data protection authorities from around the world participated, and in the latest sweep GPEN aimed to assess how well organisations have built accountability into their own internal privacy programmes and policies. One goal was to establish a baseline of an organisation’s compliance with data protection. This was the brief for the DPC, whose input was targeted at randomly selected organisations in Ireland: 30 organisations across a range of sectors completed a suite of pre-set questions relating to privacy accountability. Because the sweep was done in the last quarter of 2018, only preliminary or provisional results are available to date (DPC Report, 2018).



Some Stats from the ‘Privacy Sweep’

Preliminary results include the following:

  • 75% appear to have adequate data breach policies in place;
  • All organisations seem to have some kind of data protection training for staff;
  • However, only 38% could provide evidence of training for all staff including new entrants and refresher training;
  • In most cases, organisations appear to undertake data protection monitoring/self-assessment, but not to a sufficiently high level. In this category, 3 out of 29 scored ‘poor’, while 13 could only reach a ‘satisfactory’ level;
  • One-third of organisations were unable to show any documented processes in place to assess risks associated with new technology and products;
  • 30% of organisations failed to show they had an adequate inventory of personal data, while close to 50% failed to keep a record of data flows.


Is There Still a Wait-and-See Approach?

Businesses and organisations have reacted to the GDPR in their own ways, depending on what they view as their individual exposure. There is no doubting that some companies have done cost-benefit analyses to weigh potential fines against the cost of compliance measures. This is especially so for companies that have no presence in the EU but fall under the GDPR by virtue of Art. 3. Most companies are taking a proactive approach to dealing with the new realities of personal data protection. Possibly because many of the fines to date have been nominal compared to what they could have been, some companies are waiting to see what the supervisory authorities in each EU member country are going to do. The prevailing wisdom is that fines will go up as regulatory actions play out.

At the moment, businesses and organisations are watching for the possibility of larger fines being imposed. Many are waiting to see if trends are emerging, or whether particular types of businesses are being targeted. So far, it seems only the very big companies have been targeted by the office of the DPC here in Ireland. Anecdotal evidence suggests smaller businesses do not see themselves as being on the DPC’s radar; the perception is that the DPC has too much on its plate and ‘bigger fish to fry’. Smaller businesses may very well fly under the radar for a short while longer. However, one can assume that, partly because of the prolonged bedding-in period for the Regulation, by the time the DPC gets around to some smaller fish, a high level of compliance will be expected. At that stage, no excuses are going to be accepted in any defence of non-compliance.

What is Likely to Happen in the Next Phase?

In the DPC Annual Report 2018 there were 2,864 complaints; the largest single category was ‘Access Rights’, with 977 complaints, or a little over 34% of the total. Here is a warning flag for businesses: every one of these complaints has the potential to trigger an associated investigation by the DPC into an organisation’s compliance with the GDPR. At least initially, it is believed supervisory authorities will take a more cautious approach to levying the harshest penalties, says Peter Milla, the data protection officer at Cint, a provider of consumer data sets to market researchers around the world, with global corporate headquarters in London.

“What’s going to happen is the regulators are going to come in to see if you have a compliance program, but they’re going to be very lenient,” he says. “They’re obviously not going to put small companies out of business, because there’s a political component here, but they will fine. They’re going to be commercially reasonable.” The Germans are probably going to be the harshest, Milla says.

One forecast is that there will be greater enforcement. Regulators across the EU have significantly increased staffing levels, and it is logical to expect greater enforcement as a result. Another expected development is that increasingly educated and ‘GDPR-conscious’ consumers will drive data protection and privacy by design. The same consumers will be attracted to businesses and organisations that are seen to respect their data protection and privacy rights. A lack of either a Privacy Notice or a Privacy Statement is a clear indicator of an organisation’s disregard for the core principle of transparency. What is such an organisation doing with your personal data? That is anyone’s guess.

Patrick Rowland, GDPRXpert, with bases in Carlow/Kilkenny and Mayo, offer a nationwide GDPR and data protection consultancy service.

Visit to learn more


Transfers of Personal Data outside the EU/EEA

In the most recent blog post we attempted to capture the context of some of the channels for transfers of data outside the EU/EEA. The Schrems case provided some of this through its scrutiny of the Standard Contractual Clauses mechanism. Since the introduction of the GDPR, the channels for transfer of personal data to a third country or international organisation have undergone changes.

Transfers of personal data to third countries or international organisations

Following the inception of the GDPR, the law on transfers of personal data to third countries or international organisations (‘transfers’) is more settled. A caveat is that the exact interpretation of express terms in the GDPR that relate to transfers may yet come before the Court of Justice for ultimate clarification.

Art. 44 GDPR provides that transfers may take place only subject to the other provisions of the Regulation, and where the conditions laid down in Chapter V are complied with by the controller and processor. A plethora of conditions is laid out in Chapter V. These conditions can be grouped as transfers subject to:

  • Adequacy Decisions;
  • Appropriate safeguards; or
  • Specific derogations.


Adequacy Decisions

Art. 45 allows transfers where the European Commission has decided that the third country or international organisation ensures an adequate level of protection. Under this scenario, no specific authorisation is required. In practice, this confers a broad discretion on the European Commission in assessing adequacy, which has the potential to be viewed as subjective and politically influenced. It was the discretion in declaring that an adequate level of protection existed that led to Schrems (Case C-362/14, 6 Oct. 2015) ending up before the CJEU. As a means to counterbalance the discretion of the Commission, Art. 45(2) sets out three elements that the Commission must ‘in particular’ take into account when assessing the adequacy of data protection in the third country. A list of countries with an adequacy decision is found here.


Elements to be taken into account to assess Adequacy

  • ‘the rule of law, respect for human rights and fundamental freedoms…’ Legislation, both general and sectoral, is examined. Are there adequate protections available when assessed in the light of legislation concerning public security, defence, national security and criminal law? What about public authorities’ access to personal data, and the implementation of the legislation above? What about data protection rules, professional rules, security measures and rules for onward transfer of personal data to another third country? Can data subjects obtain effective administrative and judicial redress where they have complaints about how their data are being transferred?
  • ‘the existence and effective functioning of one or more independent supervisory authorities in the third country…’ The Commission should expect to see a supervisory authority with responsibility for ensuring and enforcing compliance with data protection rules, including adequate enforcement powers. It is not enough to have responsibility for enforcement, but it must have the powers to deliver on enforcement. Toothless tigers are not wanted.
  • ‘the international commitments the third country or international organisation has entered into…’ Something like this can act as a gauge of the value placed on international norms and rules. Part of this element of assessment can include scrutiny of international obligations the third country may have as a result of some legally binding convention or instrument. Does the third country or international organisation participate in multilateral or regional systems, especially in the data protection sphere?

The Goal

In essence, the goal is to have similar, if not identical, means of protection of personal data operating in the third country as are available to data subjects in the EU/EEA. As noted in the Schrems case, an appropriate balance must be struck between the powers assigned to authorities in a third country and the protections provided for the persons whose personal data are being transferred. If the Commission is satisfied with the integrity and substance of data protection in the third country, it may then issue an Adequacy Decision. Any Adequacy Decision must be monitored and reviewed over time (Art. 45(4)) and can also be repealed, amended or suspended (Art. 45(5)).


Transfers Subject to Appropriate Safeguards

In the absence of an Adequacy Decision, a controller or processor may transfer personal data to a third country or international organisation only where:

  • the controller or processor has provided appropriate safeguards; and
  • on condition that enforceable data subject rights and effective legal remedies for data subjects are available.

These appropriate safeguards can be provided in a number of ways, and some need no specific authorisation from the Supervisory Authority (SA). Art. 46(2) sets down the list of those not needing SA authorisation:

  • a legally binding and enforceable agreement between public authorities or bodies;
  • binding corporate rules in accordance with Art. 47 (more below);
  • standard data protection clauses adopted by the Commission (in accordance with the examination procedure referred to in Art. 93(2));
  • standard data protection clauses adopted by the SA and approved by the Commission;
  • an approved code of conduct pursuant to Art. 40 together with binding and enforceable commitments of the controller or processor in the third country to apply appropriate safeguards;
  • an approved certification mechanism pursuant to Art. 42, together with the same binding and enforceable commitments as above.

Of those listed above, the most common mechanisms are Binding Corporate Rules (BCRs) and Standard Data Protection Clauses. BCRs are “personal data protection policies which are adhered to by a controller or processor established on the territory of a Member State for transfers or a set of transfers of personal data to a controller or processor in one or more third countries within a group of undertakings, or group of enterprises engaged in a joint economic activity” (Art. 4(20)).

Recital 110 advises that a group of undertakings engaged in joint economic activity should be able to make use of BCRs for international transfers from the Union to organisations within the same group, provided the BCRs contain all essential principles and enforceable rights to ensure appropriate safeguards for the transfers of personal data. It is a matter for the competent SA to approve the BCRs, although the Commission may specify the format and procedures for the exchange of information between controllers, processors and SAs in respect of BCRs.

Art. 47(1) sets some pre-conditions on any approval of BCRs. First, they must be legally binding, and apply to and be enforced by every member of the group of undertakings engaged in the joint economic activity. Second, they must expressly confer enforceable rights on data subjects with regard to the processing of their personal data. Third, they must fulfil the requirements set out in Art. 47(2).


The Content of the Binding Corporate Rules

This same Art. 47(2) lays down a comprehensive list of specific requirements for the content of BCRs. It is not within the scope of this blog to enumerate all of these requirements, but they should be examined carefully in the text of Art. 47(2). There is no hierarchy of requirements, but some on their face seem more important than others. A detailed analysis of the requirements for Binding Corporate Rules is laid out in this Art. 29 Working Party document. It is a very comprehensive examination of the requirements and an excellent reference for any query.

Some of the Requirements

To be valid and acceptable, the BCRs must contain the structure and contact details of the group of undertakings/enterprises engaged in the joint economic activity and of its members. All data transfers or sets of transfers, including the categories of personal data, the type of processing and its purposes, the types of data subjects and the identification of any third countries, must be clearly enumerated in the contents of the BCRs. The data protection principles apply, and the rights of data subjects must be expressly recognised, including a right to obtain redress and, where appropriate, compensation for a breach of the BCRs. Controllers or processors must accept liability for any breaches of the BCRs by any member not established in the Union. Other requirements are laid out in Art. 47(2).

Standard Data Protection Clauses

For many organisations these clauses are the most usual mechanism for transferring personal data to a third country or international organisation. They are more common than adequacy decisions, but they represent a minimum standard of data protection, and for this reason it is envisaged (see Recital 109) that controllers and processors will add further safeguards. The clauses must contain the contractual obligations of the ‘data importer’ and the ‘data exporter’ and confirm the data protection rights of the individuals whose data are being transferred. Individuals can directly enforce those rights against either of the parties.

Standard clauses issued under the old Directive remain valid. However, the European Commission has advised the European Data Protection Board (EDPB) that it is planning to update the clauses for the GDPR. The Commission has made available the sets of Standard Contractual Clauses issued to date.

Safeguard Mechanisms Requiring Specific Authorisation

Other mechanisms allow for the transfer of personal data to a third country or international organisation, but these need prior specific authorisation from the SA. Safeguards in these cases may be provided by (a) contractual clauses between the controller or processor and the controller, processor or recipient of the personal data in the third country, or (b) provisions to be inserted into administrative arrangements between public authorities or bodies which include enforceable and effective data subject rights. The consistency mechanism referred to in Art. 63 applies to such authorisations; for example, where an SA aims to authorise contractual clauses, it must communicate that draft decision to the Board (i.e., the EDPB).

Specific Derogations

Where there is neither an adequacy decision available under Art. 45 nor appropriate safeguards pursuant to Art. 46, a transfer of personal data can still take place if one of the conditions set out in Art. 49 is fulfilled. These conditions include:

  • the data subject has explicitly consented to the transfer, having been informed of the possible risks;
  • the transfer is necessary for the performance of a contract between the data subject and the controller, or for the implementation of pre-contractual measures taken at the data subject’s request;
  • the transfer is necessary for the performance of a contract concluded in the interest of the data subject between the controller and another natural or legal person (the foregoing conditions do not apply to activities carried out by public authorities in the exercise of their public powers);
  • the transfer is necessary for important reasons of public interest;
  • the transfer is necessary in order to protect the vital interests of the data subject or of other persons, where the data subject is physically or legally incapable of giving consent;
  • the transfer is necessary for the establishment, exercise or defence of legal claims;
  • the transfer is made from a register which, in accordance with Union or Member State law, is intended to provide information to the public and which is open to consultation either by the public in general or by any person who can demonstrate a legitimate interest.

Apart from explicit consent, in every other case the transfer depends on being deemed ‘necessary’. In practice the conditions are strictly applied and strictly interpreted, with the result that it is preferable to use some other mechanism to transfer data to third countries or international organisations. There is one final option if all other mechanisms or conditions are unavailable.

‘Last Resort’ Transfers of Personal Data

Where a transfer cannot be based on an adequacy decision or appropriate safeguards, including binding corporate rules, and none of the derogations for specific situations applies, a transfer of personal data may still take place only if:

  • the transfer is not repetitive;
  • the transfer concerns only a limited number of data subjects;
  • the transfer is necessary for the purposes of compelling legitimate interests pursued by the controller, provided they are not overridden by the interests or rights and freedoms of the data subject; and
  • the controller has assessed all the circumstances surrounding the data transfer and has, on the basis of that assessment, put in place suitable safeguards for the protection of personal data. In addition, the controller must inform the SA and the data subject of the transfer. Any compelling legitimate interest pursued must be communicated to the data subject, together with all the information required under Arts. 13 and 14.

The Recitals regard this last basis as one to be relied on ‘in residual cases where none of the other grounds for transfer are applicable…’ (Recital 113).

In most cases personal data transfers to third countries or international organisations are routine and uncomplicated. The complicated part is knowing whether those transfers are legally sound. The prudent route is to follow the text of Arts. 45-49 and to stay aware of changes, such as CJEU decisions. Should the UK leave the EU without an agreed deal, the UK will become a ‘third country’ for the purposes of the GDPR and data transfers. In the event of a ‘no deal’ Brexit, data transfers to the UK will have to follow one of the routes described in this blog.

Patrick Rowland,


Schrems case drawing to a close?


So when is it permissible to transfer personal data to a third country or an international organisation? This is a question that has taken on new relevance. The long-running litigation by Austrian lawyer Max Schrems has moved another step towards a final resolution, following a decision of the Supreme Court on May 31st. It has once again brought the legality of transfers of personal data to third countries or international organisations to the forefront of data protection discourse. (Link to Irish Times article here.) Although the Schrems litigation commenced under the old Directive rules, the GDPR has represented the law in this area since May 2018.

A brief overview will place the most recent litigation within its relevant context. That context is the transfer of personal data outside the EU/EEA and to international organisations. More specifically, it has to be viewed in the light of the Safe Harbour Agreement and Standard Contractual Clauses (SCCs). Back in Oct. 2017, Ms. Justice Caroline Costello gave judgment in the High Court, and in May 2018 made a referral to the Court of Justice of the European Union (CJEU) of issues to be determined by that Court. These issues related to transfers using SCCs as the transfer channel. Facebook did not want the referral to reach the CJEU and initiated an appeal on procedural grounds. Facebook’s strategy was to question the process rather than the principles involved.


At the core of that appeal was whether there is an actual right to appeal a referral to the CJEU. In his judgment on Facebook’s appeal, the Chief Justice, Mr. Frank Clarke, held that it is for the referring court, and that court alone, to decide whether to make a reference and whether to amend or withdraw that reference. He was satisfied that it was only in limited circumstances, such as where the facts themselves were not sustainable on the evidence before the High Court in accordance with Irish procedural law, that any aspect of the High Court judgment could be overturned. Facebook was criticising the ‘proper characterisation of the underlying facts’, not the facts themselves, he said.

Ms. Justice Costello had sought clarification on issues going to the validity of the data transfer channels known as Standard Contractual Clauses (SCCs). She had 11 questions that she needed the CJEU to answer concerning the European Commission decisions approving the SCCs in the first place. Whether the measures provided for under Privacy Shield were comparable to the remedy available to EU citizens under Art. 47 of the EU Charter for breach of data protection rights was one point raised by the DPC in the High Court case. Privacy Shield replaced the Safe Harbour Privacy Principles, elements of which formed the basis of complaint for Max Schrems in some of his litigation. For more information on Privacy Shield click here.

We have referred in previous blogs to the notion of balancing data subjects’ rights where their data are being processed. In the context of rights and personal data processing, all rights are taken into account, not just data protection rights. The GDPR was not in effect at the time the Schrems litigation commenced, hence the reference to the EU Charter and, in particular, Arts. 7, 8 and 47. (Article 7 provides that “everyone has the right to respect for his or her private and family life, home and communications.” Article 8 states that “everyone has the right to the protection of personal data concerning him or her,” and mandates that such data must be “processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law.”

Article 8 further provides that “everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified,” and requires that compliance with these rules be subject to control by an independent authority. Article 47 guarantees a “right to an effective remedy before a tribunal” to “[e]veryone whose rights and freedoms [are] guaranteed by the law of the Union.” It also requires a “fair and public hearing within a reasonable time by an independent and impartial tribunal previously established by law.”)

The revelations by Edward Snowden in 2013 gave insights into the massive extent of the interception and surveillance of internet and telecommunications systems by the US National Security Agency. It was not just that these actions were disproportionate; they infringed upon the very right to privacy. At the time of the Snowden revelations, data transfers to the US were governed by the so-called ‘Safe Harbour Agreement’. Despite this agreement, Schrems had concerns about both Facebook’s transfer of his personal data to the US and the processing of those data by American authorities.

A position taken by the DPC was that once an adequacy decision (here, the Safe Harbour Agreement) had been issued, the office had no role in investigating a complaint. Safe Harbour itself stood as testament to the adequacy of the protection of transfers of personal data to the US. Mr. Justice Hogan in the High Court thought Schrems was objecting more ‘to the terms of the Safe Harbour regime itself’ than to the DPC’s application of it (Schrems v DPC [2014] IEHC 310 (18 June 2014), para. 69). This is often referred to as Schrems No. 1.

Another position taken by the DPC was that the complaint was essentially speculative and hypothetical in nature. Mr. Justice Hogan took the view that there was no need to establish that the applicant even had grounds to suspect such a breach had occurred. It was enough to believe the mere absence of controls might lead to a breach of the applicant’s rights. If the matter were solely governed by Irish law, significant issues would have arisen under the constitutional right to privacy. Mr. Justice Hogan referred the case to the CJEU partly on the basis that, ‘in reality, on that key issue Irish law has been pre-empted by general EU law in the area…’ (Schrems, as above, paras. 78-80). In hindsight, this reference to the CJEU was the beginning of the end for the Safe Harbour agreement.

CJEU Case C-362/14 (6 Oct. 2015)

It has to be borne in mind that the case before the Court dates back to the days of Directive 95/46, that is, pre-GDPR. One definitive finding by the Court was that the DPC (or any national supervisory authority), when examining a claim concerning the compatibility of a Commission decision with the protection of the privacy and fundamental rights of an individual, cannot declare the decision invalid itself (of course, neither can the national courts). Where a national supervisory authority, such as the DPC, comes to the conclusion that the complaint is unfounded, the complainant must have, in accordance with Art. 47 of the EU Charter, access to judicial remedies enabling a challenge to be made before the national courts. The court must stay proceedings and make a reference to the CJEU for a preliminary ruling on validity where the court is of the opinion that some grounds for invalidity are well founded. In addition, the national courts themselves can raise issues of their own motion.

In the converse situation, where the supervisory authority (SA) is of the opinion that the objections of a person lodging a complaint are well founded, the SA must put forward those objections in order for a national court to adjudicate upon them. A reference to the CJEU for a preliminary ruling can be made where a national court shares the doubts as to the validity of a decision. The Court ultimately found the Safe Harbour agreement invalid, mainly because the Commission had not made ‘any finding regarding the existence, in the United States, of rules adopted by the State intended to limit any interference with those rights and without referring to the existence of effective legal protection against interference of that kind’. United States authorities were ‘able to process the personal data transferred…and process the data in a way incompatible, in particular, with the purposes for which they were transferred…data subjects had no administrative or judicial means of redress…’ (para. 90). Without appropriate safeguards in place that mirror or match safeguards under EU law, there can be no adequacy.


Later, on 20 Oct. 2015, the proceedings were returned before the High Court, and the decision of the CJEU was implemented by the making of an order setting aside the decision of the DPC not to investigate the original complaint of June 2013. The High Court then remitted the original complaint back to the DPC for investigation. Immediately following the High Court order, Mr. Schrems re-formulated and resubmitted his complaint to take into account the fact that Safe Harbour had been struck down. Having considered the matter, the DPC decided to proceed on the basis of the new formulation. During its investigation, the DPC established that Facebook, like many internet companies, continued to transfer personal data to the U.S. in large part by means of Standard Contractual Clauses (SCCs). These are pro forma agreements which have been approved, by way of certain EU Commission decisions, as providing adequate data protection for the purpose of transferring personal data to third countries.

On 24 May 2016, the DPC issued a draft decision to Schrems and Facebook informing both parties of its preliminary view that the complaint was well-founded; further submissions were invited from both. Three reasons were given by the DPC:

(a) A legal remedy compatible with Article 47 of the Charter is not available in the US to EU citizens whose data is transferred to the US, where it may be at risk of being accessed and processed by US State agencies for national security purposes in a manner incompatible with Articles 7 and 8 of the Charter;

(b) The SCCs do not address the CJEU’s objections concerning the absence of an effective remedy compatible with the requirements of Article 47 of the Charter as outlined in its judgment of 6 October 2015, nor could they; and,

(c) The SCCs themselves are therefore considered likely to offend against Article 47 insofar as they purport to legitimise the transfer of the personal data of EU citizens to the US.

The DPC therefore commenced legal proceedings in the Irish High Court seeking a declaration as to the validity of the EU Commission decisions concerning SCCs and a preliminary reference to the CJEU on the issue. Both Facebook and Mr. Schrems were named, as the joining of these parties affords them an opportunity (but not an obligation) to participate fully, if they so wish, and to make submissions in the case. All of this brings us back to the High Court and the decision by Ms. Justice Costello to make a reference to the CJEU. She had also refused to put a stay on the reference, but Facebook then took the matter to the Supreme Court. As detailed earlier, Facebook’s appeal against the reference has been dismissed by the Supreme Court.

Soon it will be back to the CJEU. As it stands, it will be some time before we know whether the Standard Contractual Clauses at issue will hold up as legally sound channels of personal data transfer, in particular, to the United States. One can hypothesise about the interpretation the CJEU will favour, but whatever it is will have a bearing on future interpretation of the channels of transfer under the new GDPR regime.

In an upcoming blog, we will look through the lens of the GDPR to focus on the means by which personal data can now be legally transferred to third countries and international organisations. Future interpretations will be informed by the final decision of the CJEU on the Standard Contractual Clauses reference that is soon to be in that court.

Patrick Rowland,


Right to Rectification and Principle of Accuracy

The right to rectification and the right of access were (and still are) guaranteed under the Charter of Fundamental Rights of the European Union. Art. 8(2): “Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.” The Charter has applied to the EU since the entry into force of the Lisbon Treaty on 1 December 2009, and so it predates the GDPR. It was Art. 16 of the Treaty on the Functioning of the European Union (TFEU) which imposed on the EU legislature the specific obligation to make data protection rules, and it was this that eventually led to the GDPR.

Art. 16 GDPR sets out the right to rectification in stronger and clearer language. It is a right that is best read in conjunction with the principle of accuracy under Art. 5(1)(d) of the GDPR. As an individual data protection principle, the principle of accuracy stands alone only in the text itself; it is intertwined with all the other principles to form a greater whole. Article 5(1)(d) states that personal data shall be “accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’)”. Let us remember that Art. 15, the right of access, is often the starting point for other requests. It is this article that facilitates other rights, because it gives the data subject the right to obtain confirmation of whether his or her data are being processed in the first place. If personal data are being processed, the data subject can have inaccurate data rectified or have incomplete data made complete. Sometimes this is best achieved, and facilitated, by means of a short supplementary statement. The right to rectification under Art. 16 can therefore work in two ways: 1) by rectifying inaccurate data; 2) by completing incomplete data. In the case of Max Schrems (Case C-362/14, 6 Oct. 2015), one of the defects identified by the CJEU was that there “was no means of enabling the data concerning the data subject to be accessed and, as the case may be, rectified or erased” (at para. 90). The starting point is, again, the knowledge that personal data are being processed in the first place.

Under the old Data Protection Acts (‘the acts’), many complaints were received and processed concerning the right to rectification and the right to erasure of inaccurate data. Not many cases have come up for scrutiny since the introduction of the GDPR, but future case types will likely mirror some from the pre-GDPR days. Sometimes looking back can act as an accurate guide to what may occur in the future. Below are some interesting cases that contain scenarios and circumstances that could resurface. They will give a taste of the substance of the right.

EMI Records v The Data Protection Commissioner [2012] IEHC 264

This case was a leading case on the processing of inaccurate personal data and went to court under the old acts.

Brief Facts: Eircom, a telecommunications provider, had been operating a scheme whereby recording companies detected those who were uploading their copyrighted music and video on the internet. The recording company passed on information consisting of copyright title, time and temporary IP address to Eircom. Eircom then wrote to its subscribers reminding them that downloading copyrighted material was in breach of their subscriber contract. Those who continued illegally downloading would have to find a new telecommunications provider, as Eircom would no longer provide them internet service.

In October 2010 Eircom failed to adjust its systems’ clocks for the end of summer time. As a result, it wrongfully identified some people as illegally downloading when they were not. The DPC issued an enforcement notice at the time directing that it cease its activities. The case gives a sense of what might be viewed as inaccurate processing of personal data. In this instance, the practice ceased, despite the fact that the judge in the High Court found the enforcement notice from the DPC contained ‘no reasons whatsoever’ and ruled it invalid.
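The Eircom incident shows how a simple clock error becomes a data accuracy problem. As a hypothetical sketch (not Eircom’s actual system), the snippet below shows how an event logged by a machine whose clock was never moved back at the end of summer time ends up an hour adrift of records kept in correct local time; matching the stale timestamp against session records points at whoever was online an hour later. Timestamps kept in UTC avoid the ambiguity entirely.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical: a detection system logs local wall-clock time, but its
# clock was never moved back when Irish Summer Time (UTC+1) ended (UTC+0).
stale_offset = timezone(timedelta(hours=1))   # clock still on summer time
correct_offset = timezone.utc                 # real local offset after change

# The actual moment a download was detected, recorded unambiguously in UTC.
event_utc = datetime(2010, 11, 1, 21, 30, tzinfo=timezone.utc)

# The naive wall-clock time the mis-set system writes to its logs...
logged = event_utc.astimezone(stale_offset).replace(tzinfo=None)
# ...versus the correct local wall-clock time of the event.
actual_wall = event_utc.astimezone(correct_offset).replace(tzinfo=None)

# The log is an hour ahead: matching it against subscriber-session records
# kept in correct local time identifies the wrong subscriber.
assert logged - actual_wall == timedelta(hours=1)
```

The design point is that identification evidence should be stored with an explicit offset (or in UTC), so that a wall-clock convention change cannot silently shift who an IP address resolves to.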

Smeaton v Equifax plc [2013] EWCA Civ 108 (20 Feb. 2013)

This case throws up some interesting issues and perspectives on the concept of accuracy. The defendant was a UK credit reference agency whose database indicated the plaintiff was subject to a bankruptcy order. In fact, the order had been made at first instance, but was stayed upon appeal and then rescinded. The plaintiff claimed for losses and damages resulting from the inaccuracy. Initially the claim was successful, but it was overturned on appeal.

What makes the case unusual is that the plaintiff had acted as a lay litigant in challenging the bankruptcy order. Generally, in cases such as this, a solicitor would represent the litigant and inform the credit registry that the client had been discharged. Smeaton’s argument was that Equifax should have been aware of the discharge, notwithstanding his self-representation. Again, it has to be stressed that this was an unusual case decided on its own particular facts. The Court recognised that the old UK Data Protection Act 1998 did “not impose an absolute and unqualified obligation on Credit Reporting Agencies to ensure the entire accuracy of the data they maintain. Questions of reasonableness arise”.

An important consideration when questioning certain rights, if not all rights, under the GDPR is to realise that the extent of a right, and the degree to which it may be vindicated, may in the more contentious cases go all the way to the final arbiter, the CJEU. Proportionality and the balancing of rights are paramount under EU law. It is only when a case reaches this forum that jurisprudential reasoning truly asserts itself; the CJEU will define the parameters and specific meanings of the words in the legislative text. Even though Smeaton v Equifax goes back to 2013, it is still good authority for the proposition that controllers are not under an absolute duty to ensure the accuracy of their data.

Thirty cases in relation to the right to rectification were cited in the first annual report of the DPC since the introduction of the GDPR, available here.

Case Study 3/2018.

Again, this is one of the recent cases from the DPC Annual Report 2018 that highlights the close relationship between the accuracy principle and the right to rectification. The DPC received a complaint from a Ryanair customer whose webchat details were erroneously sent to another Ryanair webchat user. Of course, issues of integrity and confidentiality come into play also. On the date in question, the data processor received requests from four Ryanair customers for transcripts of their webchats, all of which were processed by the same agent. However, the agent did not correctly change the recipient email address when sending each transcript, so they were sent to the wrong recipients. Included among the recommendations were that recipient email addresses be checked to ensure accuracy and that the autofill function in the software be used with extreme caution. Ryanair subsequently informed the DPC that the autofill function in its live webchat system had been disabled by its data processor.

Perhaps it is due to the nature of the business, and a strong desire for expediency, that credit reference agencies have historically been disproportionately involved in breaches, compared to other businesses. We can look at a few of the more interesting ones.

Case Study 2/1997

This complaint concerned the combination of data about two different people in the database of a credit reference agency. Human error was at fault: the two individuals lived in the same area and had the same name. At the time, the credit reference agency had a policy of matching up similar data. A particular financial institution was supplying personal data to the agency, and the two records became intermingled. The DPC upheld the complaint.

Case Study 6/1999.

A principle seldom becomes obsolete unless legislative action deems it so. At issue here was a problem that remains common in the context of personal data processing. The complainant had repaid a loan, but the credit reference agency’s files showed the loan as in default. For clarity, we are still talking here about provisions under the ‘old acts’, but, as was found in this case, not keeping records “up to date” would equally be a breach under the GDPR.

Case Study 8/1997

A credit reference agency’s records showed that the complainant had had a loan written off. That was correct. It also stated that litigation was pending for the non-payment of the loan. This part of the record was incorrect. No action was pending. As a result of the investigation, the DPC found the record held, “was inaccurate in stating that litigation was pending”. This case shows that even though the agency had some factually correct personal data, and few would advance monies to the complainant on the basis of the default, there was an inaccuracy in their records.

Case Study 6/1999

Inaccurate credit rating assessments of a complainant gave rise to this case. Three loans had been taken out by the complainant, and all three had been fully paid off. However, the agency wrongly recorded one as still outstanding. What was stated by the DPC remains true: there is a “clear and active obligation on data controllers to ensure that data is kept accurate and up to date”. The concept of ‘reasonableness’, referred to above in Smeaton v Equifax, is an abiding one.

Case Study 12/2009.

Here the results of a paternity test, a very sensitive matter, were sent to the wrong address. They were read by the complainant’s neighbour, who now knew that his neighbour was not the father of child X.


Case Study 18/2009

What happened here was that a court summons was incorrectly served: it went to the wrong person. This was another case in which correspondence ended up at a neighbour’s house, something most of us would naturally prefer not to happen.

Recently (30 April 2019), the DPC issued an examination of right to rectification complaints, accessible here. At its core is an attempt to clarify aspects of the right to rectification. As we mentioned above, there is a strong relationship between the right to rectification and the principle of accuracy. The DPC notes that “Individuals have a right to rectification of their personal data under data protection legislation. What the right to rectification means in practice will depend on the circumstances of each case and the Data Protection Commission (DPC) examines each case that comes before it on its individual merits.” In practice, this means that all data controllers will be required to take all reasonable steps to ensure the accuracy of personal data, taking account of the circumstances of the processing, the nature of the personal data and, in particular, the purposes for which they are processed.

“In respect of complaints received by the DPC in relation to the recording of a name without diacritical marks, e.g. the síneadh fada in the Irish language, consideration has to be given, in light of Article 5(1)(d) and Article 16 GDPR, to whether the recording of a name without diacritical marks is deemed to be inaccurate, having regard to the purposes for which the data (in this case, a data subject’s name) are processed”. This is a reference to the Ciarán Ó Cofaigh case reported in the Irish Times here. What if a John Coyle (with an excellent credit rating) had credit record details that identified him as a John Boyle with a poor credit rating? Is there really a difference between a mistaken letter in a person’s name and a missing fada, especially where the omission or mistake can result in a detriment to the data subject? (Or, in this case, is it discrimination against a Gaeilgeoir?) Your name is either correct or not correct, and this is not a hair-splitting exercise. Simple mistakes happen, but they must be rectified before there is a detriment to the data subject.

“In a related context, the European Court of Human Rights has concluded that the omission of diacritical marks from a person’s name in certain official documents did not entail a breach of the right to private and family life guaranteed under Article 8 of the European Convention on Human Rights: see, for example, Šiškins and Šiškina v Latvia (Application no. 59727/00, 8 November 2001).” Expect more related cases, but under the GDPR these will be going to the CJEU.
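At the data level, the fada question is easy to demonstrate. The sketch below (illustrative only, using Python’s standard library and the name Ciarán from the case above) shows that a name with and without a fada are simply different strings, while the same accented name can also arrive in two different Unicode encodings that should be normalised before any comparison, so a system does not manufacture a spurious mismatch of its own.

```python
import unicodedata

# A fada is a distinct code point: these are different names as far as
# any record-matching logic is concerned.
assert "Ciar\u00e1n" != "Ciaran"

# The *same* accented name can be encoded two ways: 'á' as one composed
# code point, or as 'a' plus a combining acute accent.
composed = "Ciar\u00e1n"      # 6 code points
decomposed = "Ciara\u0301n"   # 7 code points, renders identically

# Raw comparison wrongly treats them as different people...
assert composed != decomposed

# ...so normalise (here to NFC) before comparing stored names.
nfc = lambda s: unicodedata.normalize("NFC", s)
assert nfc(composed) == nfc(decomposed)
```

The design point: dropping the fada is a genuine inaccuracy in the record, whereas a composed/decomposed difference is only an encoding artefact that normalisation resolves.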

Patrick Rowland, GDPRXpert.ie

Data Protection Consultants GDPRXpert, based in Carlow/Kilkenny and Mayo, provide a nationwide service.

Visit GDPRXpert.ie to learn more.


DPC and Facebook Square off Again.

In our blog of March 21st we gave a general overview of some of the problems facing Facebook, most notably in the U.S. and involving various regulatory bodies. At that time we alluded to pending trouble on this side of the pond. In the immortal words of the legendary American baseball player Yogi Berra, it seems very much like “it’s déjà vu all over again”. Reports from the office of the DPC concerning developments in its investigations would seem to bear this out.

In the same blog, in reference to the first Annual Report of the new DPC, we had pointed out the substantial number of data breaches reported by multinationals. Facebook was one of those multinationals, and the Facebook token breach became subject to a statutory inquiry in September last year. Now a report confirms that Facebook, or one of its subsidiaries, has had 11 statutory inquiries initiated against it by the office of the DPC over varying periods. (See the full article by Adrian Weckler, Technology Editor, Irish Independent.) It is a confrontation that seems endless.

In the Left Corner, Weighing in at...

As part of an ongoing investigation by the US Federal Trade Commission into its privacy practices, Facebook now expects to pay between $3bn and $5bn. Political consulting company Cambridge Analytica had improperly obtained the Facebook data of 87 million users and used the data to build tools that helped Trump’s campaign in 2016. (For more details, refer to our previous blog on the American investigations here.) At the centre of the current probe is the admission by Facebook, in its notification to the DPC, that millions of passwords were stored in totally unsecure plain-text format. Facebook had discovered “that hundreds of millions of user passwords, relating to users of Facebook, Facebook Lite and Instagram, were stored by Facebook in plain text format in its internal servers,” said a statement from the Irish DPC.

Dangerous Tactics

Storage of passwords in this manner leaves them especially exposed to anyone with access to certain internal services. It is always recommended, and good practice, to store passwords in a hashed format, allowing websites to confirm what you are entering without actually reading it. Normal practice is for a password to be ‘hashed’ and ‘salted’, which in Facebook’s case includes using a function called “scrypt” as well as a cryptographic key.

In cryptography, a ‘salt’ is random data used as an additional input to a one-way function that ‘hashes’ data, such as a password or passphrase. This allows the data security team to irreversibly replace a user’s actual password with a random-looking set of characters. With this procedure, a user logging in can be validated as having entered the correct password, without any need to store the password in plain text. Hardly something to be considered ultra high-tech or ‘rocket science’ for the average IT and data security team!
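As a minimal sketch of the salt-and-hash approach described above, using the scrypt function from Python’s standard library (the cost parameters here are common illustrative values, not Facebook’s actual configuration):

```python
import hashlib
import hmac
import os

def hash_password(password: str):
    """Return (salt, digest); only these are stored, never the password."""
    salt = os.urandom(16)  # fresh random salt per user
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-hash the login attempt with the stored salt and compare digests."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # constant-time comparison avoids leaking information via timing
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Because each user gets a unique salt, identical passwords produce different digests, and an attacker who copies the database still faces a deliberately slow, memory-hard function per guess.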

The Bell for the End of Round 11.

Somewhat surprisingly, Facebook’s ‘bottom line’ does not seem to be suffering as badly as analysts had been predicting. Sales were up 26% for the first quarter of 2019 to close to $16bn. User numbers also increased, but at a lower rate of just 8%. Market analysts also got Facebook share price expectations wrong. In the year to date, Facebook shares have risen 40%, outperforming much of the wider market. It still has 2.38 billion account holders. Ultimately, much could change when the results of all the investigations become public knowledge. What will the public’s perception of Facebook’s privacy and data protection policy be when all investigations conclude?  Negative public sentiments have so far not affected Facebook’s bottom line. People are creatures of habit and change can sometimes be excessively challenging and inconvenient.

These investigations are warning signs for the company and investors alike. Maybe Rick Ackerman’s insight is more prophetic than speculative: “Even the rabid weasels who drive the company’s shares wildly up and down for fun and profit must be sensing by now that Facebook is no longer cool (think AOL) and that the company has seriously depleted its store of goodwill”. (Strong words, but a UK parliamentary report found Facebook had behaved like ‘digital gangsters’.)


Winners and Losers?

In a post back in March, Mark Zuckerberg stated, “I believe the future of communication will increasingly shift to private, encrypted services, where people can be confident what they say to each other stays secure, and their messages and content won’t stick around forever. This is the future I hope we will help bring about.” Facebook’s business practices will have to go through at least an ethical overhaul. At present, the company relies on a $55bn advertising revenue stream that comes from products and services without end-to-end encryption; they are not private in any substantive, measurable way. Yet if the business model is also under increased pressure from data protection and privacy regulators in different jurisdictions, then in theory at least, it must change sooner rather than later.

If this model is to be replaced, the question is what form its replacement will take. Most analysts suggest that WhatsApp and Messenger are the future, because Facebook’s data show that is where people are increasingly spending their time. If people move to private messaging apps with high levels of encryption, as Zuckerberg stated is the future and their policy, Facebook will still need data with which to target people with ads. What will be the source of that data? More relevant is whether it will be gathered and used in accordance with the GDPR and data protection legislation. Will the business model stay substantially the same, just delivered by different vehicles? Will it be ‘free’? A subscription model is unlikely, because how many people will actually pay for ‘likes’, and to interact with their ‘friends’? (No ads, but…)

Future Re-Match

It is conceivable that for many years to come, as one inquiry ends, another starts. Maybe it will be a case of, “the more things change, the more things stay the same”. Cost-benefit analyses done by Facebook may be adjusted once the fines begin to mount up. One certainty is that Facebook will not be allowed to disregard the GDPR and privacy legislation in numerous jurisdictions. Thankfully, for data protection and privacy advocates, the office of the DPC is committed to its mission. It is seriously ‘punching above its weight’.  Facebook will find, like many before it, that its financial resources do not afford it special treatment, or confer special status in the eyes of the law.

P.S. For another angle on the subscription ideas see

Patrick Rowland,

We are Data Protection consultants, based in Carlow/Kilkenny and Mayo, offering a nationwide service.

Visit to learn more


The GDPR and Elections on the Horizon

What with the never-ending Brexit saga and the continuing toxic political stalemate, it may not be popular to delve into any topic with political associations. Nevertheless, this is the intention, and it is one that is primarily inspired by the upcoming European and Irish local elections. Both sets of elections have a novel element, to the extent that they will be the first Euro and local elections to take place since the introduction of the GDPR and The Data Protection Act 2018.

Key actors on this changed stage include a number that may play the role of data controller. In previous blogs and on our website, we have seen how the notion of accountability of controllers and joint controllers is a central feature of the GDPR. Individual election candidates, political parties, data analytics companies and public authorities responsible for the electoral process can all act as controllers. It is not within the scope of this blog to discuss all facets, and so the less ambitious plan is to look at election candidates in the light of canvassing-related activities. Data protection issues arise whenever personal data is being collected, and at election times it is collected in different forms. Canvassing door to door, direct mail marketing and electronic direct marketing may all raise concerns. More data protection issues surface in relation to requests for representation. Inevitably, organisations that receive these requests must also come under the same scrutiny.


Getting Started.

The focus of this article lies within the confined context of elections and electoral activities, and the application of Union and national law within this defined landscape. Micro-targeting of voters through unlawful processing of personal data is still fresh in people’s minds following Cambridge Analytica and other similar disclosures. A starting point is to acknowledge, as stated by the UK’s Information Commissioner’s Office (ICO), that “engaging voters is important in a healthy democracy, and in order to do that, political parties, referendum campaigners and candidates will campaign using a variety of communication methods. However, they must comply with the law when doing so; this includes the handling of the personal data that they collect and hold”. (ICO, Guidance on Political Campaigning; more details here). This is especially true since the inception of the GDPR, the Data Protection Act 2018 and the e-privacy rules.

The Very Basic Ground Rules.

Individuals now have enhanced rights, and these are strengthened and particularly relevant in the electoral context. These rights place onerous responsibilities on candidates seeking office. Primary responsibilities fall on the candidates and, where that is the relationship, on the affiliated parties. Public authorities also have responsibilities under the GDPR and the various Electoral Acts. Whoever processes personal data previously viewed as purely mundane, such as names and addresses, must now pay more attention. Simple names and addresses represent ‘personal data’ under the GDPR. Processing of such data must now be done lawfully, fairly and in a transparent manner, and for a defined, specified purpose. Another limitation (purpose limitation) means the personal data are now less likely to be strategically stored with ulterior motives in mind. Data cannot be further processed in a manner incompatible with the purposes for which the data were initially collected. (Note: there are a few strict exceptions to this rule.) There must be some lawful basis for personal data processing, and all data protection principles must be followed without any ‘cherry-picking’.

Pre-election days.

Most of us are well used to the barrage of literature that ends up on our hall floors in the run-up to elections. S.39 DPA 2018 specifically allows for the use of personal data for the purpose of communication in writing (including a newsletter or circular) with the data subject. It is qualified and limited to ‘specified persons’, namely: a political party; a member of either House of the Oireachtas, the European Parliament or a local authority; or a candidate for election to the office of President of Ireland or for membership of any of the above. Here the DPA 2018 provides a lawful basis for processing. There is a useful guidance booklet available at the DPC website. Section 59 DPA 2018 expressly modifies Art. 21 GDPR, so that there is no right to object to electoral direct marketing by post. When communicating by text, e-mail, phone or fax, a candidate must have the prior consent of the constituent. If contact is then lawfully made, it must be clear about its origin and must incorporate an easy opt-out.

Frantic times precede elections and the normal rules still apply, but it is probably unrealistic for the office of the DPC to expect candidates to include the amount of information described in its guidance booklet with their canvassing materials. Even more unrealistic is to expect the same information to be given by a candidate going door to door; time constraints make this impractical. It will be interesting to hear from candidates after the elections about how the new regulations and the DPA 2018 affected their campaigns. Just as interesting will be any feedback from constituents concerning the information they were given by candidates regarding data protection rights. Under Art. 9 GDPR there is a general prohibition on the processing of special categories of personal data, but there are exceptions to the rule. Section 48 DPA 2018 expressly permits one of these special categories of personal data (that revealing political opinions) to be processed. Provided safeguards are taken to protect the data subject’s fundamental rights and freedoms, such data can be processed in the course of electoral activities by a political party, a candidate for election to, or a holder of, elective political office in the State. (Note: this applies to the Referendum Commission also.) The specific purpose is the compiling of data on people’s political opinions. Section 48 DPA 2018 provides a lawful basis.


Requests for Representation

In the course of canvassing at election time, candidates receive numerous requests for representation regarding access to services or the provision of services. While such requests are genuine, they represent a higher-level test of a candidate’s knowledge or ability, as perceived by the voter, in the run-up to the election. If a voter gets a favourable and swift response, the chances are the candidate will also get a vote. It is only proper that no short cuts are taken to get this information quickly before election day. When we speak of ‘candidates’, it is important to distinguish between candidates who are current officeholders seeking re-election and those running for election who do not currently hold any office. All elected representatives should be aware of, or quickly become familiar with, S. 40 Data Protection Act (DPA) 2018. In fact, they should be more conscious of, and extra vigilant in, their responsibilities, because the number of requests increases exponentially at election times. Being knowledgeable on S.40 potentially benefits the representative’s reputation, and passing on the relevant specifics of that knowledge to the constituent will deliver benefits to both parties. After the introduction of the GDPR, the office of the DPC issued interpretive guidelines on S. 40 DPA 2018. Data protection consultants GDPRxpert have the document available here. We will now look at some of the main points from the guidelines.


Some Guidelines on Requests for Representation.

Sections 40(1) and (2) DPA 2018 give elected representatives the legislative basis for processing the personal data of constituents. This includes the special categories of personal data under Art. 9 GDPR. Processing is allowed where the elected representative either receives a request for representation directly from the data subject, or receives a request for representation from another person on behalf of the data subject. In all cases, the elected representative must be able to demonstrate compliance with the principles of data protection. At a minimum, representatives are obliged to meet their transparency responsibilities set out in the GDPR (especially Arts. 12, 13 & 14). An elected representative has an obligation to be certain at all times that they are acting upon a request from the voter.


There will be many situations where “the permission can be implied from the relevant action or request. For example, the raising of the matter by an individual will create an expectation that their personal data will be further processed by the elected representative and other relevant organisations”. (DPC Guidelines 2018, p.4) A normal expectation is that personal data will be processed. As part of the representative’s request for information, the local council, for example, will disclose the personal details necessary to satisfy the request. However, best practice is to be sure that the constituent is aware of the likely processing, and we recommend a signed consent form. In many instances a formal, signed consent form may not be practical. Contemporaneous, detailed notes should be taken by the representative; the DPC suggests these as a good record to demonstrate compliance with S.40 DPA 2018.

If any unexpected processing becomes necessary, it is advisable to revert to the constituent. An elected representative must be careful not to go beyond the specified purposes for which the consent was given. One recommendation from the DPC is that elected representatives should use Privacy Notices when they collect personal details from people, and have a Privacy Notice on their website. All notices should meet the transparency requirements and “satisfactorily address the requirements set out in Articles 12, 13, 14 & 30 (where relevant) of the GDPR and also should be clear, accessible and informative to help people understand what will be done with their personal information”. (Office of the DPC, 2018) Simple, best advice: be straight with people on all aspects. Following data protection principles will operate to safeguard both the constituent and the elected representative.


Requests From Someone on Behalf of Someone Else

In this scenario, all parties should be extra cautious. Here a request is being made by one person on behalf of another. Common situations include son/daughter on behalf of one or both parents; some family member on behalf of another family member; relative on behalf of another relative; neighbour/friend on behalf of another neighbour/friend, etc. It is no longer sufficient to take the word of one party and accept the bona fides.  Therefore, the elected representative will have to ensure that the individual making the request has the authority from the person whose personal data will be processed on foot of the request. This is a potential minefield and the onus lies on the representative to “demonstrate the data subject has consented to the processing of his or her personal data” (Art.7 (1) GDPR).

Trust is no longer a reliable basis on which to proceed with such a request. Other aspects that merit detailed attention include the competency of the data subject and the legal standing of the person making the request.  For example, is there an enduring power of attorney to manage the affairs of the data subject? Any prudent representative should strive to have a signed consent form provided. Failing that, it will be a decision for the representative whether or not to make a representation.  Where the representative has not been able to fully ascertain the wishes of the individual prior to processing of personal data, he or she should have set out and recorded the specific steps taken to ascertain those wishes. Such records will stand as evidence of reasonable efforts having been made.  This will be crucial at the time the representation is made to the appropriate organisation.


Disclosure by an Organisation following a Request under S.40 DPA 2018.

Written Requests

As noted earlier, there is an exponential increase in the number of requests for representation as election time approaches. Section 40(4) DPA 2018 gives an organisation the legal basis to respond to, and process personal data on foot of, a representation from an elected representative. In doing so the organisation must demonstrate compliance with all the data protection principles under Art. 5 GDPR. A precondition is that the disclosure is necessary and proportionate to enable the representative to deal with the request, and that the safeguards referred to in S. 36 DPA 2018 are taken. Special categories of personal data may be processed by the organisation under S. 40(4). Where the organisation receives a written representation on foot of S.40, it can assume the constituent has given permission. In other words, it can accept the bona fides of the representative while at the same time satisfying itself that it is reasonable to assume the individual would have no objection to the release of the personal data.

Verbal Requests

With verbal representations from an elected representative to an organisation, it is advisable that a staff member of the organisation logs the appropriate details. Where the elected representative is present when the representation is made, it is good practice to have him/her sign a short form confirming the details. Best practice is for the organisation to have policies and privacy notices in place that outline how it deals with requests. Ultimately, the organisation decides whether to accede to the requests made. In particular, the organisation must ensure it meets its responsibilities under Arts. 12, 13, 14 and 30 GDPR. Any disclosure must be only what is necessary and proportionate in its impact on the fundamental rights of the individual. An organisation must consider the potential impact and negative implications of any representation and take safeguards to mitigate any risks.

Mitigating Risks

Risk mitigation must reach a higher level of security in the context of special categories of personal data, which are by their nature sensitive. Extra safeguards are advisable where the representation has been made on behalf of the data subject by another individual owing to incapacity or age. Where the personal data falls within the special category class, any safeguards must be strengthened. It must always be borne in mind that reliance on S.40(4) DPA 2018 as a legal basis to disclose data on foot of a representation is dependent on certain conditions being met in advance: any processing must be necessary and proportionate, and suitable measures must be taken to protect the individual’s rights and freedoms. If an organisation acting on foot of a representation has any concern about the level of awareness, on the part of the representative or the individual, of the sensitive nature of the personal data, it would be prudent to refer back to both. It is only proper that the individual is fully aware of the implications that will follow the processing of their personal data as a consequence of the request. Both the nature and purpose of the request will influence the actions taken. For example, some requests may be time-sensitive, and getting explicit consent may not be practical. The DPC advises that a common-sense approach be taken.


Personal Data of Third Parties

As a general rule, it is not permissible to process the personal data of third parties under S. 40 DPA 2018; it is allowed only in very limited circumstances. If a third party has not been involved in a request for representation, processing of that third party’s personal data will not be permissible unless one of the following applies:
  • the third party cannot give explicit consent;
  • the processing is necessary in somebody else’s interests and explicit consent has been “unreasonably withheld” by the third party;
  • the balance favours the disclosure in the public interest;
  • the elected representative “cannot reasonably be expected to obtain” the third party’s explicit consent;
  • seeking the third party’s explicit consent would “prejudice the action taken by the elected representative”.

Other Considerations Re Special Category Data

Earlier we noted how S. 48 DPA 2018 allowed for the processing of one category (personal data revealing political opinions) within the ‘special categories’ grouping. However, S. 40(1) DPA 2018 allows the general processing of personal data within these special categories. In processing such categories, the elected representative must “impose limitations on access to that data to prevent unauthorised consultation, alteration, disclosure or erasure of that data” (S.40(3) DPA 2018). In conjunction with these limitations, suitable and specific measures that take on board the provisions of the Data Protection (Access Modification) (Health) Regulations (S.I. No. 82/1989) and the Data Protection (Access Modification) (Social Work) Regulations (S.I. No. 83/1989) should be considered, as these remain in force under S. 58 DPA 2018. Both apply equally to the elected representative and to the organisation receiving the representation. These regulations provide that health data relating to an individual should not be made available in response to an access request if that would be likely to cause serious harm to the physical or mental health of that individual.

If a person is not a health care professional, he or she should not disclose health data to an individual without first consulting that individual’s own doctor, or some other suitably qualified healthcare professional. Where it has been deemed appropriate to disclose such information to an elected representative it should include a warning in regard to the sensitive nature of the data. The elected representative will need to apply safeguards outlined in S. 40 (3) DPA 2018. Finally, in relation to processing of personal data that involves criminal convictions or offences (Art. 10 data), any disclosure on foot of a representation will necessitate an assurance from the representative that explicit consent has been obtained for the request.

Much of the foregoing is evidence of the complicated nature of data protection in the context of electoral activities. A high level of awareness is expected from elected representatives and from the organisations that receive representations from them. Once the relevant information is provided by the elected representative, the decision should be common-sense based. The Office of the DPC believes any refusal by the organisation should be easily explained by reference to S.40 DPA 2018, without citing data protection requirements as a general ground for refusal. Where the organisation has followed S.40(4) DPA 2018, the GDPR and the data protection principles, and implemented suitable and specific safeguards, it should be confident it has acted in compliance with the DPA 2018.

Patrick Rowland,

Data Protection Consultants, with bases in Carlow/Kilkenny and Mayo, offer their expert service nationwide.

Visit to learn more.


More Problems for Facebook


Facebook is no stranger to controversy, especially over the last year. In our most recent blog, on the first annual report of the new DPC, we pointed out the substantial number of data breaches reported by multinationals. Facebook was one of those multinationals, and the Facebook Token breach became subject to a statutory inquiry by the office of the DPC in September last year. Now, in the US, federal prosecutors are conducting an investigation into data deals Facebook struck with some of the world’s largest technology companies (NY Times, March 13, 2019).

Grand Jury Investigation.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices. Partnerships with Facebook gave these makers very broad access to the personal information of possibly hundreds of millions of Facebook users. This had been going on for years, and operated to allow the makers, along with companies such as Microsoft, Apple, Sony and Amazon, to see users’ friends’ contact information and other data, most often without any consent. These agreements were previously reported in The New York Times. (Link to original article here.) Most of the partnerships have now been phased out. However, while they were in operation, the partnerships effectively gave these companies a blanket exemption from the usual privacy rules.

Hundreds of pages of Facebook documents were obtained by The New York Times. These records, generated as far back as 2017 by the company’s internal system for tracking partnerships, provided the most complete picture yet of the social network’s data-sharing practices. The exchange was intended to benefit everyone. Facebook got more users, boosting its advertising revenue, and partner companies acquired features that made their products more attractive. For example, the records show that Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, and gave Netflix and Spotify the ability to read Facebook users’ private messages. Facebook users connected with friends across different devices and websites, reaping benefits for Facebook, which had engineered extraordinary power over the personal data of more than 2.2 billion users. Prior to the GDPR, even in Europe, this power was exercised with a shameless lack of transparency and a dearth of substantive oversight.

Other investigations.

The latest grand jury inquiry comes against the backdrop of the Cambridge Analytica scandal, in which the political consulting company improperly obtained the Facebook data of 87 million users and used it to build tools that helped Trump’s campaign in 2016. This is part of an ongoing investigation by the Justice Department’s securities fraud unit. All along, Facebook’s position was that it had been misled by Cambridge Analytica, and had believed that the data were only being used for academic purposes. “In the furore that followed, Facebook’s leaders said that the kind of access exploited by Cambridge in 2014 was cut off by the next year, when Facebook prohibited developers from collecting information from users’ friends. But the company officials did not disclose that Facebook had exempted the makers of cell phones, tablets and other hardware from such restrictions” (NY Times, June 3, 2018). Nevertheless, some of the fine print on a quiz app that collected the data, which Facebook deleted back in 2015, was evidence that the company knew about the potential for the data to be used commercially.

Facebook’s Wheeling and Dealing.

The pervasive nature of some of the deals that Facebook initiated becomes clearer when, for example, the evidence shows that one deal empowered Microsoft’s Bing search engine to map out the friends of virtually all Facebook users without their explicit consent, and allowed Amazon to obtain users’ names and contact information through their friends. Apple was able to conceal from Facebook users any indicators that the company’s devices were even asking for data (NY Times, March 13, 2019; see link at top of blog). This demonstrates the covert level involved. An investigation still in progress gives an insight into the corporate psyche behind the business model that Facebook is proud to espouse. Facebook entered a data-sharing consent agreement with the Federal Trade Commission in 2011, under which it was barred from sharing user data without explicit consent. However, the agreements Facebook concluded benefited more than 150 companies — most of them tech businesses, including online retailers and entertainment sites, but also automakers and media organizations. Their applications sought the data of hundreds of millions of people a month. The deals, the oldest of which date to 2010, were all active in 2017, and some were still in effect in late 2018 (NY Times, Dec. 18, 2018).

The Spin.

Facebook’s spin was that the companies it entered into agreements with were ‘extensions of itself’ and therefore not subject to the specific data-sharing rules. After all, one can’t really share a secret with oneself! The service providers were just partners that allowed users to interact with their Facebook friends. Facebook dismissed the notion that it stood to gain substantially from the arrangements, despite admitting that it had not really policed the activities of its partners. Data privacy experts are rightly sceptical that a regulator as thorough as the Federal Trade Commission would view these businesses as ‘alike’. With its experience, the FTC is hardly going to consider businesses as varied as device makers, retailers and search companies to be so alike as to be exempt from the regulation. It seems this was Facebook’s opinion. But Ashkan Soltani, former chief technologist at the Federal Trade Commission, saw it as nothing more than a ruse, stating, “The only common theme is that they are partnerships that would benefit the company (Facebook) in terms of development or growth into an area that they otherwise could not get access to”.


In summary, Facebook has trouble on quite a few fronts: the original Cambridge Analytica investigation now involves Facebook being investigated by both the FBI and the securities fraud unit of the Justice Department; the Federal Trade Commission is close to finalising its investigation into possible violation of the consent agreement (multi-billion-dollar fines are anticipated); the Justice Department and the Securities and Exchange Commission are investigating Facebook; and the U.S. Attorney’s Office for the Eastern District of New York is heading a criminal investigation. (Remember, at the moment we are not even talking about Europe and the GDPR!) The signs are ominous; expect to hear more from us, and others, on Facebook’s problems in the near future.

On March 19, Rep. David Cicilline (D-RI), head of the House Judiciary Committee’s antitrust subcommittee, called for the FTC to investigate Facebook on anti-monopoly grounds.

Patrick Rowland,

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit




DPC Issues Annual Report

The DPC’s first annual report since the GDPR has just been released. It is not surprising to observers of developments in the data protection field that at the outset the report remarks, “it is the rise in the number of complaints and queries to data protection authorities across the EU since 25 May 2018 that demonstrates a new level of mobilisation to action on the part of individuals to tackle what they see as misuse or failure to adequately explain what is being done with their data”. (DPC Report, 2018) It is fair to say that pre-GDPR there was much hype and alarm, and this amplified the closer it came to D-Day, May 25th, 2018. Things have changed somewhat since then and if, “we understand something about the GDPR, it is this: it will be a process of dialogue that lasts many years and the dialogue will need to shift and change with technology, context, learning from evidence (including emerging case law) and evolving societal norms.” (DPC Report, 2018)

We spoke in an earlier blog, and we allude to it on this website, about some misinformation and disinformation that unfortunately increased the sense of alarm and panic pre-GDPR. After May 25th there was more.  It seems the hairdresser who cited GDPR as the reason she could not give her customer details of the hair dye she was using in her customer’s hair is the favourite GDPR myth within the office of the DPC. By the way, the hairdresser’s customer was leaving to go to another hairdresser and wanted to be able to tell the new hairdresser what colour went in her hair, but we can be sure that this had nothing to do with the hairdresser’s response!

Some Facts  From the Report.

  • 2,864 complaints were received; the largest single category was ‘Access Rights’, with 977 complaints, or a little over 34% of the total.
  • 1,928 of the complaints were under the GDPR, and of these 868 had been concluded.
  • A total of 3,452 data breaches was recorded, the largest single category being ‘Unauthorised Disclosures’; 38 breaches related to 11 multinational technology companies.
  • Almost 31,000 contacts were made to the Information and Assessment Unit within the DPC.
  • 15 statutory inquiries (investigations) were opened in relation to the compliance of multinational companies with the GDPR.
  • 16 requests (formal and voluntary) for mutual assistance were received from other EU data protection authorities.
  • 31 own-volition inquiries were opened under the Data Protection Act 2018 into the surveillance of citizens by the state sector, for law enforcement purposes, through the use of technologies such as CCTV, body-worn cameras, automatic number plate recognition, drones and other technologies. These inquiries are conducted by the Special Investigation Unit. The same unit continued its work on the special investigation into the Public Services Card that we have featured on our website recently.
  • 950 general consultations were received, excluding consultations with multinational technology companies.
  • 900 data protection officer notifications were received.

In late 2018, the DPC established an advanced technology evaluation and assessment unit (the Technology Leadership Unit – TLU) with the objective of supporting and maximising the effectiveness of the DPC’s supervision and enforcement teams in assessing risks relating to the dynamics of complex systems and technology.

So it has been a busy and productive time for the office of the DPC, which even found time to speak at over 110 events, including conferences, seminars and presentations. Late last year the DPC commenced a significant project to develop a new five-year DPC regulatory strategy that will include extensive external consultation during 2019. It has to be remembered that the DPC received complaints under two substantive parallel legal frameworks during this period:

  • complaints and potential infringements that related to, or occurred before, 25 May 2018 must be handled by the DPC under the framework of the Data Protection Acts 1988 and 2003;
  • separately, complaints received by the DPC relating to the period from 25 May 2018 must be dealt with under the new EU legal framework of the GDPR and the Law Enforcement Directive, and the provisions of the Data Protection Act 2018, which give further effect to, or transpose, those laws into the law of Ireland as an EU Member State.

The DPC took an active part in the Global Privacy Enforcement Network (GPEN) 6th annual privacy sweep. Data protection authorities from around the world participated, and the theme in 2018 was privacy accountability. Accountability is a central element of the GDPR. It is a concept that “requires organisations to take necessary steps to implement applicable data protection rules and regulations, and to be able to demonstrate how these have been incorporated into their own internal privacy programs” (DPC Report 2018). In this latest sweep, GPEN aimed to assess how well organisations had implemented accountability in their own internal privacy programmes and policies. One goal was to establish a baseline of organisations’ compliance with data protection. This was the brief for the DPC, whose input was targeted at randomly selected organisations in Ireland: 30 organisations across a range of sectors completed a suite of pre-set questions relating to privacy accountability. Because the sweep was carried out in the last quarter of 2018, only preliminary or provisional results were available at the date of the report. Preliminary results include the following:

  • 86% of organisations have a contact listed for a DPO on their website
  • 75% appear to have adequate data breach policies in place
  • All organisations seem to have some kind of data protection training for staff. However, only 38% could provide evidence of training for all staff, including new entrants and refresher training
  • In most cases organisations appear to undertake data protection monitoring/self-assessment, but not to a sufficiently high level. In this category, 3 out of 29 scored ‘poor’, while 13 could only reach a ‘satisfactory’ level
  • One third of organisations were unable to show any documented processes in place to assess risks associated with new technology and products
  • 30% of organisations failed to show they had an adequate inventory of personal data, while close to 50% failed to keep a record of data flows

These again are preliminary, and the full results will be more instructive. It should be emphasised that 30 organisations represent a small sample size. Nevertheless, there seem to be large deficiencies in staff training and in data protection monitoring/self-assessment. Many issues will be more fully addressed in the coming months when the full results of the ‘sweep’ become available.




Public Services Card and Biometric Data.

In our last blog, February 15th, we looked at some arguments raised in the continuing debate surrounding the public services card. There are other aspects to the debate that we will consider now. Amongst these aspects are the special categories of data that are treated differently under the GDPR than more ‘ordinary’ categories. Any general data processing rules, applicable in the case of ordinary categories of personal data, change or become redundant if the data falls within the ‘special category’ definition. Two topical data protection issues dominate this blog: the prohibition, or otherwise, on biometric data processing; and whether the public services card photograph is within the definition of ‘biometric data’.
GDPR and Special Categories of Data.
Art. 9 GDPR delineates the categories of data that come under the special category umbrella. Their treatment under the GDPR differs from that of other categories because of the sensitive nature of the data. Biometric data processed “for the purpose of uniquely identifying a natural person” is included under Art. 9(1). Art. 9(1) sets out a prohibition on the processing of all the special categories of data, biometric data included. It is difficult to understand why this prohibition has caused so much confusion and erroneous interpretation. In order to avoid doubt as to the intent, practical application and effect of Art. 9, it is prudent to first examine it in its entirety.
Recently the following appeared on the RTE website: “Article 4 of the GDPR especially says facial images are biometric data, Article 9 of the GDPR specifically says it is illegal to process biometric data.” The reference to Art. 9 is not correct. In the first place, Art. 9 does not use the word ‘illegal’; secondly, although Art. 9 lays out a prohibition on the processing of special category data, which includes biometric data, it immediately sets out the exceptions to that general rule. There are many exceptions, ranging from Art. 9(2)(a) through to Art. 9(2)(j). The general rule is laid out first, and the exceptions to the rule follow. Reading the text fully helps to avoid broad misstatements of fact.
There are exceptions to the rule!


A Taste of the Exceptions to the General Rule.

• processing where the data subject has given ‘explicit consent’ to the processing (unless where Union or MS law provide that the prohibition may not be lifted);
• processing is necessary to protect the vital interests of the data subject or of another natural person where the data subject is physically or legally incapable of giving consent;
• processing relates to data which are manifestly made public;
• processing is “necessary for the purposes of carrying out the obligations and exercising specific rights of the controller or of the data subject in the field of employment and social security law, in so far as it is authorised by Union or MS law…providing for appropriate safeguards for fundamental rights…”; (So if the Dept. was processing biometric data in relation to the data subject’s PSC, this would be legitimate if provided for by law. Again, the prohibition is not a blanket prohibition, as the quote from the RTE website would suggest.)
• processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity;
• processing is necessary for reasons of substantial public interest, on the basis of Union or MS law (but is proportionate, respects rights and provides safeguards).

Another nibble at the exceptions to the rule.

There is also an exception for processing that is necessary for the purposes of preventive or occupational medicine, and where necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health. What all this shows is that there are numerous exceptions to the general rule. Section 73 DPA 2018 closely follows the GDPR on this, with s. 73(2) providing that regulations may be made permitting the processing of special categories of data for reasons of substantial public interest. This flows from the discretion allowed to Member States under Art. 9(4), which permits Member States to maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or data concerning health.

On this issue, therefore, there is only one conclusion: the GDPR does not set out a blanket prohibition on the processing of biometric data. It is a prohibition that is subject to, and qualified by, numerous exceptions. The prohibition on processing is lifted in the situations expressly stated under Art. 9(2).

Public Services Card and Biometric Data.

Biometric data is a recurring theme in the public services card debate. This debate centres around one particular feature of the card. It focuses on the photograph taken when applicants present themselves at designated offices to register for the card as part of the SAFE process. SAFE stands for Standard Authentication Framework Environment. It is a standard for establishing and verifying an individual’s identity for the purposes of accessing public services. Is this photograph biometric data? Many people take the view that this photograph is exactly that. The GDPR has laid out a position on this topic.

Art. 4(14) defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. Section 69, DPA 2018 shares this definition, except that it replaces ‘natural person’ with ‘individual’. There is a view that ordinary photographs do not constitute biometric data. All photographs of faces are facial images, but not all facial images are biometric data. This is not an exercise in semantics: there are technical differences that distinguish one from the other. The GDPR has attempted to clarify the distinction through the precise nature of its text. Accordingly, it is the text itself that is most instructive in this particular context.

Verification and Identification.

An obvious purpose for biometric data is the recognition of individuals, and this takes two forms: identification and verification. Identification is the less complicated of the two, and centres on comparing the data to that of numerous other individuals. Verification aims at matching the physical, physiological or behavioural characteristics to biometric data of a specific individual that have been stored in a database. Identification may be made with a high degree of probability. Identification answers the question, “Who are you?”, whereas verification answers the question, “Are you really who you say you are?” (See diagram and note in the appendix below.) Verification is made with almost 100% certainty.
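To make the two forms of recognition concrete, here is a minimal, purely illustrative Python sketch of a 1:1 verification check versus a 1:N identification search. The names, feature vectors and threshold are all invented for the example; real biometric systems derive their templates from specialised facial-recognition processing, not hand-written distance checks.

```python
import math

# Illustrative only: each "template" stands in for the feature vector
# produced by specific technical processing of a facial image.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, enrolled, threshold=0.5):
    """1:1 check: 'Are you really who you say you are?'"""
    return distance(probe, enrolled) <= threshold

def identify(probe, database, threshold=0.5):
    """1:N search: 'Who are you?' Compare against every enrolled record."""
    name, template = min(database.items(),
                         key=lambda kv: distance(probe, kv[1]))
    return name if distance(probe, template) <= threshold else None

# Hypothetical enrolled templates (names and vectors are invented).
db = {"alice": [0.1, 0.9, 0.3], "bob": [0.8, 0.2, 0.5]}
probe = [0.12, 0.88, 0.31]

print(verify(probe, db["alice"]))  # 1:1 verification -> True
print(identify(probe, db))         # 1:N identification -> alice
```

Note how identification must search the whole database while verification compares against a single stored template, which is why verification can be made with near certainty while identification is probabilistic.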

What the GDPR is clear about is this: only data concerning a person’s physical, physiological or behavioural characteristics that have gone through specific technical processing allowing the unique identification or verification of a natural person qualify as biometric data. The essence of the distinction centres on the word ‘unique’. There are no degrees of uniqueness: something is not more unique or less unique; it is either unique or not. Therefore, identifying something as unique sets it apart from all others. Could even a mother of identical triplets uniquely identify her three children individually from a photograph of all three together (presuming that no one has identifiable scars)? The GDPR had this in mind when including the word ‘unique’, because it is the specific processing carried out after a photograph is taken that enables ‘unique’ identification or verification.

A quite specific process has to be carried out on a facial image before it qualifies as biometric data. Special and varied aspects of a facial image can be assessed to aid the goal of unique verification. In the context of a facial image, the distances from nose to mouth, between the eyes, between eyes and nose, and from earlobe to earlobe are examples of, and variations on, the means to that end; unique verification is the end. On this analysis, it is difficult to perceive ordinary photos as biometric data. A photo is a facial image. On its own, and in isolation, a facial image is not biometric data. A facial image must result from “specific technical processing” (Art. 4(14)).
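The landmark distances just described can be pictured as a small numeric template derived from the image. The sketch below is purely illustrative and assumes hand-supplied, hypothetical pixel coordinates for a handful of landmarks; real facial-mapping systems detect far more landmarks automatically and compute far richer templates.

```python
import math

# Illustrative only: landmark names and coordinates are invented for
# the example; real pipelines locate landmarks with trained models.

def dist(p, q):
    """Straight-line distance between two (x, y) pixel coordinates."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def template_from_landmarks(lm):
    """Reduce a facial image's landmarks to a small numeric template.
    It is this derived template, not the photograph itself, that the
    'specific technical processing' of Art. 4(14) produces."""
    return [
        dist(lm["left_eye"], lm["right_eye"]),  # between the eyes
        dist(lm["nose"], lm["mouth"]),          # nose to mouth
        dist(lm["left_ear"], lm["right_ear"]),  # earlobe to earlobe
    ]

landmarks = {  # hypothetical pixel coordinates from one photograph
    "left_eye": (30, 40), "right_eye": (70, 40),
    "nose": (50, 60), "mouth": (50, 80),
    "left_ear": (10, 50), "right_ear": (90, 50),
}
print(template_from_landmarks(landmarks))  # -> [40.0, 20.0, 80.0]
```

The photograph stored on its own remains just a facial image; only once a template like this is extracted and used for unique identification or verification does the definition of biometric data bite.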

GDPR Recital 51 states, “…The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when (our emphasis) processed through a specific technical means allowing the unique identification or authentication of a natural person”.


Finally, the Department (Employment Affairs and Social Protection), in its guide to SAFE Registration and in answer to the question, “Does the Public Services Card store biometrics?”, states, “No. While the card does store a person’s photograph it does not store the biometric or arithmetic template of that photograph”. Nor does the Department use advanced facial-mapping cameras when taking the photos as part of the SAFE registration process.

APPENDIX 1 & 2 from biometric blog
