More Problems for Facebook


Facebook is no stranger to controversy, especially over the last year. In our most recent blog, on the first Annual Report of the new DPC, we pointed out the substantial number of data breaches reported by multinationals. Facebook was one of those multinationals, and the Facebook token breach became the subject of a statutory inquiry by the office of the DPC in September last year. Now, in the US, federal prosecutors are conducting an investigation into data deals Facebook struck with some of the world’s largest technology companies (NY Times, March 13, 2019).

Grand Jury Investigation.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices. Partnerships with Facebook gave these makers very broad access to the personal information of possibly hundreds of millions of Facebook users. This had been going on for years, and the arrangements allowed the makers, along with companies such as Microsoft, Apple, Sony and Amazon, to see users’ friends’ contact information and other data, most often without any consent. These agreements were previously reported in The New York Times (link to the original article here). Most of the partnerships have now been phased out, but while they were in operation they effectively gave the partner companies a blanket exemption from the usual privacy rules.

Hundreds of pages of Facebook documents were obtained by The New York Times. These records, generated as far back as 2017 by the company’s internal system for tracking partnerships, provided the most complete picture yet of the social network’s data-sharing practices. The exchange was intended to benefit everyone: Facebook got more users, boosting its advertising revenue, and partner companies acquired features that made their products more attractive. For example, the records show that Facebook allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, and gave Netflix and Spotify the ability to read Facebook users’ private messages. Facebook users connected with friends across different devices and websites, reaping benefits for Facebook, which had engineered extraordinary power over the personal data of more than 2.2 billion users. Prior to the GDPR, even in Europe, this power was exercised with a shameless lack of transparency and a dearth of substantive oversight.

Other Investigations.

The latest grand jury inquiry comes against the backdrop of the Cambridge Analytica scandal, in which the political consulting company improperly obtained the Facebook data of 87 million users and used it to build tools that helped Trump’s campaign in 2016. This is part of an ongoing investigation by the Justice Department’s securities fraud unit. All along, Facebook’s position was that it had been misled by Cambridge Analytica and had believed that the data were only being used for academic purposes. “In the furore that followed, Facebook’s leaders said that the kind of access exploited by Cambridge in 2014 was cut off by the next year, when Facebook prohibited developers from collecting information from users’ friends. But the company officials did not disclose that Facebook had exempted the makers of cell phones, tablets and other hardware from such restrictions” (NY Times, June 3, 2018). Nevertheless, some of the fine print on the quiz app that collected the data, which Facebook deleted back in 2015, was evidence that the company knew about the potential for the data to be used commercially.

Facebook’s Wheeling and Dealing.

The pervasive nature of some of the deals that Facebook initiated becomes clearer when, for example, the evidence shows that one deal empowered Microsoft’s Bing search engine to map out the friends of virtually all Facebook users without their explicit consent, and allowed Amazon to obtain users’ names and contact information through their friends. Apple was able to conceal from Facebook users any indicators that the company’s devices were even asking for data (NY Times, March 13, 2019; see link at top of blog). This demonstrates the covert level involved. An investigation that is still in progress gives an insight into the corporate psyche behind the business model that Facebook is proud to espouse. Facebook entered a data sharing consent agreement with the Federal Trade Commission in 2011, under which it was barred from sharing user data without explicit consent. However, the agreements Facebook concluded benefited more than 150 companies — most of them tech businesses, including online retailers and entertainment sites, but also automakers and media organizations. Their applications sought the data of hundreds of millions of people a month. The deals, the oldest of which date to 2010, were all active in 2017, and some were still in effect in late 2018 (NY Times, Dec. 18, 2018).

The Spin.

Facebook’s spin was that the companies it entered into agreements with were ‘extensions of itself’ and therefore not subject to the specific data sharing rules. After all, one can’t really share a secret with oneself! The service providers were just partners that allowed users to interact with their Facebook friends. Facebook dismissed the notion that it stood to gain substantially from the arrangements, despite admitting that it had not really policed the activities of its partners. Data privacy experts are rightly sceptical that a regulator as thorough as the Federal Trade Commission would view businesses as varied as device makers, retailers and search companies as so ‘alike’ as to be exempt from the regulation, even if that was Facebook’s opinion. Ashkan Soltani, former chief technologist at the Federal Trade Commission, saw it as nothing more than a ruse, stating, “The only common theme is that they are partnerships that would benefit the company (Facebook) in terms of development or growth into an area that they otherwise could not get access to”.


In summary, Facebook has trouble on quite a few fronts: the original Cambridge Analytica investigation has now led to Facebook being investigated by both the FBI and the securities fraud unit of the Justice Department; the Federal Trade Commission is close to finalising its investigation into possible violation of the consent agreement (multi-billion dollar fines are anticipated); the Justice Department and the Securities and Exchange Commission are investigating Facebook; and the U.S. Attorney’s Office for the Eastern District of New York is heading a criminal investigation. (Remember, at the moment we are not even talking about Europe and the GDPR!) The signs are ominous, and you can expect to hear more from us, and others, on Facebook’s problems in the near future.

On March 19, Rep. David Cicilline (D-RI), chair of the House Judiciary Committee’s antitrust subcommittee, called for the FTC to investigate Facebook on antitrust grounds.

Patrick Rowland,

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit




DPC Issues Annual Report

The DPC’s first annual report since the GDPR has just been released. It is not surprising to observers of developments in the data protection field that at the outset the report remarks, “it is the rise in the number of complaints and queries to data protection authorities across the EU since 25 May 2018 that demonstrates a new level of mobilisation to action on the part of individuals to tackle what they see as misuse or failure to adequately explain what is being done with their data” (DPC Report, 2018). It is fair to say that pre-GDPR there was a great deal of hype and alarm, and this amplified the closer it came to D-Day, May 25th, 2018. Things have changed somewhat since then and if, “we understand something about the GDPR, it is this: it will be a process of dialogue that lasts many years and the dialogue will need to shift and change with technology, context, learning from evidence (including emerging case law) and evolving societal norms” (DPC Report, 2018).

We spoke in an earlier blog, and we allude to it on this website, about some misinformation and disinformation that unfortunately increased the sense of alarm and panic pre-GDPR. After May 25th there was more. It seems the favourite GDPR myth within the office of the DPC is the hairdresser who cited GDPR as the reason she could not give a customer details of the hair dye being used in the customer’s own hair. By the way, the customer was leaving for another salon and wanted to be able to tell the new hairdresser what colour went in her hair, but we can be sure that this had nothing to do with the hairdresser’s response!

Some Facts From the Report.

  • 2,864 complaints were received; the largest single category was ‘Access Rights’, with 977 complaints, or a little over 34% of the total.
  • 1,928 of the complaints were made under the GDPR; of these, 868 had been concluded.
  • A total of 3,452 data breaches was recorded, the largest single category being ‘Unauthorised Disclosures’; 38 breaches related to 11 multinational technology companies.
  • Almost 31,000 contacts were made to the Information and Assessment unit within the DPC.
  • 15 statutory inquiries (investigations) were opened in relation to the compliance of multinational companies with GDPR.
  • 16 requests (formal and voluntary) for mutual assistance from other EU data protection authorities.
  • 31 own volition inquiries under the Data Protection Act 2018 into the surveillance of citizens by the state sector, for law enforcement purposes, through the use of technologies such as CCTV, body-worn cameras, automatic number plate recognition, drones and other technologies. These inquiries are conducted by the Special Investigation Unit. This same unit continued its work in relation to the special investigation into the Public Services Card that we have featured on our website recently.
  • 950 general consultations were received, excluding the consultations with multinational technology companies.
  •  900 data protection officer notifications.
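As a quick arithmetic check of the figures listed above: the access-rights share matches the report’s “a little over 34%”, and a completion rate for GDPR complaints can be derived from the same numbers (the derived rate is our own calculation, not a figure from the report).

```python
# Figures as quoted in the DPC Report 2018 (see list above).
total_complaints = 2864
access_rights = 977
print(round(access_rights / total_complaints * 100, 1))  # 34.1 -- "a little over 34%"

# Derived figure (our calculation): share of GDPR complaints concluded.
gdpr_complaints = 1928
concluded = 868
print(round(concluded / gdpr_complaints * 100, 1))  # 45.0
```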

In late 2018, the DPC established an advanced technology evaluation and assessment unit (the Technology Leadership Unit – TLU) with the objective of supporting and maximising the effectiveness of the DPC’s supervision and enforcement teams in assessing risks relating to the dynamics of complex systems and technology.

So it has been a busy and productive time for the office of the DPC, which even found time to speak at over 110 events, including conferences, seminars and presentations. Late last year the DPC commenced a significant project to develop a new five-year regulatory strategy that will include extensive external consultation during 2019. It has to be remembered that the DPC received complaints under two substantive parallel legal frameworks during this period:

  • complaints and potential infringements that related to, or occurred before, 25 May 2018 must be handled by the DPC under the framework of the Data Protection Acts 1988 and 2003;
  • separately, complaints received by the DPC relating to the period from 25 May 2018 must be dealt with under the new EU legal framework of the GDPR and the Law Enforcement Directive, together with the provisions of the Data Protection Act 2018, which give further effect to, or transpose, those laws into the law of Ireland as an EU Member State.

The DPC took an active part in the Global Privacy Enforcement Network (GPEN) 6th annual privacy sweep. Data protection authorities from around the world participated, and the theme in 2018 was privacy accountability. Accountability is a central element of the GDPR. It is a concept that “requires organisations to take necessary steps to implement applicable data protection rules and regulations, and to be able to demonstrate how these have been incorporated into their own internal privacy programs” (DPC Report 2018). In the latest sweep, GPEN aimed to assess how well organisations have implemented accountability in their own internal privacy programmes and policies, one goal being to establish a baseline of organisational compliance with data protection. This was the brief for the DPC, whose input was targeted at randomly selected organisations in Ireland: 30 organisations across a range of sectors completed a suite of pre-set questions relating to privacy accountability. Because the sweep was carried out in the last quarter of 2018, only preliminary or provisional results were available at the date of the report. They include the following:

  • 86% of organisations have a contact listed for a DPO on their website
  • 75% appear to have adequate data breach policies in place
  • All organisations seem to have some kind of data protection training for staff. However, only 38% could provide evidence of training for all staff, including new entrants and refresher training
  • In most cases organisations appear to undertake data protection monitoring/self-assessment, but not to a sufficiently high level. In this category, 3 out of 29 scored ‘poor’, while 13 could only reach a ‘satisfactory’ level
  • 1/3 of organisations were unable to show any documented processes in place to assess risks associated with new technology and products
  • 30% of organisations failed to show they had an adequate inventory of personal data, while close to 50% failed to keep a record of data flows

These again are preliminary figures, and the full results will be more instructive. It must be emphasised that 30 organisations is a small sample. Nevertheless, there seem to be large deficiencies in staff training and in data protection monitoring/self-assessment. Many issues will be more fully addressed in the coming months when the full results of the ‘sweep’ become available.




Public Services Card and Biometric Data.

In our last blog, of February 15th, we looked at some arguments raised in the continuing debate surrounding the public services card. There are other aspects to the debate that we will consider now, among them the special categories of data that are treated differently under the GDPR from more ‘ordinary’ categories. Any general data processing rules applicable to ordinary categories of personal data change or become redundant if the data fall within the ‘special category’ definition. Two topical data protection issues dominate this blog: the prohibition, or otherwise, on biometric data processing; and whether the public services card photograph falls within the definition of ‘biometric data’.
GDPR and Special Categories of Data.
Art. 9 GDPR delineates the categories of data that come under the special category umbrella. Their treatment under the GDPR differs from that of other categories because of the sensitive nature of the data. Biometric data processed “for the purpose of uniquely identifying a natural person” is included under Art. 9(1), which sets out a prohibition on the processing of all the special categories of data. It is difficult to understand why this prohibition has caused so much confusion and erroneous interpretation. In order to avoid doubt as to the intent, practical application and effect of Art. 9, it is prudent to first examine it in its entirety.
Recently the following appeared on the RTE website: “Article 4 of the GDPR especially says facial images are biometric data, Article 9 of the GDPR specifically says it is illegal to process biometric data.” The reference to Art. 9 is not correct. In the first place, Art. 9 does not use the word ‘illegal’; secondly, although it lays out a prohibition on processing special category data, including biometric data, it immediately sets out the exceptions to the general rule. There are many exceptions, ranging from Art. 9(2)(a) through to Art. 9(2)(j). The general rule is laid out first and the exceptions follow. Reading the text fully helps to avoid broad misstatements of fact.
There are exceptions to the rule!


A Taste of the Exceptions to the General Rule.

• processing where the data subject has given ‘explicit consent’ to the processing (unless where Union or MS law provide that the prohibition may not be lifted);
• processing is necessary to protect the vital interests of the data subject or of another natural person where the data subject is physically or legally incapable of giving consent;
• processing relates to data which are manifestly made public;
• processing is “necessary for the purposes of carrying out the obligations and exercising specific rights of the controller or of the data subject in the field of employment and social security law, in so far as it is authorised by Union or MS law…providing for appropriate safeguards for fundamental rights…”; (So if the Dept. were processing biometric data in relation to the data subject’s PSC, this would be legitimate if provided for by law. Again, the prohibition is not a blanket prohibition, as the quote from the RTE website would suggest.)
• processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity;
• processing is necessary for reasons of substantial public interest, on the basis of Union or MS law (but is proportionate, respects rights and provides safeguards).

Another nibble at the exceptions to the rule.

There is also an exception for processing that is necessary for the purposes of preventive or occupational medicine, and where necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health. What all this shows is that there are numerous exceptions to the general rule. Section 73 DPA 2018 closely follows the GDPR here, with S. 73(2) providing that regulations may be made permitting the processing of special categories of data for reasons of substantial public interest. This flows from the discretion allowed to Member States under Art. 9(4), which permits them to maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or data concerning health.

On this issue, therefore, there is only one conclusion: the GDPR does not set out a blanket prohibition on the processing of biometric data. It is a prohibition that is subject to, and qualified by, numerous exceptions. The prohibition on processing is waived in the situations expressly stated under Art. 9(2).

Public Services Card and Biometric Data.

Biometric data is a recurring theme in the public services card debate. This debate centres around one particular feature of the card. It focuses on the photograph taken when applicants present themselves at designated offices to register for the card as part of the SAFE process. SAFE stands for Standard Authentication Framework Environment. It is a standard for establishing and verifying an individual’s identity for the purposes of accessing public services. Is this photograph biometric data? Many people take the view that this photograph is exactly that. The GDPR has laid out a position on this topic.

Art. 4(14) defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. Section 69, DPA 2018 shares this definition, except that it replaces ‘natural person’ with ‘individual’. There is a view that ordinary photographs do not constitute biometric data: all photographs of faces are facial images, but not all facial images are biometric data. This is not an exercise in semantics; there are technical differences that distinguish one from the other. The GDPR has attempted to clarify the distinction through the precise nature of its text, and it is the text itself that is most instructive in this particular context.

Verification and Identification.

An obvious purpose of biometric data is the recognition of individuals, and this takes two forms: identification and verification. Verification is the less complicated of the two, as it matches the physical, physiological or behavioural characteristics against the biometric data of one specific individual stored in a database. Identification, by contrast, involves comparing the data to that of numerous other individuals. Identification answers the question, “Who are you?”, whereas verification answers the question, “Are you really who you say you are?”. Identification may be made with a high degree of probability; verification is made with almost 100% certainty.
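The two modes can be sketched in code. This is an illustrative sketch only: the feature vectors, the Euclidean distance measure and the acceptance threshold are our own assumptions for the example, not a description of any real biometric system.

```python
# Illustrative sketch of 1:1 verification vs 1:N identification.
# Templates, distance function and threshold are all hypothetical.

def distance(a, b):
    # Euclidean distance between two feature vectors ("templates").
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

THRESHOLD = 0.5  # assumed acceptance threshold

def verify(claimed_template, presented_template):
    """1:1 -- answers 'Are you really who you say you are?'"""
    return distance(claimed_template, presented_template) < THRESHOLD

def identify(presented_template, database):
    """1:N -- answers 'Who are you?' by comparing against everyone enrolled."""
    best_id, best_dist = None, float("inf")
    for person_id, template in database.items():
        d = distance(template, presented_template)
        if d < best_dist:
            best_id, best_dist = person_id, d
    return best_id if best_dist < THRESHOLD else None

db = {"alice": [0.1, 0.9, 0.4], "bob": [0.8, 0.2, 0.7]}
probe = [0.12, 0.88, 0.41]
print(verify(db["alice"], probe))  # True  (1:1 check against one person)
print(identify(probe, db))         # alice (1:N search over all enrolled)
```

Note that verification compares against a single stored template, while identification must search the whole database, which is why it is the harder problem and is made only with a high degree of probability rather than near-certainty.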

What the GDPR is clear about is this: only data concerning a person’s physical, physiological or behavioural characteristics that have gone through specific technical processing allowing the unique identification or verification of a natural person qualify as biometric data. The essence of the distinction centres on the word ‘unique’. There are no degrees of uniqueness: something is not more unique or less unique; it is either unique or not. Identifying something as unique therefore sets it apart from all others. Could even a mother of identical triplets uniquely identify her three children individually from a photograph of all three together (presuming that no one has identifiable scars)? The GDPR had this in mind when including the word ‘unique’, because it is the specific processing applied after a photograph is taken that enables ‘unique’ identification or verification.

A quite specific process has to be carried out before a facial image qualifies as biometric data. Special and varied aspects of a facial image can be measured in aid of the goal of unique verification: the distances from nose to mouth, between the eyes, from eyes to nose, and from earlobe to earlobe are examples of the measurements used as means to that end. On this analysis, it is difficult to perceive ordinary photos as biometric data. A photo is a facial image; on its own, and in isolation, a facial image is not biometric data. A facial image must result from “specific technical processing” (Art. 4(14)).
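To illustrate the kind of “specific technical processing” the text describes, inter-landmark distances could be reduced to a numeric template. The landmark names and coordinates below are invented for illustration and do not describe any actual facial-recognition product; the point is that it is this derived template, not the photograph itself, that constitutes biometric data.

```python
# Hypothetical example: turning facial landmarks (x, y points detected
# in a photograph) into a numeric template of inter-landmark distances.
import math

def dist(p, q):
    # Straight-line distance between two landmark points.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def make_template(landmarks):
    """Reduce landmark coordinates to the kinds of distances mentioned
    in the text (between the eyes, nose to mouth, earlobe to earlobe)."""
    return {
        "eye_to_eye": dist(landmarks["left_eye"], landmarks["right_eye"]),
        "nose_to_mouth": dist(landmarks["nose"], landmarks["mouth"]),
        "ear_to_ear": dist(landmarks["left_ear"], landmarks["right_ear"]),
    }

# Invented coordinates, as might be detected in a photograph:
face = {
    "left_eye": (30, 40), "right_eye": (70, 40),
    "nose": (50, 55), "mouth": (50, 75),
    "left_ear": (10, 50), "right_ear": (90, 50),
}
template = make_template(face)
print(template)  # the derived template, not the photo, is biometric data
```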

GDPR Recital 51 states, “…The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when (our emphasis) processed through a specific technical means allowing the unique identification or authentication of a natural person”.


Finally, the Department (Employment Affairs and Social Protection), in its guide to SAFE Registration, in answer to the question, “Does the Public Services Card store biometrics?”, states: “No. While the card does store a person’s photograph it does not store the biometric or arithmetic template of that photograph”. Nor are advanced facial-mapping cameras used when the photographs are taken as part of the SAFE registration process.


Public Services Card Debate Resumes.



Just when people thought the questions and concerns surrounding the Public Services Card (PSC) had been forgotten, the debate and mystery about this card resume. So what’s it all about?

Most of you will remember some controversy about this card at the time it was introduced, initially focused on one theory about its introduction: for many, it represented nothing more than the introduction of an identity card by stealth. The government vehemently denied this, and successive Ministers for Social Protection (Burton, Varadkar and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card, with no purpose other than to cut down on benefit fraud and streamline operations; everything would now work more cost-effectively, and taxpayer money would be saved.

Nevertheless, the ‘Big Brother is watching’ theory persisted. As time moved on, the card began to be scrutinised more, especially in the light of data protection legislation and amid concerns that were beginning to be raised by Digital Rights Ireland and others. Prior to the introduction of the GDPR, there was an increasing awareness of the changes in data protection that were just around the corner, and when the GDPR came into force it was clear that the PSC could be re-examined from a whole new perspective. Indeed, the GDPR facilitated a more robust questioning of the purposes and validity of the card’s introduction. We acknowledge that the enhanced powers under the GDPR do not apply to incidents that occurred before May 2018; our analysis is done through the lens of these changes precisely to highlight the strengthening of those powers. When other bodies, in most cases unconnected to the granting or withdrawal of social welfare or pensions, began to insist on the card being produced to access other services, the questioning intensified. (At one point, both the Passport Office and the National Driving Licence Service demanded the PSC.)


The Lawful Basis for the PSC.

Art. 6(1)(a)-(f) GDPR lays out in clear terms the lawful bases that must be established before processing personal data. In this context, the Government has repeatedly referred to the legislation that it relies on as a lawful basis. Section 247C of the Social Welfare Consolidation Act 2005, as inserted by Section 11 of the Social Welfare and Pensions (Miscellaneous Provisions) Act 2013, is most often cited by officials as the legislation underpinning the PSC. However, other legislation also stands in support of the PSC and its operation.

Legislative Support for the PSC.

The power to issue a PSC is given under S. 263(1) of the Social Welfare Consolidation Act 2005, as substituted by S. 9(1) of the Social Welfare and Pensions Act 2010. Among the important terms of S. 9 are references to the information inscribed on the card and to further information stored electronically on it. Section 263 of the 2005 Act sets out the finer details concerning the card, and expressly states that the Minister may request a person to present themselves at a specified place, provide certain documentation, have a photograph taken and provide a signature in electronic form. It also clarifies the type of information that will be stored on the card: the person’s date of birth, gender, primary account number, the expiry date of the card and the card service code electronically encoded on the card, together with any other information that may be prescribed, either inscribed or encoded on the card. More personal data can therefore be added to the card whenever the Minister sees fit.

Schedule 5 of the 2005 Act gives a list of ‘specified bodies’ that may use the PSC for the purposes of a transaction. One conclusion in this regard is that unless a body is a ‘specified body’ on this list, it cannot demand the PSC. All the information referenced is personal data within the meaning of Art. 4 GDPR and therefore requires a lawful basis prior to any processing operation.

So is there a lawful basis for the PSC?

An examination of the foregoing legislation, cited in support of the legality of the PSC, would support a lawful basis for processing under Art. 6(1)(c) GDPR. There is no doubting its lawful basis under Section 2D of the earlier Data Protection Acts. Had the GDPR been in force, another lawful basis could be found under Art. 6(1)(e), which refers to processing that is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. Here there is a relationship back to the Social Welfare Consolidation Act, which places the controller (the Dept.) under the legal obligation. One legitimate question is whether the processing is ‘necessary’ for the performance of the task or in the exercise of official authority. A public body cannot (generally) avail of the ‘legitimate interest’ lawful basis under Art. 6(1), so any ‘specified body’ under Schedule 5 will have to have a lawful basis other than this.

It is important to bear in mind that the legitimate interest basis cannot be used to override the interests or fundamental rights and freedoms of the data subject. Undoubtedly, these include rights under the European Convention on Human Rights and the EU Charter; in the context of the PSC, privacy rights and data protection rights are foremost among them. While the PSC may have a lawful basis, other criteria must be considered in assessing the validity of the PSC and its associated personal data processing. The introduction of the card would have satisfied the legal basis required under the old Data Protection Acts 1988 and 2003, and it would also have satisfied the GDPR, had it been in effect. Our purpose in examining it through the lens of the GDPR is to emphasise the lower data protection standards applicable pre-GDPR. Even under those lower standards, however, the PSC had no lawful basis under Section 2A of the Data Protection Acts 1988 and 2003 to process the personal data of individuals for any transactions with bodies other than DEASP.

Some Other Important Criteria.

Articles 7 and 8 of the European Union Charter of Fundamental Rights guarantee the right to respect for private life and the right to protection of personal data respectively. Any limitation that may be imposed on the exercise of these rights must under Art.52 (1) of the Charter:

(a) be provided for by law;

(b) be necessary to meet some objectives of general interest;

(c) be proportionate.

Where a less intrusive measure can be taken to achieve the same stated objective, then the less intrusive measure must be taken. This is also in line with the data minimisation principle. Whether the card is necessary to meet an objective, such as countering social welfare fraud, is certainly debatable at least. Is it proportionate to the aims, especially when viewed through the lens of individual rights?  Are there safeguards to defend these rights? Even if the PSC passes the tests it may still not be in compliance with the GDPR.

The PSC and Data Protection Principles under GDPR.

Even where the processing of personal data conforms to EU law, in the sense of the broader EU legal environment, and has a lawful basis that complies with Art. 6 GDPR, it still has to accord with the data protection principles under Art. 5 GDPR. Again, the relevant law pre-GDPR is contained in the Data Protection Acts 1988 and 2003. This is where the PSC is most likely to fail to comply with the GDPR. Art. 5(1) states that personal data “shall be processed lawfully, fairly and in a transparent manner in relation to the data subject”. The lawfulness element refers to EU law in general, not just data protection law. ‘Fairness’ is to be interpreted as ‘proportionality’ in the application of a measure: any measure must be appropriate for attaining the objective pursued and must not go beyond what is necessary to achieve it. It is the transparency element of Art. 5 that causes most problems for the PSC. The transparency element is best read in conjunction with Arts. 12, 13 and 14, which concern the information that has to be given to the data subject regarding the processing of the personal data.


There is no evidence of data subjects being given the required information at the point of data collection, as mandated by Art. 13. The government in 2017 published a 73-page ‘Complete Guide to SAFE Registration and the Public Services Card’. It contains valuable information, but very few applicants for a PSC are aware of its contents. The information under Art. 13(1) and (2) is information they should be made aware of, not information they should have to seek out. A report on the PSC was recently sent to the government and is not being disclosed at present. One likelihood is that the card failed, particularly on the transparency element. Anecdotal evidence suggests most people were just told to turn up at a certain date and time and that their photographs would be taken; no other information was offered.


Art. 5 (1) (b) is also likely to be problematic for the PSC. It states that data must be, “collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes”. Again, it directly relates back to transparency. People (i.e., the data subjects) must be made aware of the specific purposes of personal data processing. Combating welfare fraud seems somewhat unrelated to obtaining a driving licence.

Problems for the PSC are likely to surface under Art. 5 (1) (e).  Personal data shall be, “kept in a form which permits the identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed…”  So how long is necessary for the PSC?  It has been apparent for quite some time that the personal data on the PSC stays on the card and only a new photograph is required to renew it every seven years. When a card is issued to persons, it is likely that the card will be theirs for life, irrespective of any need to access services in the future. Again, any information regarding retention periods is not being conveyed to PSC applicants.

There is no doubting the failure of the PSC card regime to meet the current transparency standards of the GDPR; it would also fail the standards of the Data Protection Acts 1988 and 2003. The office of the DPC is completing a report on the PSC that pre-dates the GDPR, and that report will only address the law applicable at the time. Its conclusions would be very different had the GDPR been in effect when the report was commenced. Looking at the card in the light of the GDPR, as we have done, highlights inherent weaknesses and flaws when judged under the higher GDPR standards. Any complaints post-GDPR are now judged by these higher standards.

In the next blog, we will discuss the PSC from the ‘Special Categories’ perspective and focus on the Biometric data dimension.

Patrick Rowland,

Data protection consultants, GDPRXpert, are based in Carlow/Kilkenny and Mayo, offering a nationwide service.

Visit for more information.




