Covid-19 pandemic creates difficulties for many.

The Covid-19 pandemic has created difficulties for many, especially employees and employers. Many business owners have been unable to continue paying their employees, which has resulted in widespread lay-offs. For employees, quite apart from anxiety over their own health, and despite mortgage moratoriums and similar supports, this has created financial difficulties. For employers, and especially SMEs, the pandemic has the potential to deal a death blow to a business that took years to build up.

As noted in a previous blog, set against this backdrop, data protection concerns seem trivial. Nevertheless, just as fundamental rights and freedoms cannot be trampled on in a health crisis, neither can data protection rights. Indeed, because more sensitive categories of personal data are now being processed (health data, particularly), more care should be taken to ensure that data protection rights under the GDPR are being respected and enforced.

There must be at least one legal basis to process personal data, and all of the principles must be abided by. What is often forgotten is that even where derogations from the GDPR apply, the principles must still be respected and applied in any personal data processing operation. While the rules should be obeyed even under extreme circumstances, these same data protection rules (such as the GDPR) do not hinder measures taken in the fight against the coronavirus pandemic. It is conceivable that in times of emergency such as now some data protection rules may be relaxed, but it is unlikely they will ever be suspended or waived. Still, GDPRXpert has received many questions from clients unsure of aspects of the GDPR, especially in the specific context of this pandemic. At this time, we will take a look at some of the most common questions we have been asked.

Question 1.

I have many of my employees working from home at least temporarily. Are there any special precautions employers need to take in relation to personal data?


Many people work from home, but clearly these numbers have increased since the pandemic. The first thing that those working from home must do from the outset is create the mindset that they are still working in the office. Remember, it is not feasible for employers to go and assess the suitability, or otherwise, of all ‘work from home locations’ (WFHL), so some basic and normal ground rules need to be emphasised. Employees must secure their data just as they would in the office. To do this they must take the normal precautions, just as if they were still working at the place of employment. It is paramount that they do not allow family members, or anyone else, to simply walk in to where they have set themselves up. For example, they should never leave personal data on view on a computer screen. Data protection consultants GDPRXpert frequently remind clients that it is often the small oversight or lack of attention that leads to data being compromised. Employees should log off when leaving their workstation, or lock an area if too many people are coming and going. Working from a laptop on a couch is not a good idea if you are sharing an apartment or house with others! There should be strict controls on the ability to download personal data from an organisation’s system files.

If no relevant data protection policies are in place, now is the opportune time to enact some to govern how company assets and information can be accessed, where information can be stored, and how information can be transmitted. Employees must be quickly made aware of, and become competent about, the types of information considered to be confidential, trade secret, or otherwise protected. There is much anecdotal evidence of an upsurge in phishing attacks.

In the US there has been a huge rise in fraud schemes related to Covid-19, with many businesses receiving fake emails purportedly from the Centers for Disease Control and Prevention (CDC). These emails contain malicious attachments, so employees at WFHL need to be extra vigilant. In all cases the fraudsters are attempting to have their targets access and verify personal information or credentials. Employers must train their employees on how to detect and handle such scams and keep them informed about the latest threats. It is a good idea to have regular video conferencing with staff to facilitate Q&A sessions and update everyone on the latest threats. It also helps staff morale.

Only those whose essential job duties place them in the ‘need to know’ employee classification should have access to ‘special category data’, which includes health data. It is best practice to carefully review any Bring Your Own Device (BYOD) agreements in place between employer and employees. In this scenario, and where special category data are being processed, it is vital that all information is encrypted in transit and while at rest on the device. For example, many in the healthcare field are now working remotely and collecting health data. In the absence of special arrangements, these remote employees should be using company-issued equipment and not saving company data to personal laptops, flash drives, or personal cloud storage services such as Google Drive.
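The ‘need to know’ classification mentioned above can be illustrated with a short sketch. This is a hypothetical example of our own (the role names and the data catalogue are invented for illustration, not drawn from any standard), showing how access to special category data might be gated by role:

```python
# Minimal sketch of a 'need to know' access check for special category data.
# Role names and data types below are hypothetical illustrations only.

SPECIAL_CATEGORY = {"health_record", "sick_note", "test_result"}

# Hypothetical mapping of roles to the data types their duties require.
NEED_TO_KNOW = {
    "occupational_health": {"health_record", "sick_note", "test_result"},
    "hr_manager": {"sick_note"},
    "line_manager": set(),  # no special category access by default
}

def may_access(role: str, data_type: str) -> bool:
    """Allow access to special category data only on a need-to-know basis."""
    if data_type not in SPECIAL_CATEGORY:
        return True  # ordinary personal data: governed by other controls
    return data_type in NEED_TO_KNOW.get(role, set())

assert may_access("occupational_health", "test_result")
assert not may_access("line_manager", "health_record")
```

In a real deployment this logic would sit inside the organisation’s identity and access management system, and the role-to-data mapping would mirror the documented access policy.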

It is true to say that the risks for the employer are numerous, so all care should be taken in relation to BYOD agreements. Employers should ensure that BYOD practices do not compromise the security of, or their right of access to, company information and data, and that their policies comply with all attendant legal obligations. In the conventional office setting it is easy to have a quick word in an employee’s ear if an employer becomes aware of any breach of, or indiscretion concerning, a BYOD agreement. It is more complicated when employees are working remotely. Best and safest practice is for employers to consider periodic reminders of the BYOD policy and to offer training sessions, as well as ongoing education on the importance of protecting the employer’s trade secrets, confidential and proprietary information and data.


“There is no questioning the advantages of BYOD agreements. It is a growing trend, one that may already be occurring at your company. Employers are implementing policies and practices that permit, or even require, their employees to use their personal electronic devices (e.g., laptops and smart phones) and data services (e.g., backup and file-sharing software) for work-related purposes. The appeal of such Bring-Your-Own-Device (BYOD) practices for both employers and employees is undeniable. Employers avoid the up-front costs and administrative hassle of purchasing laptops and smart phones as well as employees’ demands for the latest and greatest gadgets, and employees do not have to carry around multiple devices. Overall, this is a much simpler and more efficient way of doing business, right?” (Elaine Harwell, Senior Counsel, Procopio). There are security considerations nevertheless, and here are some aspects that demand careful attention.


Your BYOD policy should cover a broad range of topics, including:

  • Which employees are permitted to use personal devices for work purposes;
  • Acceptable and unacceptable use of personal devices for work purposes;
  • Your ownership of and right of access to all employer data on employees’ personal devices and employees’ lack of privacy rights in that data;
  • Your security and data protection protocols;
  • Your employees’ obligations with respect to maintaining the security of employer data (e.g., a provision requiring employees to protect all devices that contain employer data with a password or PIN);
  • A disclaimer that the employer is not responsible for the security of the employee’s personal data;
  • Reimbursement for the employee’s use of his or her personal devices; and
  • Rules and/or restrictions regarding work-related use of personal devices outside of working hours.

Question 2.

Can an employer let employees know the identity of a co-worker who has contracted Covid-19?


We know that personal data includes an identifier such as a name. Processing includes, inter alia, “…disclosure by transmission, dissemination or otherwise making available…” Therefore, sharing the name of an employee who has contracted Covid-19 constitutes personal data processing. ‘Data concerning health’ under Art.4(15) GDPR includes any personal data related to the physical or mental health of a natural person which reveal information about his or her health status. In this instance we have an employee’s name, which is ‘ordinary’ personal data, and data concerning health, which falls under ‘special category data’ under Art.9 GDPR. Processing rules vary depending on the categorisation of the data involved. The legal bases for processing also differ, again depending on the category of the data.

In line with the confidentiality principle, the general rule is that the identity of an affected employee should not be disclosed to his or her colleagues or any other third parties without some legal basis or very strong justification. Experience tells us that the smaller the business, the more easily the identity of the co-worker will become known. Even in larger companies a person’s absence will be noticed and lead to unhelpful speculation, much of it on social media, as to who exactly has the virus. This speculation would be upsetting for those wrongly identified as having Covid-19. It is usually not necessary, and often will not serve a legitimate purpose, to disclose the identity of an employee with Covid-19. Employers are under a legal obligation to ensure the health and safety of employees (Safety, Health and Welfare at Work Act 2005). Informing employees of an infectious disease in the workplace would be a statutory duty (and also a common law duty with an attached duty of care). Indeed, employers should carry out a risk assessment to identify the risks of a coronavirus outbreak at work, and implement steps to minimise that risk. That said, even in the absence of obligations under health and safety legislation, it would be expected that employees would be informed of any case of Covid-19 in a work setting so that staff could self-isolate or work from home.

Any information disclosed should always be limited to the minimum necessary for a specific purpose. Someone’s identity, normally and generally, should be disclosed only where absolutely necessary and on a strict need-to-know basis. As is evident from a notice by the DPC, the key word may be ‘generally’: “Any data that is processed must be treated in a confidential manner i.e. any communications to staff about the possible presence of coronavirus in the workplace should not generally identify any individual employees.” The DPC also states that “the identity of affected individuals should not be disclosed to any third parties or to their colleagues without a clear justification.” We note it does not state ‘without a clear legal basis under GDPR’. There is a world of difference between the two. Any test of what is a ‘clear justification’ either does not exist or is a subjective one. Who decides what a ‘clear justification’ is? Does a justification have to be set within a legal basis? The ultimate arbiter on this is the CJEU. It is a facile exercise to set out a justification for an action, rather than to ground it on a legal basis.

From a practical perspective, to allay fears amongst employees wondering how close their contact was with the infected employee, a common-sense approach would be to ascertain whether the infected employee would consent to his or her identity being made known to co-workers, with the aim of more effectively safeguarding those co-workers. For example, if a worker in a very large manufacturing plant became infected, it would cause undue stress to many employees if no other information was forthcoming from the employer. Employees will worry and wonder about how close they were to the infected individual. Yet if an employer is too specific about the area of the plant where the infected employee worked, it may be tantamount to naming the individual. The circumstances and details of any particular case will determine the nature and quality of the dilemma facing the employer.


There is no avoiding the reality that not knowing exactly who in your place of employment has contracted Covid-19 will cause undue stress to that person’s co-workers. As noted many times, data protection rights under the GDPR, and data protection and privacy rights under the Charter and the European Convention on Human Rights respectively, involve a balancing exercise with other rights. In cases like the present one, the unprecedented circumstances suggest to us that a common-sense approach is an option many will consider. It is an approach that carries some risk. In normal circumstances a person’s identity should not be disclosed, but in very extreme situations, such as the present one, a justifiable case could be made for releasing it.

This action is still fraught with danger, and if an employee files a complaint it will be up to the DPC at first instance to give a decision. An employer’s justification for releasing the identity of the coronavirus victim may not withstand scrutiny by the DPC. The best advice is not to release a person’s identity unless you have obtained explicit written consent from the employee. Where explicit consent is not forthcoming, our advice would be to state that a co-worker, who cannot be named at this time, has contracted Covid-19. How much more information is conveyed to co-workers depends on the particular, and possibly unique, circumstances of the individual situation.

There will be cases where, for example, an employer will conclude that the health and safety of all employees is best served by disclosing the identity of the employee with Covid-19. In such a situation, and because of the statutory duty on the employer by virtue of health and safety legislation, there is at least an arguable case. Remember, although set in a different work context, ‘the indications of impending harm to health arising from stress at work must be plain enough for any reasonable employer to realise he/she should do something about it’. (Hatton v Sutherland [2002] 2 All E.R. 1)

Ultimately, the roadblock may be formed by the twin concepts of ‘necessity’ and ‘proportionality’ that permeate the GDPR and EU law. Views on the issue are by no means unanimous across the EU. A recent guidance note from the European Data Protection Board says employers should inform staff that colleagues may be infected, but should only reveal their names if national law allows it; if they can justify that such a step is necessary; and only after the affected workers have been informed/consulted beforehand. Earlier we saw the slightly differing view in the DPC guidance. The UK ICO also takes a slightly different view: “You should keep staff informed about cases in your organisation. Remember, you probably don’t need to name individuals and you shouldn’t provide more information than necessary. You have an obligation to ensure the health and safety of your employees, as well as a duty of care. Data protection doesn’t prevent you doing this.” The common thread is that the identity of affected individuals must not be disclosed to their colleagues or third parties without a clear justification.

The Appropriate Lawful Bases.

The HSE and other public health authorities will seek details concerning any Covid-19 case in any context. Certain information is always needed so that authorities can effectively carry out their functions. Covid-19 was recently declared a ‘notifiable’ infectious disease, and medical doctors are mandated to report cases to the Medical Officer under the Infectious Diseases (Amendment) Regulations 2020. There is no equivalent legislation covering employers. Strangely, employers are not mandated to report infectious diseases to the Health and Safety Authority. Employees under the 2005 Act are mandated to report to their employer, or the employer’s nominated registered medical practitioner, if they become aware of any disease affecting their performance of work activities that could give rise to risks for the health, safety and welfare of others at work. A clear duty is imposed on all employees to protect themselves and others. Employers, for their part, are under a legal obligation under the 2005 Act to protect employees from issues that negatively affect their health and safety. Clearly, this could easily be construed to include the novel coronavirus, and it could act as a lawful basis for processing personal data.

Processing could also be justified under Art.6(1)(d) as ‘necessary in order to protect the vital interests’ of the individual data subject (the employee) or of other persons (other employees or other people). An employer could also find a legal basis for processing the personal data under Art.6(1)(f) GDPR, where “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party…” Where an employer relies on this legal basis, he or she should document the ‘legitimate interests assessment’ that has been made.
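Where Art.6(1)(f) is relied upon, the assessment should be recorded in a retrievable form. Below is a minimal sketch of what such a record might capture; the field names and sample entries are our own illustration, not a prescribed format:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LegitimateInterestsAssessment:
    """Illustrative record of an Art.6(1)(f) balancing exercise."""
    purpose: str               # what the processing is for
    legitimate_interest: str   # the interest pursued by the controller
    necessity: str             # why a less intrusive means will not suffice
    balancing_outcome: str     # why the interest is not overridden by the
                               # data subject's rights and freedoms
    safeguards: list = field(default_factory=list)
    assessed_on: date = field(default_factory=date.today)

# Hypothetical worked example for the workplace Covid-19 scenario.
lia = LegitimateInterestsAssessment(
    purpose="Inform staff of a confirmed Covid-19 case in the workplace",
    legitimate_interest="Protecting the health and safety of employees",
    necessity="Staff cannot take precautions unless told of the risk",
    balancing_outcome="Identity withheld; only the fact of a case disclosed",
    safeguards=["no names disclosed", "access to record limited to HR"],
)
```

Keeping the assessment as structured data, rather than an ad hoc note, makes it easier to produce on request and to review when circumstances change.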

In certain cases the person’s identity will be needed; for example, authorities may need to interview the employee who has contracted the disease. Recital 46 GDPR states that “some types of processing may serve both important grounds of public interest (lawful under Art.6(1)(e)) and the vital interests of the data subject (Art.6(1)(d)), as for instance where processing is necessary for humanitarian purposes, including for monitoring epidemics and their spread…” Where the employer shares information, the sharing should be in compliance with the GDPR and, most especially, with the principles. In many cases employees may fully consent to having their identities made known, or may make them known themselves; in those cases the personal data will have been ‘manifestly made public’.

It is questionable whether the consent of an employee to the processing of his/her own personal data would constitute valid consent. The point has not been definitively settled in the context of the employer/employee relationship, but Recital 43 makes it clear that consent is not a valid legal ground “where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority…” GDPRXpert has not found any case law to support the view that an employer/employee relationship would satisfy the ‘clear imbalance’ test. Undoubtedly, the average employee could feel pressurised into giving consent. It is something that will fall for future decision on a case-by-case basis. What is noteworthy is that a reference to a clear imbalance in the context of an employment relationship, which had been included in an earlier draft of the GDPR, was deleted from the enacted regulation.

Health Data Processing

Where data concerning health are involved, the situation changes. As we know, there is a general prohibition on the processing of ‘special category’ data, which includes data concerning health. There are a number of exceptions to this broad prohibition, including under Art.9(2) GDPR and sections of the DPA 2018. These provide potential legal bases for processing health data for the purposes of Covid-19 containment. S.46 DPA 2018 and Art.9(2)(b) permit the processing of health data where necessary and proportionate for the purposes of exercising or performing any right or obligation under Irish employment law; employers are legally obliged to ensure the safety, health and welfare at work of their employees. Specific measures to safeguard the fundamental rights and interests of the data subject (employee) must be taken.

Perhaps the most appropriate legal basis for processing health data is found under Art.9(2)(i) GDPR and s.53 DPA 2018, both of which provide exceptions to the general rule. Here the processing is deemed necessary for reasons of public interest in the area of public health, such as protecting against cross-border threats to health. Both must be underpinned by law (EU or Member State) providing suitable and specific measures to safeguard the rights and freedoms of the data subject (employee). Examples of suitable safeguards would be limitations on access to the data, strict time limits for erasure, and other measures such as adequate staff training to protect the data protection rights of individuals.
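The ‘strict time limits for erasure’ safeguard can be sketched as a simple retention check. The 28-day period below is a hypothetical figure chosen purely for illustration; any real limit must come from the organisation’s documented retention policy:

```python
from datetime import date, timedelta

# Hypothetical retention period for Covid-19 related health data.
# A real figure must come from the organisation's retention policy.
RETENTION = timedelta(days=28)

def due_for_erasure(collected_on: date, today: date) -> bool:
    """True once the retention period for a health-data record has elapsed."""
    return today - collected_on > RETENTION

assert due_for_erasure(date(2020, 3, 1), date(2020, 4, 15))
assert not due_for_erasure(date(2020, 4, 10), date(2020, 4, 15))
```

In practice such a check would run as a scheduled job that flags or deletes expired records, with the deletion itself logged for accountability purposes.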

S.52 DPA 2018 and Art.9(2)(h) GDPR also offer a sound legal basis as both provide, inter alia, for processing for the purposes of preventative or occupational medicine, and for assessment of the working capacity of an employee. Necessity and proportionality are always underlying considerations.

Question 3.

Can employers ask employees, and visitors coming to the workplace, for travel and medical information?


As we noted earlier, employers are under a legal obligation to protect the health of their employees and to maintain a safe place of work (Safety, Health and Welfare at Work Act 2005). There would therefore be justification for employers asking employees and visitors about recent travel as part of their efforts to prevent or contain the spread of Covid-19 in the workplace. This would especially be so where they are worried about possible travel to Covid-19 hotspots. In this regard, employers would be justified in asking employees and visitors to inform them if they have visited an affected area and/or are experiencing symptoms. If travel has taken place as part of an employee’s duties, then those details are already known to the employer. The question then becomes one of asking about personal travel destinations and the presence of any Covid-19 symptoms.

In Ireland the DPC has given recommendations on Covid-19, and these support the view that it is reasonable to ask an employee such questions. Implementation of more stringent requirements, such as a questionnaire, would have to have a strong justification based on necessity and proportionality and on an assessment of risk. It is advisable to be sensible when asking employees to provide personal information about their likelihood of risk, and not to ask for more than you genuinely need.

Out of the 28 national data protection authorities of European Union member states, some 20 have issued specific guidance regarding Covid-19 and data protection so far. Several core principles are beginning to emerge from this guidance:

  1. Covid-19 related sensitive personal data, such as medical symptoms and diagnosis, travel history, and contacts with those who have been diagnosed, can be processed on the basis of safeguarding public health.
  2. The fact that an employee has tested positive for Covid-19 can be disclosed, but identifying information about the individual, in particular the individual’s name, should not be disclosed.
  3. European DPAs have scrutinised, and in some cases discouraged or prohibited, mass surveillance techniques by data controllers, such as the use of questionnaires or temperature checks, other than those performed by health authorities.
  4. Security measures must still be implemented to protect Covid-19 related personal data.

What the foregoing shows is that some issues around data protection in the context of the Covid-19 pandemic are complicated. The pandemic has brought forth evidence of how interpretations of some articles of the GDPR vary between jurisdictions. Member states have been given some latitude in making changes and additions to the GDPR, but Covid-19 has exposed a lack of consistency in interpretation of portions of the GDPR across the EU. This is something we will look at closely in the future as the pandemic continues its lethal spread across the globe.

Patrick Rowland,

We are GDPR and Data Protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit

The ongoing Covid-19 pandemic raises life and death questions.

The ongoing Covid-19 pandemic raises life and death questions all over the globe; people are getting sick and many are dying. Set against that, data protection concerns appear trivial and insignificant. As the DPC has stated, “data protection law does not stand in the way of the provision of healthcare and the management of public health issues; nevertheless there are important considerations which should be taken into account when handling personal data in these contexts, particularly health and other sensitive data”. Even so, some of the questions raised will remain contentious issues long after Covid-19 has been clinically controlled.

Identified and Identifiable

There was much debate and controversy surrounding the lack of specific geographic details provided by health officials in relation to confirmed cases of Covid-19. One view was that the GDPR was being cited as a reason not to provide more precise details, as this could lead to someone’s identity being disclosed. Remember that under Art.4(1), “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name…location data…or to one or more factors specific to the physical…cultural or social identity of that natural person;”. It is well established that non-disclosure of a person’s name, in and of itself, is not always a guarantee of anonymity.
Even when special category data processing is allowed by derogation, it does not mean there is any derogation from the applicability of data protection principles. On the contrary, these are always applicable. In fact, it is especially important to abide by all data protection principles in relation to these more ‘sensitive categories’ of data, to use the term from the old Data Protection Acts. Unquestionably, a person’s identity could quickly become public knowledge if a very precise geographical location was provided by the public health authorities. It would not be long before people would be able, by a process of elimination and observation, (hopefully, not surveillance!) to identify people in a particular area that were in isolation because they had tested positive for the virus or had been in recent contact with someone who had tested positive.

Location data

Although the health authorities are doing their best to ensure a person does not become identifiable, the possibility of this happening increases directly as the information disclosed becomes more detailed. Take the case of the unnamed school that was closed because students had recently returned from a school trip to Northern Italy. Health authorities consistently refused to name the school, despite the fact it had been immediately identified on social media. Their policy of non-disclosure was rendered meaningless.

The truth is that this type of information becomes public knowledge very quickly. In the midst of a pandemic people are understandably more inquisitive, and it is quite possible that anyone who was on the school trip to Italy and contracted the virus would be identified in short order. What is clear so far is that the health authorities are determined to keep information to a minimum so that precise geographical locations are not revealed. This is why we have been hearing about a case ‘in the South’ or ‘in the East’, but no towns or cities had, at least initially, been named. Some politicians would prefer more specifics on the locations of so-called clusters of infection.
Different views can be taken of this policy approach. One view is that naming the location precisely might, in combination with other information available to local residents, make an individual or individuals readily identifiable. This could cause panic to people in the immediate region and distress to patients and their families. Another view is that if the precise location were given, residents in proximity to that area might be on higher alert, leading to greater caution in their personal social interactions. The policy has been defended on other grounds by Dr. Tony Holohan: it is seen as designed to protect the privacy of individuals, on the basis that people are less likely to come forward if they fear their identity will be made known. That would be another hurdle in the race to quantify and track the extent of the pandemic.

For the public interest/of public interest 

All views have their merits, but each carries an underlying interpretation of what is ‘for the public interest’. Undoubtedly, the question is a subjective one, and in instances such as a public health emergency caused by a pandemic, what constitutes the ‘public interest’ is properly evaluated by the health authorities and the government. Within this context, Art.9(2)(h), Art.9(2)(i) and s.53 DPA 2018 contain the specific exceptions to the general prohibition on processing of special categories of personal data, which include health data. Derogations from the general prohibition are allowed, but subject to suitable safeguards having been put in place to protect the fundamental rights and freedoms of data subjects. Such safeguards may include limitations on access to the data, strict time limits for erasure, and other measures such as adequate staff training to protect the data protection rights of individuals.

There are many lawful bases for processing personal data. Consent is one of them, but it is by no means the strongest: it can be withdrawn at any time. Indeed, the GDPR provides legal grounds enabling competent public health authorities (and employers) to process personal data in the context of epidemics without the need to obtain the consent of the data subject. This applies, for instance, when the processing of personal data is necessary for employers for reasons of public interest in the area of public health, to protect vital interests (Arts. 6 and 9 GDPR), or to comply with another legal obligation.

A valid distinction needs to be made at the outset between what is “for the public interest” and what is “of interest to the public”. It was, I am fairly certain, Minister Simon Harris who was recently criticised for making a distinction between the two, but he was correct in his assessment. What he was trying to explain was that just because some information is ‘of interest to the public’ does not mean its disclosure is made legitimate or justifiable by a motive of public interest. Would disclosing the information do more harm than good? This has elements of the harm principle of the utilitarian philosophy espoused especially by J.S. Mill and Jeremy Bentham: in essence, the lesser harm for the greater good. The courts and many statutes frequently refer to the public interest, but “there is no single satisfactory definition of what the public interest is” (see Kelleher, Privacy and Data Protection Law in Ireland, 2nd ed., at p.175). It might be more incisive to simply ask what is in the best interests of the public at large.

In the context of a Freedom of Information case in an Australian Federal Court, Justice Brian Tamberlin wrote the following:
The public interest is not one homogenous undivided concept. It will often be multi-faceted and the decision-maker will have to consider and evaluate the relative weight of these facets before reaching a final conclusion as to where the public interest resides. This ultimate evaluation of the public interest will involve a determination of what are the relevant facets of the public interest that are competing and the comparative importance that ought to be given to them so that “the public interest” can be ascertained and served. In some circumstances, one or more considerations will be of such overriding significance that they will prevail over all others. In other circumstances, the competing considerations will be more finely balanced so that the outcome is not so clearly predictable. For example, in some contexts, interests such as public health, national security, anti-terrorism, defence or international obligations may be of overriding significance when compared with other considerations. (McKinnon v Secretary, Dept. of Treasury [2005] FCAFC)

The term eludes precise definition, but at its core is a concern with the welfare or well-being of the general public and society. Data protection law and the GDPR often have to be balanced against other rights, such as freedom of expression. Today, with government actions on Covid-19, we are seeing how the public interest motive in the area of public health far outweighs personal rights and freedoms. What is, or indeed what is not, in the public interest often depends on the context in which it is being examined.

Mr Justice Barrett in Dublin Waterworld v National Sports Campus Development Authority [2014] IEHC 518 (7 Nov 2014) stated, “disputes are likely to be of interest to the public but that does not make their resolution a matter of public interest”. S.53 DPA 2018 uses the terms “for public interest reasons in the area of public health including…”. The terminology of Art. 9(2)(i) is similar and refers to “processing necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health…”
Processing of special categories of personal data, including health data, is clearly permissible under the GDPR and the DPA 2018. “Processing” under the GDPR includes, amongst others, “dissemination”, but this does not mean it is permissible to freely share the information with the general public. Dissemination, as a form of processing, must itself follow the data protection principles and respect, amongst others, the principle of purpose limitation.

If the personal data is initially collected (processed) for the public interest in the area of health, is the dissemination (for example, through the coronavirus daily briefings) in line with this original purpose? It is likely that the answer is yes. The best-informed view is that the dissemination just represents another type or form of processing and the purpose remains the same. In any event, Art. 6(4) GDPR allows for a ‘compatibility (of purpose) test’ in situations where the processing is for a purpose other than that for which the data have been collected and is not based on consent or Union or Member State law. The concept of “public interest” at general law is wide-ranging and expansive. A classic dictum is that of Lord Hailsham that “the categories of public interest are not closed” (D v National Society for the Prevention of Cruelty to Children [1978] AC 171 at 230).

There are… several different features and facets of interest which form the public interest. On the other hand, in the daily affairs of the community events occur which attract public attention. Such events of interest to the public may or may not be ones which are for the benefit of the community; it follows that such form of interest per se is not a facet of the public interest (DPP v Smith [1991] 1 VR 63 at 75).

The public interest is not the same as that which may be of interest to the public. We have seen in many previous blogs how data protection rights do not exist in isolation, nor do they trump other rights. At any time the Government can decide to be more forthcoming and more specific with information concerning Covid-19. The deciding factor will be whether it is in the public interest to do so. If that time ever comes, the government will still be mindful of the obligation to protect the anonymity of any individual who may have contracted the infection.
In an upcoming blog we will share common data protection concerns in the context of the coronavirus that have been raised by many of our clients through our website.

Patrick Rowland,
We are GDPR and data protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.
For more details visit

GDPR Hasn’t Gone Away.

The GDPR hasn’t gone away. In fact, the truth is that it is really just getting started, as regulators, though not all of them, become more assured in their own compliance policies and strategies.

In previous blog posts we looked at the first annual report from the DPC since the GDPR was introduced in May 2018. Following on from that, we evaluated the effectiveness of the GDPR roughly 15 months after its inception. The GDPR has once again been the subject of debate recently, this time from the perspective of enforcement. Most notably, there has been harsh criticism of the Irish DPC because of a perceived reticence to impose fines. Whether this is justified is examined below.

Some Quick Stats

The most up to date information on the application of the GDPR throughout the EU/EEA provides the following:

  • More than 6,700 data breaches were notified to Ireland’s Data Protection Commission (DPC) last year, the second highest level of notifications recorded per capita across Europe.
  • Since its implementation in May 2018, the General Data Protection Regulation (GDPR) led to over 160,000 data breach notifications across Europe, according to research from multinational law firm DLA Piper.
  • From this total of 160,000, about 100,000 were reported in 2019. A recent report by DLA Piper showed the Netherlands topped the table with 40,647 data breach notifications, a ratio of 147.2 per 100,000 inhabitants.
  • Ireland had a ratio of 132.52 per 100,000, ranking second in the table, followed by Denmark.
  • European regulators have imposed €114 million in fines (for data breaches) under the GDPR regime to date, with a further €329 million in sanctions threatened. (See ‘Ireland ranked second in Europe for data breach notifications’.)
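The per-capita rankings above are simple arithmetic: notifications divided by population, scaled to 100,000 inhabitants. A minimal sketch (the figures used below are illustrative placeholders, not the DLA Piper data):

```python
def notifications_per_100k(notifications: int, population: int) -> float:
    """Breach notifications per 100,000 inhabitants, rounded to 2 places."""
    return round(notifications / population * 100_000, 2)

# Illustrative placeholder figures only: 6,700 notifications
# for a country of 4.9 million people.
print(notifications_per_100k(6_700, 4_900_000))  # 136.73
```

This is why a small country can rank high per capita while reporting far fewer breaches in absolute terms than a larger neighbour.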



Of most interest to data protection professionals is the type and amount of fines that have been issued to date. In this context it is enlightening to remember that the Irish DPC is the lead regulator for many companies such as Google, Twitter, Facebook, Microsoft, and others. This is in part due to the ‘one-stop-shop’ mechanism introduced under the GDPR. Based on the figures for data breach notifications shown above, it would be expected that the Irish DPC would have issued numerous fines by now.

New figures compiled by the Italian data protection body Osservatorio di Federprivacy – which include data from official sources in 30 countries – show authorities in the EU/EEA imposed 190 fines in 2019. Italy was the most active data protection authority, with 30 actions last year, while the UK was the most punitive, with fines totalling €312 million, some 76 per cent of all sanctions issued. Among the companies facing fines are British Airways and Marriott, which are looking at bills totalling £183 million (€214.8 million) and £99 million respectively after being sanctioned by the UK’s Information Commissioner’s Office last year.

Only Ireland and Italy failed to impose any fines. On its face, a failure to impose fines is disconcerting and raises questions about the practical operation of the GDPR. One of the partners at DLA Piper who specialises in cyber security and data protection suggested fines have been low relative to “potential maximum fines” of €20 million ($22.2 million) or 4% of annual global turnover, “indicating that we are still in the early days of enforcement. We expect to see momentum build with more multi-million euro fines being imposed over the coming year as regulators ramp up their enforcement activity.”

More on Fines

While Ireland’s DPC has failed to fine anyone, the French regulator has seen fit to fine Google €50 million for failing to comply with GDPR obligations. Indeed, the French top the rankings for the level of fines imposed (€51 million), followed by the Germans (€24.5 million) and the Austrians (€18 million). There is no questioning the ability of the DPC to issue fines, but some are beginning to question its willingness to issue them. In particular, the Italian regulator has taken the opportunity to level some criticism at the perceived lack of action by the DPC in Ireland. That regulator has tabulated figures, which include data from official sources in 30 countries, showing authorities in the EU/EEA imposed 190 fines in 2019.

Italy itself was the most active data protection authority, with 30 actions last year, even though it was one of the lowest in terms of breach notification numbers. The UK was identified as the most punitive with fines totalling €312million representing 76% of all sanctions meted out. Federprivacy chairman Nicola Bernardi said the failure of the Irish Data Protection Commission to issue fines thus far is a concern given the large number of leading tech companies based here. He expressed concerns that technology companies may be treated with more leniency in Ireland than in other jurisdictions and called for greater consistency to be applied across the EU for dealing with sanctions.

So is the criticism justified?

The Irish DPC has 61 statutory enquiries under way, 21 of which are focused on tech multi-national firms. These include Facebook (8), Twitter (3), Apple (3), Google (1) and LinkedIn (1). (See ‘Data Breaches in Ireland among highest in EU’, Adrian Weckler, Irish Independent, Jan. 20, 2020.) Informed sources have said the Data Protection Commission is in the final stages of its investigation into WhatsApp over possible breaches of EU data privacy rules, with a draft decision expected to be circulated to other authorities to consider within weeks. This is the first of the commission’s many investigations to approach its end point, with delays blamed on complications that arise from pursuing companies that operate cross-border. Verdicts are expected in the Twitter and WhatsApp cases very soon, according to DPC officials. Helen Dixon has distanced herself from any speculation on the amount of fines that may be imposed, while stating that the recent fine of $5 billion levied on Facebook in the U.S. by the FTC is unlikely to be repeated here.

In defence.


What is clear to informed data protection professionals such as GDPRXpert is that there are extenuating circumstances that explain the non-imposition of fines to date by the DPC. Undoubtedly, a major contributory factor has been the volume and complexity of current investigations. Both factors have combined to delay final verdicts, and until a final verdict is rendered there can be no announcement of any fine. So any criticism must take account of the quantity, the nature, and the attendant quality of investigations that are still incomplete. As noted earlier, the cross-border nature of many of the investigations adds to the complexity. These particular investigations just take time, and every investigation has to be placed within its own particular context. Going back to the stats on breach notification, we saw that Denmark placed third in the table for breach notifications. This needs to be viewed in a manner detached from its apparent face value.


Many of the breach notifications are related to sending the information of one data subject to the wrong recipient, often in an otherwise secure manner, so the majority of breaches are not severe. It is all too easy to make general assumptions from bare statistics or numbers. Context is crucial to a true understanding. Commenting on the country’s top-three position in the GDPR index, Allan Frank, an ICT security specialist at Datatilsynet, Denmark’s data protection regulator, said: “We don’t see Denmark as more prone to cyber-attack.” Instead, Frank said, the country’s public and private sectors were accustomed to “reporting to public authorities in different matters” – including data breaches – through a single web portal.

Earlier in the blog we saw that France had imposed the highest amount in fines (almost entirely from the Google fine), yet had a very low ranking for the number of breach notifications per capita. There is no direct relationship between breach notifications and the imposition of fines. It has more to do with the nature and particular type of breach. There is no automatic fine for merely communicating a breach; what is more salient is whether there was an outright infringement of the Regulation that caused the data breach.


“The investigation of cross-border issues is highly complex and takes time to complete, highlighted by the fact that there have been very few decisions with fines issued under the GDPR in relation to cross-border investigations across all 28 EU supervisory authorities since the application of the GDPR in May 2018,” said deputy commissioner Graham Doyle. In principle, regulators can impose fines of 2% or, in some cases, 4% of global turnover. In practice, they will have to judge whether such a heavy penalty would stand up in court, said DLA Piper partner Ross McKean. “It’s going to take time – the regulators are going to be wary about going to 4% because they are going to get appealed,” McKean told Reuters. “And you lose credibility as a regulator if you’re blown up on appeal.” Therefore, it seems logical and represents good practice on the part of the DPC to complete the full investigative process before any discussion in relation to fines is broached.

What we are likely to witness in the future is fines being assessed more quickly in light of the severity of the failure to comply with obligations under the GDPR. Data breach notifications are often the beginning of the fine process. The GDPR was aiming at this from the outset, and it is reflected in the framework of the Regulation. For example, Art. 83 sets out the appropriate maximum for fines, based on the nature of the infringement, establishing a sort of hierarchy of infringements. The overriding factor is that fines be ‘effective, proportionate and dissuasive’.

Art. 83(2)(a)–(k) lays out factors and conditions to be considered when making the assessment on the need for, or the appropriate amount of, a fine, if one is to be imposed. These are categorically delineated and leave few questions. Any fine can be imposed instead of, or in addition to, the corrective measures referred to in points (a) to (h) and (j) of Art. 58(2). Art. 83(5) lays down the upper limits of fines for infringement of certain provisions of the Regulation. Any non-compliance with an order of the DPC can also be subject to a maximum fine of €20 million or, in the case of an undertaking, 4% of total worldwide annual turnover in the preceding year, whichever is greater.
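The two-tier ceiling in Art. 83 – €10 million or 2% of worldwide annual turnover for lower-tier infringements under Art. 83(4), €20 million or 4% for upper-tier infringements under Art. 83(5), in each case whichever is greater – comes down to one comparison. A simple illustrative sketch (the function name is ours, and this is not legal advice):

```python
def gdpr_fine_cap(worldwide_turnover_eur: float, upper_tier: bool = True) -> float:
    """Maximum administrative fine ceiling under Art. 83 GDPR.

    upper_tier=True  -> Art. 83(5) infringements: €20m or 4% of turnover
    upper_tier=False -> Art. 83(4) infringements: €10m or 2% of turnover
    The applicable cap is whichever amount is greater.
    """
    fixed_cap, pct = (20_000_000, 0.04) if upper_tier else (10_000_000, 0.02)
    return max(fixed_cap, pct * worldwide_turnover_eur)

# For a small undertaking the fixed ceiling dominates;
# for a large multinational the percentage ceiling dominates.
print(gdpr_fine_cap(50_000_000))           # €20m cap (4% would be only €2m)
print(gdpr_fine_cap(1_000_000_000_000))    # 4% of €1tn = €40bn cap
```

This is why the same Regulation can expose an SME to a ceiling of €20 million while exposing a tech giant to a ceiling in the billions.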

In relation to the cases before the DPC currently, it is only proper and prudent to leave no stone unturned in any investigation, especially bearing in mind the substantial quantum of fines for which undertakings in particular may be liable. The WhatsApp investigation noted earlier is the first of the commission’s many investigations to approach its end point. (See Charlie Taylor, Irish Times, 20 Jan. 2020, ‘Ireland ranked second in Europe for data breach notifications’.)

It seems to GDPRXpert that the DPC is in a kind of ‘no win’ situation. Had the DPC left the ‘big fish’ until later and gone after the ‘smaller fish’, (smaller companies and SMEs) criticism would have been relentless from vested interests. A popular view would have held the DPC lacked the will to challenge Google, Facebook, Apple etc. Yet the DPC was not going on fishing expeditions with the investigations that commenced. There were valid reasons, many stemming from data breach notifications.

Nevertheless, there is a view that holds that the DPC should have had a mixture of investigations in the early days of the GDPR. This would have sent out the message that GDPR compliance is expected from all, not just the ‘big boys’. There is validity to that view, with strong anecdotal evidence suggesting smaller businesses, in particular, have not been giving the GDPR the attention it demands. Many feel the DPC is busy elsewhere. That may be true for now. What may be lost in all of this is that, if a business comes to the attention of the DPC through a data breach, that business will be expected to show what exactly they have done since May 2018! There will be no excuses for a failure to be in compliance so long past the introduction of the GDPR.

Patrick Rowland,

We are GDPR and Data Protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit


Public Accounts Committee’s Request for Information and GDPR

Last year the Public Accounts Committee sent a request for information to the Dept. of Finance in relation to fees charged to that department by barristers.
In a previous blog, data protection consultants GDPRXpert discussed examples of how the GDPR was used as an excuse for not supplying information, in situations where supplying the information was perfectly legitimate. Some examples showed how ill-informed people were, while others belonged at the farcical and ludicrous end of the spectrum. What we are examining today lies at the more nuanced end. Legitimate positions can be taken by both sides but to repeat what we have stated previously, the GDPR does not exist in isolation. Rather, it is about balancing rights and proportionality. Remember the removal of the visitor books from the heritage sites? If you wish to refresh your memory on this go to this GDPRXpert blog.


The Public Accounts Committee

The Committee of Public Accounts (PAC) is a standing committee of Dáil Éireann which focuses on ensuring public services are run efficiently and achieve value for money. It acts as a public spending watchdog and, by virtue of this role, has become one of the most powerful Oireachtas committees. It has a key role in ensuring that there is accountability and transparency in the way government agencies allocate, spend and manage their finances, and in guaranteeing that the taxpayer receives value for money. The PAC is responsible for examining and reporting on reports of the Comptroller and Auditor General on departmental expenditure and certain other accounts. It also considers the Comptroller and Auditor General’s reports on his or her examinations of economy, efficiency and the effectiveness of evaluation systems, procedures and practices.

Despite a recent adverse court decision relating to its questioning of former Rehab Ireland CEO Angela Kerins, the committee can rightly claim to do an excellent oversight job on behalf of the Irish taxpayer. Our view is clear: that particular episode was caused by some overzealous committee members and an overzealous chairman. ‘Over the top’ is the most appropriate colloquialism to describe the treatment of Ms Kerins. Delivering the judgment of the entire court, the Chief Justice stated that the actions of the PAC as a whole were such that they condoned the ‘significant departure’ by at least three members of PAC from the terms of its invitation to Ms Kerins to appear before it. (See Irish Times, 29 May 2019, “Supreme Court says PAC treated Angela Kerins in ‘unlawful’ manner”.) The most consistent criticism stemmed from the manner in which PAC acted outside its remit and terms of reference.

Data protection consultants GDPRXpert were impressed by the committee when it had Helen Dixon and some of her staff before it at a hearing in September of last year (2019); we are making that link available here. At present, the committee has an excellent chairperson in Sean Fleming, and well-briefed, committed members.

Apple is happy to appeal

The Apple Money

There was much criticism from public representatives, the media and the general public when the Government decided to appeal the decision in the Apple case. Indeed, Fintan O’Toole described it as a disastrous miscalculation. The European Commission had found that Ireland had provided €13 billion to Apple, which in the opinion of the Commission represented illegal state aid under EU competition law. The Commission said Apple’s tax arrangements in Ireland gave it ‘a significant advantage over other businesses that are subject to the same national taxation rules’, violating EU state aid laws. Although the government had indicated back in 2016 its intention to appeal the decision, it was still compelled to collect the money owed. Over €14 billion (principal plus interest) was placed in an escrow account by Apple until the appeal process is concluded. At the end of last year, the government confirmed that over €7 million had been spent on legal fees, consultancy fees, and other related costs.

Money, money, money.


Bearing in mind the role of the PAC described earlier, it was to be expected that the committee would have questions about the use of public money in the context of this appeal. Legal fees formed the bulk of the costs associated with the appeal to date, and the appeal process is still not exhausted. There is a possibility that, depending upon the result from the lower General Court, the case could yet end up before the CJEU and drag on for a few more years. The knowledge that this possibility was real may have augmented the desire of the PAC for some further information on the value-for-money aspect of the legal fees. The Dept. of Finance was responsible for the payment of the legal and other costs associated with the appeal.

The GDPR Perspective

Prior to the introduction of the GDPR there never seemed to be an impediment to the release of legal fees charged by legal teams involved in, for example, the various tribunals over the years. Legal firms were named and their charges were public knowledge (thanks to the terms of reference and /or the FOI Act). A PAC report from January 2011 details how legal fees can reach exorbitant levels and the vast amounts paid to individual legal professionals. Again, there is no surprise and nothing unexpected or unusual in the PAC requesting the information on barrister charges in relation to the Apple appeal.

What is surprising is the response of the Dept. of Finance to this request for information.
A response from the Dept. briefly outlined its reason for non-compliance with the request for information. In essence, the Dept. cited the GDPR as the justification for not acceding to the request. The rationale seems very simplistic and dogmatic:
  • The information is personal data under the GDPR;
  • We have a lawful basis to process personal data but in this case, our advice is not to share the data;
  • The individual right to privacy trumps any right the PAC may have to access the data; and
  • That’s our story and we’re sticking to it!

Individual’s Right to Privacy v Public Interest


Some possible solutions

Names of tax defaulters are published by the Revenue Commissioners, who have a clear legal basis for this under the Tax Consolidation Acts. Despite being underpinned by legislation, it still represents an interference with privacy rights. Crucially, it is not disproportionate and is done in the public interest. It is arguable that this is much more invasive than a barrister’s fees being disclosed to the PAC. Any barrister doing legal work for government departments would expect that their fees could be reviewed by civil servants and others at some point in the future.
There are no confidentiality agreements regarding fees for legal work done for the State. Legal privilege is one thing. Legal confidentiality over fees charged is a whole other thing. Transparency and accountability are overriding factors when it comes to assessing taxpayer value for money spent.

The practice of disclosing the names of barristers, along with the fees paid to them by Government departments and public bodies, is a long-standing one, and the refusal to disclose similar information represents an unannounced change of practice. Citing the GDPR as the reason for this change is unjustified. The GDPR does not preclude information on any barrister’s fees being disclosed to the PAC.

…or Public Interest Please

The routes available to the PAC

Art. 6(1)(f) GDPR provides an appropriate legal basis for the PAC to process the personal data concerned, i.e. the names of, and fees charged by, individual barristers. It states, “processing is necessary for the purposes of the legitimate interests pursued by a controller or a third party except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject…” Here is a valid basis for the Dept. of Finance to furnish the details. The PAC is not a “public authority” for the purposes of the GDPR or the DPA 2018, and so the strict limitations on the use of the “legitimate interests” basis do not apply. (See Recital 47, GDPR.)
Under s.60 of the Data Protection Act 2018, restrictions are set on the obligations of data controllers and the rights of data subjects for “important objectives of general public interest”. The rights and obligations referred to are those under Arts. 12–22 and Art. 34 GDPR. S.60(3)(c) DPA 2018 continues with restrictions where the personal data are kept “by the C&AG for the performance of his or her official functions”.


Bearing in mind the role of the C&AG (The C&AG’s mission is to provide independent assurance that public funds and resources are used in accordance with the law, managed to good effect and properly accounted for and to contribute to improvement in public administration) it is proper that the information the PAC is seeking would be available without question to the C&AG from the Dept of Finance. It is certain that the C&AG would look favourably on any request from the PAC for the details of the legal charges they are seeking. There would be a clear understanding by the C&AG of the legitimacy of the request from the PAC. Unlike the action of the Dept of Finance, there would be no hiding behind the GDPR.

If complications and confrontations continue in relation to requests by the PAC for information that contains personal data, there is a longer-term measure that could be utilised. This would involve amending the Data Sharing and Governance Act of 2019. A most appropriate amendment is one that includes the PAC within the definition of “public body”. Personal data from other public bodies could then be shared with the PAC. Appropriate restrictions could be placed on the categories of data to be shared. Data sharing within the amended act would be such that is necessary and proportionate to facilitate the proper functioning of the PAC in “ensuring public services are run efficiently and achieve value for money”.
However, it should never have to come to this, and it would not if departments such as the Dept. of Finance looked at the request in light of the public interest and of the work the PAC does in the public interest. The PAC places transparency and accountability foremost in its quest to ensure public money spent achieves value for money.

In a letter to the PAC, Deputy Commissioner Dara Walsh reiterated a view shared by many within the data protection community. This view is that the privacy interests of individual barristers do not trump or override the public interest in seeing how State money was being spent. “Barristers could have no expectation that the legal fees expended by the DPC as a public body would not be subject to parliamentary and public scrutiny,” he concluded. Furnishing the details of fees to the PAC may also serve to show there is or there is no impropriety involved. Simply put: barrister A is not getting all the work.

Somewhat ironically, Graham Doyle, deputy data protection commissioner, said the DPC was also recently before the PAC and was asked about similar payments to third-party organisations and individual service providers, such as barristers. Not only did it provide the information on the companies, but it also gave a detailed breakdown on individual barristers, and this was after the introduction of the GDPR. The commonsense answer suggested by the PAC, and supported by the DPC, is that people tendering for such work be made aware their payments will be publicly disclosed.
P.S. Considering that a general election has just been announced, we will repost a previous blog on the GDPR and elections. It is important that candidates and voters are aware of rights and responsibilities, at a time when personal data are being quickly processed.
Patrick Rowland,

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit



The DPC is not infallible.

The DPC is not infallible, and so it is wise to remember that data controllers have legal rights.

There is no doubt that much time has been spent in the media and on this forum debating aspects of the Public Services Card. Data protection consultants GDPRXpert first reported on this in a blog back on 15 Feb 2019. We rightly predicted the main conclusions resulting from the recent investigation by the office of the DPC into the legitimacy of the Public Services Card, and highlighted some of the concerns that the DPC was likely to focus on in the continuing contentious debate.

At that time many feared the PSC represented the introduction of a national identity card by stealth. GDPRXpert wrote at the time that “The government vehemently denied this, and different Ministers for Social Protection (Burton, Varadkar, and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card with no other purposes than to cut down on benefit fraud and streamline operations. Everything now should work more cost-effectively and taxpayer money would be saved.” There is still little impediment standing in the way of its use as a de facto national identity card (See Adrian Weckler, “National ID Card Isn’t Dead” SINDO, Aug.18, 2019).

There was a follow-up on the PSC and biometric data on 21 Feb. On 22 Aug. data protection consultants GDPRXpert discussed the DPC findings from the investigation of the PSC. A report was issued and recommendations were made to the Govt.

Three central issues were to the fore in the report:

The lack of lawful bases for processing personal data, apart from processing by the DEASP;

Lack of transparency (in terms of what personal data it processes in the context of SAFE 2/PSC, for example, and how that data is updated and shared with other public sector bodies for the purposes of decision-making); and

Retention of data beyond what is necessary. (In particular, the retention of supporting documentation that was demanded in support of an application was excessive.)

Data protection consultants GDPRXpert have the DEASP link to the report available now.

Minister Regina Doherty: ‘We don’t agree with any of the eight findings and we have written to the commission to confirm that.’


In total, the DPC made eight adverse findings in relation to the card’s introduction and operation, and the government disagrees with each of them, according to Minister Doherty. When publishing the findings of her report, Dixon said the Department had 21 days to provide an update on how it was implementing the finding that it was no longer lawful to require a PSC for services other than welfare. By Sept 5 the 21 days had expired.

Minister for Employment Affairs and Social Protection Regina Doherty has said her department will not comply with any of the directions from the Data Protection Commissioner (DPC) on its Public Services Card project. “We won’t be complying with any of the instructions with regard to the findings or the instructions in the letter,” the Minister told RTÉ Radio.

The Government believes that it would be potentially unlawful to withdraw or modify the PSC. A statement confirmed that its intention is to continue to operate the PSC and the SAFE 2 identity authentication process on which it is based. Despite the controversies, the PSC remains popular, with 96 percent of those surveyed saying they were either very satisfied or fairly satisfied with the process (Irish Times, Sept. 17, 2019).

The reactions to the Government ‘daring’ to challenge the findings of the DPC have been surprising. Most data protection consultants would agree that the PSC has a lawful basis, but only in relation to its use for welfare-related services through the DEASP. We have previously highlighted as unlawful any demand for the card in relation to other services unrelated to the DEASP, such as passport or driving licence applications. Agree or disagree, the Government, just like a private citizen, has the right to appeal findings or a decision of the DPC. To deny or question this is to deny or question a basic tenet of the rule of law: access to justice and judicial review. The GDPR will always be interpreted in light of the EU Charter of Fundamental Rights, and in this instance Art. 47 is the most applicable. The independence of the DPC does not mean it “cannot be subject to control or monitoring mechanisms…or to judicial review” (Recital 118, GDPR).

Some of the groups foremost in the criticism are groups whose mission embodies supporting the rule of law, e.g. the Irish Council for Civil Liberties (ICCL). Yet the ICCL has been opposed to the PSC from the start, and its opposition has been based more on ideology than on law. “This card unfairly targets economically marginalised people who depend on the State for their welfare payments. It also works in a gendered way, being a requirement for mothers collecting child benefit. Though the DPC report did not focus on these issues, ICCL believes that the structural inequality inherent in the card may well render it illegal”. (See the ICCL website.)

The DPC did not focus on ‘these issues’ for good reason: they are completely tangential. Opposition to the proposed body cameras to be used by the Gardaí has also been voiced by the council. Again, this seems more ideologically driven than legally focused. In a recent poll, over 90% of respondents had no privacy or data protection concerns about the use of body cameras by the Gardaí.

Here come the legal bills!


The DPC has never claimed to be infallible. Previous cases, such as the Shatter case and the original Schrems case, prove it is not. Indeed, no court has claimed to be infallible either. A superior court overturning a lower court decision is not out of the ordinary. It is simplistic to say the lower court ‘got it wrong’ (though courts do ‘err in law’). In the majority of cases, there is at least some substantive legal validity in differing court opinions. Higher courts may overrule lower courts, but when appeals are exhausted it comes down to the decision of that final court. Ideally, the final decision is one that meets the highest threshold of justice and equity. Justice must be done and be seen to be done.

In the context of the Government appealing the findings of the DPC, there may have been a rush to comment. At this stage, the DPC has not yet made an Enforcement Order. The chief civil servant in charge of the controversial Public Services Card project has said that his department would not be challenging findings of illegality against the card unless it was “absolutely sure that a challenge was not only appropriate but necessary”. Appearing before the Dáil Public Accounts Committee – ostensibly to discuss his department’s most recently published accounts – secretary-general of the Department of Employment Affairs and Social Protection John McKeon wouldn’t be drawn on whether or not that challenge would serve to “undermine” the office of the Data Protection Commissioner, which operates as an independent state regulator.

Again, just because the DPC operates as an independent state regulator does not mean its decisions are above legal challenge by the Government. We can question the basis of any appeal if and when it arises, but we cannot question the right to appeal itself. Graham Doyle, the DPC’s Head of Communications, said that the Commission had declined the Department of Employment Affairs and Social Protection’s (DEASP) request for a meeting and planned to proceed with enforcement action. “I can confirm that we have this evening responded to the Department and have declined their request for a meeting.”

We await a decision on any enforcement action to be taken by the DPC. In an upcoming blog, we will look at the architecture of enforcement actions under the GDPR and Data Protection Act 2018, with an in-depth look at the appeal processes available.

Patrick Rowland, GDPRXpert.

We are GDPR and Data Protection Consultants, with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit

Long Awaited Ruling on The Right to be Forgotten.


Expert data protection consultants, GDPRXpert, examine the recent Google Right to be Forgotten ruling (Case C-507/17).

The case stemmed from a request for a preliminary ruling by the French Conseil d’État, in proceedings between Google and the French data protection regulator. (Request for a preliminary ruling from the Conseil d’État (France) lodged on 21 August 2017 — Google Inc v Commission nationale de l’informatique et des libertés (CNIL))


The implications of the decision in the recent ‘Right To Be Forgotten’ case are likely to be far-reaching and controversial. Before these implications can be grasped, or a sober and objective assessment made, some knowledge of the context and background is necessary. What EU legislation, and in particular the GDPR, sets out about the right will act as an additional tool in assessing the rationality of the conclusions reached in the case. In the light of those conclusions, where does the Right To Be Forgotten (RTBF) now stand? A more insightful question is where should the right now stand? Not everyone will agree on this. Some views may mirror sentiments surrounding the GDPR itself that qualified data protection consultants, such as GDPRXpert, have commented on previously.

Background and Context to the Case.

It has long been recognised that the RTBF exists under EU law. This has been evident since the 1995 Data Protection Directive (‘The Directive’) and from previous case law. More recently, Art. 17 GDPR has set it out clearly. What is also established is that the right is a qualified right, not an absolute one. A normal consequence is that the right must be balanced against other rights competing in the same sphere. The Court of Justice of the European Union (CJEU), in the seminal 2014 case widely referenced as Google Spain, held that Google was a data controller in its processing of personal data relating to the operation of a search engine.



Google Spain Case C-131/12 (13 May 2014).

In Google Spain a lawyer (the applicant) was objecting to the fact that anyone who searched his name on the Google search engine would obtain links to an article in a newspaper. That article reported the details of a court attachment order against the applicant for the recovery of social security debts. What is noteworthy is that the case pre-dates the GDPR. It was a case that initially fell for consideration within the ambit of ‘The Directive’, and specifically Articles 12(b) and 14(a). Mr. González, the lawyer applicant, was seeking to enforce his right of objection. He felt that the material reported in the newspaper article was creating negative publicity and reflected badly on him in his professional capacity. Some events reported in the article concerning Mr. González had taken place 16 years previously.

Google had no control over the material in the newspaper report, yet it was determining the purposes and means of the indexing. Anything that showed up when the applicant’s name was entered in the search box was the result of Google’s indexing. Material on third-party websites is not controlled by Google. In this case, the information on Mr. González is still available in the newspaper publication and can be accessed without the help of Google. Nevertheless, Google was ordered by the Court to comply with the request for erasure.

Data protection rights v Freedom of expression and information

The Court held that where a person’s name was used in the search, the search engine is obliged to remove from the list of results any links to web pages published by third parties and containing information concerning that person. This stands even when the publication of the information on those pages is lawful. On the facts of the case, the Court held that individuals may request search engines to remove links to inadequate, irrelevant or excessive content relating to them online. In this particular case, the interference with a person’s right to data protection could not be justified merely by the economic interest of the search engine.

After Google Spain

Defining the exact parameters and contours of the judgment has stoked uncertainty and fostered controversy for years. As soon as the ruling was announced, Google introduced new internal procedures. These procedures were to facilitate the changes the ruling demanded and to enable it to assess requests for erasure. Every request had to be assessed on its own merits, applying the criteria mentioned in EU law and the European Court’s judgment. These criteria relate to the accuracy, adequacy, relevance (including time passed) and proportionality of the links, in relation to the purposes of the data processing (paragraph 93 of the ruling).

Where is that information?


Following a successful request, the principal new procedure, known as ‘geo-blocking’, comes to the fore. Geo-blocking, as the word suggests, operates to block access to the information from a searcher’s domain (more on this later). After the Google Spain case and up to late 2018, Google had received over 700,000 requests for erasure. Over 40% of these were categorised as well-founded, and consequently the related search results were de-listed. One prerequisite is that the search is based on the person’s name. Other searches, not based on the person’s name, can still lead to the information in the third-party link, or the link can be accessed directly. A person would have to put in a request with the data controller for the third-party website in order to secure erasure of personal data on that website. We emphasise again the nature of the right: qualified and limited.
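The mechanics of name-based de-listing can be illustrated with a small, purely hypothetical sketch (the country codes, the de-listing register and the function below are illustrative assumptions, not Google’s actual implementation): a name-based query from an EU country has the de-listed links filtered out of the results, while the same query from elsewhere, or a query not based on the person’s name, leaves them visible.

```python
# Hypothetical sketch of geo-blocking de-listed search results.
# All names and data here are illustrative assumptions.

EU_COUNTRIES = {"FR", "DE", "IE", "ES"}  # abbreviated list for illustration

# De-listing register: person's name -> set of URLs removed after a
# successful right-to-be-forgotten request.
DELISTED = {
    "jane doe": {"https://example.com/old-report"},
}

def filter_results(query: str, results: list[str], user_country: str) -> list[str]:
    """Drop de-listed links when a name-based search originates in the EU."""
    blocked = DELISTED.get(query.lower(), set())
    if user_country in EU_COUNTRIES and blocked:
        return [url for url in results if url not in blocked]
    return results  # outside the EU (or for other queries) links remain visible

results = ["https://example.com/old-report", "https://example.com/profile"]
print(filter_results("Jane Doe", results, "FR"))  # EU search: link filtered out
print(filter_results("Jane Doe", results, "US"))  # non-EU search: unchanged
```

The sketch also shows why CNIL objected: the same query routed through a non-EU domain still surfaces the de-listed link.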

Google and the French Regulator

Google commenced the process of de-listing results. However, the structure and methodology of the de-listing did not meet with the full approval of the French regulator. There was a reason for this. When Google initiated the new de-listing procedure it only de-listed in relation to EU domains such as google.fr, google.de, and so on. Domains outside the EU were unaffected, so the information remained conveniently available. In 2016 Google had introduced the geo-blocking feature that prevented European users from viewing the de-listed results, but it resisted censoring results for people in other parts of the world. From the viewpoint of the French data protection regulator, the Commission Nationale de l’Informatique et des Libertés (‘CNIL’), this was unsatisfactory.


What CNIL Wanted

CNIL argued that by de-listing only the EU domains, Google was not giving data subjects’ personal data the protection the judgment had envisaged. It followed that, to ensure full protection of the personal data of data subjects, erasure should happen worldwide. Otherwise, the personal data would remain accessible via other domains, or through circumvention methods such as a Virtual Private Network (VPN).

For Google, de-listing worldwide was a disproportionate measure that placed an overly onerous burden on the operation of its search engine. (GDPRXpert recently looked at disproportionate measures in the context of the visitor books at OPW sites.) Applying the RTBF ruling in jurisdictions with strong constitutional protection for freedom of expression and free speech, such as the U.S., was judged problematic. Google appealed the decision. Principles of territorial jurisdiction and global data flows that seem incompatible with each other must now undergo more judicial scrutiny.

Article 17 GDPR

Google v CNIL was always going to be a complicated case, as the array of issues involved was open to differing interpretations. To complicate matters further, the introduction of the GDPR in May 2018 effectively repealed the old Directive. Google Spain considered Article 14 of Directive 95/46, but Article 17 GDPR now broadens the circumstances in which the right to erasure will apply. Consequently, there was an inevitable focus on interpreting its application and relevance to the facts of this particular case.

This ‘new right’ to erasure (‘right to be forgotten’) is set out under Art. 17 of the GDPR. The grounds for erasure (Art. 17(1)) are enumerated, and the controller is obliged to erase personal data without undue delay where those grounds apply. Primary grounds for erasure include (but are not limited to): the data are no longer needed for their original purpose; consent has been withdrawn and there are no other legal grounds for the processing; the processing was unlawful in the first place; and erasure is required under EU or Member State law. Grounds for refusing to erase the personal data (Art. 17(3)) are also set out, but these are very limited and will only apply ‘where the processing is necessary’ under those stated grounds.

That word ‘necessary’ crops up again and is open to interpretation. Certified GDPR and data protection advisers, GDPRXpert, have explained in previous blogs how the word ‘necessary’, in the context of the GDPR, means more than ‘useful’ or ‘convenient’.  We saw previously how much of the debate surrounding the Public Services Card shifted and began to examine specific aspects of the card. For example, when exactly was processing deemed ‘necessary’ in relation to a stated particular purpose?

The RTBF is simultaneously more ambiguous and more ambitious than other rights and is likely to be the subject of more legal challenges. Most confrontations will arise from competing rights that require balancing against one another. The most likely battleground is the intersection of the RTBF with the right to freedom of expression and information. Strategists of the opposing factions may be forced to look to the degree of erasure, or to whether any item of data can ever be truly and permanently erased. One thing is certain: nowhere in Art. 17 GDPR is de-listing information on a worldwide basis mentioned. None of us needs to be a courtroom advocate, but the foregoing should provide sharper interpretive tools to assist in our own analysis of the final decision in Google v CNIL.


Google v CNIL

At the core of the case there are two differing perspectives. Google is focused on broader economic and societal implications; CNIL is looking through the prism of individual data protection rights. Four questions were submitted to the Court for a preliminary ruling by the French Conseil d’État:

First, whether the de-referencing following a successful request for erasure must be deployed in relation to all domain names irrespective of the location from where the search based on the requester’s name is initiated, even if that occurs outside of the EU;

Second, if the first question is answered negatively, whether the RTBF must only be implemented in relation to the domain name of the Member State from which the search is deemed to have been operated or, third, whether this must be done in relation to the domain names corresponding to all Member States;

Fourth, whether the RTBF implies an obligation for search engine operators to use geo-blocking where a user is based in (i) the Member State from which the request for erasure emanated or (ii) the territory of the EU more generally, including where the user searches via non-EU domains.

Expert data protection consultants GDPRXpert have accessed some quality articles on the RTBF for this blog, such as ‘Google v CNIL: Defining the Territorial Scope of European Data Protection Law’.

The Opinion in Google v CNIL

A hint of where the case was going came with the preliminary opinion of the Advocate General of the CJEU on 10 January 2019. The opinion re-stated the order of rights, emphasising once more that the RTBF involves a balancing exercise against other rights, most especially the right to freedom of expression. The Advocate General concluded that where a claim for de-referencing has been successful, the search engine operator should only be required to effect de-referencing within the EU. The opinion was non-binding, but in most cases the full court at the Grand Chamber follows the opinion of the Advocate General.


The Grand Chamber Decision in Case-C 507/17

The Court held that “The operator of a search engine is not required to carry out a de-referencing on all versions of its search engine. It is, however, required to carry out that de-referencing on the versions corresponding to all the Member States and to put in place measures discouraging internet users from gaining access, from one of the Member States, to the links in question which appear on versions of that search engine outside the EU.”

It went on to cite Google Spain, stating that the Court had already held “that the operator of a search engine is obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, also in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful”.

Under the old Directive, and more recently under the  GDPR, Google Inc’s operations fell within the scope of EU legislation on data protection. Global de-referencing would meet the objective of protection of EU law in full, but there were other considerations. Numerous third States do not recognise the right to dereferencing or have a different approach to that right. The Court added that the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.

Any balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world. There was no evidence, in legal texts or anywhere else, that the EU legislature had struck such a balance. Neither was there any evidence that it had chosen to confer a scope on the rights of individuals going beyond the territory of the Member States. In addition, there was no evidence that it would have intended to impose on an operator such as Google a de-listing burden extending beyond the versions of its search engine corresponding to the Member States.

EU law does not provide for cooperation instruments and mechanisms as regards the scope of a de-referencing outside the EU. “Thus, the Court concludes that, currently, there is no obligation under EU law, for a search engine operator who grants a request for de-referencing made by a data subject, as the case may be, following an injunction from a supervisory or judicial authority of a Member State, to carry out such a de-referencing on all the versions of its search engine.”  Nevertheless, EU law does require a search engine operator to carry out such a de-referencing on the versions of its search engine corresponding to all the Member States.

A search engine must take sufficiently effective measures to ensure the effective protection of the data subject’s fundamental rights. What this means in practice is that any de-listing or de-referencing, “must, if necessary, be accompanied by measures which effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access, via the list of results displayed following that search, through a version of that search engine outside the EU, to the links which are the subject of the request for de-referencing”.

It will be for the national court to ascertain whether the measures put in place by Google Inc. meet those requirements. Lastly, the Court points out that, while EU law does not currently require a de-referencing to be carried out on all versions of the search engine, it also does not prohibit such a practice. Just as in Google Spain, it was acknowledged that removing irrelevant and outdated links is not tantamount to deleting content. The data will still be accessible, but no longer ubiquitous.

Patrick Rowland, GDPRXpert.

We are GDPR and Data Protection Consultants, with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit


The GDPR Gets the Blame Again.


GDPR has wrongly been blamed for many things since its introduction.  It has been scapegoated by sceptics, and some illogical interpretations of the regulation have led to disproportionate responses. Various interpretations, propounded by some, have no basis in data protection law and are just wrong. Nevertheless, the GDPR continues to get the blame.

Some Examples

Our No.1 is the hairdresser who cited GDPR as the reason she could not tell a customer what particular dye colour she was using in the customer’s hair!  At the time, the same customer was trying to get an appointment with another hairdresser, as her usual hairdresser could not fit her into her schedule. The customer wanted to be sure the correct dye would be used by the new temporary (perhaps to be the new permanent?) hairdresser. GDPR gets the blame!  Very inventive, but nonsense, of course.

‘Over the top’.

On the disproportionate scale is the guy who claimed to Joe Duffy that, at the time of the last election, voting cards should have been shredded in front of voters once they had been presented to the election officials. One could make an exaggerated technical argument to try to support this, but a commonsense approach has to be taken. A ‘verify and return’ approach is more practical and effective than a ‘verify and destroy’ (shred) approach. How many shredding machines would have been needed in each polling station? Just think of the general layout in most polling stations. Certainly, in the larger ones, there are a lot of different sections and rooms.

Here is a case of using a sledgehammer to crack a nut. Shredding the cards in front of voters is an action disproportionate to the risk to voters. The sensible thing to do, which was done by officials, was simply to hand the voting card back to the voters. This is completely in line with the storage limitation principle in Art. 5(1)(e): “…kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed…” ‘Verify and return’ was therefore the most logical and commonsense action.

Visitor Books at heritage sites

This example leads us to the story of the visitor books at certain heritage sites. Attention was first drawn to this story by an article in the Irish Times. Data protection consultants GDPRXpert are providing this link to you now. The general theme is that GDPR concerns led to the decision by the OPW to remove the visitor books from certain heritage sites. In most cases, visitors were signing their names and giving partial addresses. Some visitors included very short comments.

“The Office of Public Works observed that visitors were recording personal data, including names, addresses, etc, in visitor books at our sites which were out of view of the staff and completely unsecured,” an OPW spokesman said. A view was taken by someone at the OPW that the personal data in the books were insecure. For example, someone could photograph a page or pages of a book. We don’t know who would want to do that or why, but the possibility certainly exists. But removing the visitor books from the sites? It is best to examine some aspects of this in more detail.

Issue 1…Personal Data in the Books.

GDPR and data protection consultants GDPRXpert have set out the definition of ‘personal data’ from Art. 4(1) on their homepage. The GDPR has a wider definition of ‘personal data’ than the old data protection acts. There is no doubt that, under the newer definition, a name or an address, or both, constitutes personal data.


Issue 2…Are personal data being processed?

Art. 4(2) defines processing as “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction”. It is clear that the personal data in the visitor books are being processed under several of the categories outlined above, e.g. collection, recording, storage and use. Art. 4(2) expressly states the processing does not need to be by automated means, so the means can be manual.




Issue 3…Are the data part of a filing system?

The next question is whether the manual entries made by visitors (names, addresses, etc.), which immediately become manual records, form part of a filing system. This is a requirement under GDPR Art. 2(1), and if this criterion is not met then the GDPR does not apply. In this context, personal data must “form part of a filing system or are intended to form part of a filing system”. The Regulation defines a filing system as “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis” (Art. 4(6), GDPR). In all likelihood, the details in the visitor books would fail to meet the criteria of the ‘filing system’ definition.

One aspect that does not seem to have been considered or mentioned is whether the OPW viewed the personal data as data, “intended to form part of a filing system”. If so, that intent would bring the personal data under the filing system umbrella. At any time in the future the personal data in the books could be transferred into electronic form, and then would constitute, “part of a filing system”.

Only the OPW can say what the exact purposes of the visitor books were, and whether there ever were plans to transfer data to electronic form. Even at the busiest heritage sites, regularly transferring personal data from the books into electronic form would not be a taxing duty on staff. However, there is no diktat for entries in the visitor books, and many visitors simply put something brief like, ‘John, Idaho, U.S.’ Many visitors seem to concentrate on comments around their personal appraisal of the experience itself.


Issue 4…Lawful basis for processing personal data

We have stated time and time again that just because you can process personal data does not mean you should. You must have a lawful basis. So if we conclude from the foregoing that personal data are being processed, then we must look for a lawful basis. It is likely that every visitor is aware that the act of writing a name or an address, or leaving some comments, is entirely voluntary. In other words, they are consenting. The OPW could use the consent of the visitors as a lawful basis for the processing.

Under the GDPR it is not quite as simple. People whose personal data are being processed (data subjects) need to be aware of the context of the consent. Consent to what? Consent is defined as an ‘…unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’. As part of a normal personal data processing operation, information on the purposes of the processing, and a whole host of other information, has to be given to the data subject at the time the data are collected. Data controllers need to know whether the GDPR applies to the processing operation in the first place.


It is not possible to use ‘legitimate interests’ as a lawful basis (and, in any event, public authorities cannot rely on legitimate interests for processing carried out in the performance of their tasks). For example, it is a legitimate interest of the OPW to conduct market research to make the visitor experience more enjoyable, and comments in visitor books would be helpful in this regard. The problem is that such processing may be helpful or useful to the OPW, but not ‘necessary for the purposes of the legitimate interests pursued’, as the GDPR requires. In this instance, the OPW had stated the office had no purpose or use for the visitor books at all. This raises the question: why have them at all? The OPW surely has some use for them. What does it do with them when they are full? Bear in mind, ‘storage’ qualifies as a processing operation. If there is an intention to use the personal data as part of a filing system, then the OPW should be transparent about it. Where the policy is to wait until they are full and then put them in storage, the OPW should say so.


This relates directly to the purpose of any processing. So, if the OPW does intend to do something with the personal data at a later stage, it should let visitors know as soon as it knows itself. It is inconceivable that no one later goes through the books to see what visitors had to say. The books offer a tool for valuable market research on visitor experiences. There are many reasons to examine them carefully: to get statistical data on visitors’ countries of origin, or to find out what they did and did not like. Comments left in the books could positively influence management decisions around operational practices at the sites. Somewhat strangely, in the opinion of data protection specialists GDPRXpert, the OPW told the Irish Times that it didn’t really have a purpose for processing the personal data. Therefore, as it did not have a purpose, and a purpose is required under the GDPR, it discontinued the practice of placing visitor books at heritage sites.

On balance, it is unlikely that the visitor books would fall under the GDPR, because of the strict requirements of the ‘filing system’ definition. It is clear visitors are giving their personal data freely. Perhaps they do so unthinkingly or instinctively, but in the belief that the entries will be useful in some way. They are volunteering helpful feedback for the OPW.

At the least, even if the GDPR is not applicable, the OPW should display a short notice beside the visitor books. This should inform visitors that they may, if they wish, leave entries in the book, but advise them to keep personal details to a minimum. After all, the comments are potentially more valuable to the OPW than personal details. At this point visitors should be made aware of the uses, if any, the OPW has in mind for the data. Who is going to make any entries if the notice says ‘we destroy the books every Friday at 5’?

A recommended policy is to be transparent and say something on a notice such as, ‘we go through the comments for feedback to help improve the visitor experience’. If that is the plan, it can further state that when this is done the books are archived. If the OPW is worried about people taking photographs of entries in the books, it should place a sign beside the first notice stating ‘NO PHOTOS HERE’. Ideally, the books could be placed at an exit point where there is a normal security or staff presence.

Visitors do presume that an entry in the books will be of some value to management, and that someone will, in some way, extract that value. Removing the books over data protection concerns was a complete overreaction to any potential risks. GDPR Art. 32 itself makes clear that, in ensuring "a level of security appropriate to the risk", "the nature, scope, context and purposes of processing as well as the risk" must be taken into account. Proportionality is a central concept embedded in the GDPR. GDPRXpert, along with many other data protection consultants, agreed with the DPC view that the removal was disproportionate. The whole affair was an unnecessary storm in a teacup. Thankfully, reason prevailed and the books were later restored.

Patrick Rowland,

We are GDPR & Data Protection Consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.




PSC Investigation Findings by the DPC

Data protection consultants welcome the findings from the investigation by the DPC into the Public Services Card. In a blog post back in February, expert data protection consultants GDPRxpert rightly predicted the main conclusions of the recent investigation by the office of the DPC into the legitimacy of the Public Services Card. At the time we highlighted some of the concerns that the DPC was likely to focus on in the continuing contentious debate. The full report has not yet been made available by the Dept. of Employment Affairs and Social Protection (DEASP). However, the DPC has published some initial findings.


Some Backdrop

As we stated in the earlier blog post, “Most of you will remember some controversy about this card at the time it was introduced, and it initially focused on one theory in relation to its introduction. For many, it represented no more than the introduction of an identity card by stealth. The government vehemently denied this, and different Ministers for Social Protection (Burton, Varadkar, and Doherty) regularly appeared in the media to explain and defend the purposes behind its introduction and certify its bona fides. It was just a convenient card with no other purposes than to cut down on benefit fraud and streamline operations. Everything now should work more cost-effectively and taxpayer money would be saved.” See the GDPRxpert blog post, “Public Services Card Debate Resumes”.


Main Finding


Our first key finding was that the introduction of the card did have a solid lawful basis: it was underpinned by legislation. (We detail the relevant sections of the Social Welfare Consolidation Act 2005 in our earlier blog.) This concurs with the DPC finding. The introduction and use of the card in relation to accessing social services from the Dept. of Social Protection was legitimate. That, however, is where its lawful basis ended. What must be borne in mind is that the report was compiled in the context of events prior to the introduction of the GDPR. From a practical perspective, and because the GDPR cannot be applied retrospectively, the report was based on the data protection laws in force at the time, namely the Data Protection Acts 1988 and 2003 ('the Acts'). There is much in common between 'the Acts' and the GDPR, but the GDPR sets higher standards of transparency, accountability, and enforcement.


It was partly these lower general standards, but particularly the lower standard of transparency (compared with the GDPR), that revealed systemic illegitimacy. Retention of the supporting documentation demanded with an application was excessive. Central to this criticism was the general lack of any definitive retention period policy; instead there was a 'blanket and indefinite retention of underlying documents and information provided by persons applying for a PSC'. This contravened Section 2(1)(c)(iv) of the Data Protection Acts 1988 and 2003, because such data was being retained for periods longer than necessary for the purposes for which it was collected. The information provided by the Department to the public about the processing of their personal data in connection with the issuing of PSCs was also inadequate. One has only to look at the information now required under Arts. 12, 13 & 14 GDPR to see the depth of the lower standards under 'the Acts'.

 Other Bodies

While the Dept of Employment Affairs and Social Protection (DEASP) had at least a lawful basis for the card, other departments and public bodies did not. They just began asking for it in the normal course of business; it is more accurate to say they demanded it. They had absolutely no lawful basis for this type of demand. Both the Passport Office and the National Driving Licence Service demanded the PSC before allowing any applications through their offices. It is those other bodies and departments that lack a lawful basis entirely, and they must now cease the practice of demanding the PSC. There will be much discussion, especially in government circles, over the next few weeks regarding the future of the PSC. Many data protection professionals, ourselves included, have formed an initial consensus that the card is likely to continue in use, but only in connection with services from the DEASP.

Some Immediate Measures.


The DEASP, “will be required to complete the implementation of two specific measures within a period of 21 days:

  •  It will be required to stop all processing of personal data carried out in connection with the issuing of PSCs, where a PSC is being issued solely for the purpose of a transaction between a member of the public and a specified public body (i.e. a public body other than the Department itself). The corollary of this finding is that bodies other than DEASP cannot insist that a person who does not already hold a PSC must obtain one as a pre-condition of accessing public services provided by that body.
  • The Department will be required to contact those public bodies who require the production of a PSC as a pre-condition of entering into transactions with individual members of the public, to notify them that, going forward, the Department will not be in a position to issue PSCs to any member of the public who wishes to enter a transaction with (or obtain a public service from) any such public body”. (From DPC statement)


We will return to the topic as things develop and add to this (shorter than normal) blog post very soon.  Prompt publication of the entire report would be beneficial to all parties.

Patrick Rowland,

GDPRXpert, GDPR & data protection consultants, with bases in Carlow/Kilkenny and Mayo, offer a nationwide service.

P.S. 3 Sept. 2019. The deadline passed for the Department and no report was forthcoming. Indeed, things have altered to the extent that it is unlikely the Dept. will release the report in the foreseeable future. Most data protection consultants, such as GDPRXpert, agree with the findings of the DPC. However, it seems the Government is to challenge the findings of the DPC in court, having taken legal advice from the Attorney General and externally. See the Irish Times article on the latest. So the saga continues. As they say, 'watch this space'.

P.S. No. 2. Somewhat surprisingly, just a couple of days after this postscript the Govt. did publish the report of the DPC. See the Irish Times article, “The Irish Times view on the Government defiance of the DPC”, Sept. 19, 2019. The text that follows is from that article.

Key findings include a decision that the card cannot be required to obtain services from other departments because no lawful basis exists for such use. It cites numerous examples of the “mission creep” by which the card transformed from its original intention as a chip-and-pin verification device for social welfare services, into a required form of identity for seemingly random purposes, such as sitting a driving test, obtaining a passport, or appealing school transport decisions.

The report states that such examples illustrate “obvious and significant deficits in terms of logic and consistency” for when the card is required.

While such findings had been released earlier in summary form by the DPC, the full report adds significant heft and leaves little legal wriggle room for the Department. Yet the Government intends to defend the card, in direct defiance of a national regulator, with both the Minister and Taoiseach Leo Varadkar suggesting that the DPC should have met with the Department to “discuss” the findings.

Thoughts on GDPR a Year On

The GDPR is now over 14 months in operation. This blog post offers some thoughts on the GDPR a year on. It is still a little early for any truly substantive analysis of the effectiveness of the Regulation to date. A difficulty that immediately surfaces is how to quantify its effectiveness: what is an appropriate measure or barometer? Fines speak to enforcement, but without the specific details little can be extrapolated, even in a general context. If the level of fines is taken as a metric, does it mean that as fines rise the Regulation is simply being enforced more? Do increasing fines mean that the overall level of compliance is dropping? It could be that fines are increasing because more organisations are, and in some cases are choosing to be, non-compliant.

In any sphere of regulation there will always be a non-compliant percentage. Therefore, it does not follow that there is a direct relationship between fines and non-compliance. In other words, it does not mean that as fines go up, the level of non-compliance is also going up. Fines are always going to be imposed as a deterrent, even where compliance rates are high. Some organisations that are being fined may be repeatedly and stubbornly non-compliant. Any increase in reported data breaches that leads to the opening of an investigation may conclude with the imposition of fines.

Are regulators to go on 'fishing expeditions' to fine some high-profile organisations? There has always been talk of the DPC and other regulators planning to go for some 'low-hanging fruit' in the early days of the GDPR; most experts put financial services in this category. No evidence of this has been found so far. In the case of undertakings, fines can reach 2% or 4% of global turnover, depending on the infringement. So far, companies have been spared the harshest penalties that can be meted out. This is likely to change, according to regulators across the EU.
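The 2% and 4% figures are the percentage halves of a two-tier, "whichever is higher" cap in GDPR Art. 83. A minimal sketch of that calculation (the function name and the simple severe/non-severe split are our own shorthand for the Art. 83(4) and Art. 83(5)-(6) tiers):

```python
def max_gdpr_fine(annual_turnover_eur: float, severe: bool) -> float:
    """Ceiling of an administrative fine under GDPR Art. 83.

    Art. 83(4): up to EUR 10M or 2% of total worldwide annual turnover,
    whichever is higher. Art. 83(5)-(6): up to EUR 20M or 4% for the
    more serious infringements (e.g. breaches of the basic principles
    or of data subjects' rights).
    """
    fixed_cap, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return max(fixed_cap, pct * annual_turnover_eur)

# For a large undertaking the percentage dominates:
print(max_gdpr_fine(2_000_000_000, severe=True))   # 80000000.0 (4% of EUR 2bn)
# For a smaller one the fixed floor applies:
print(max_gdpr_fine(100_000_000, severe=True))     # 20000000 (EUR 20M floor)
```

The "whichever is higher" wording is why the fixed floors only bite for undertakings with turnover below EUR 500M (at the 4% tier).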

In the lead-up to the Regulation's introduction, the general strategy of Supervisory Authorities (SAs) was to educate the public and organisations. The consensus amongst SAs was that education was the best mechanism for regulation preparedness. Ignorance of the law is never an excuse, and so an EU-wide focus was placed on promoting education on the Regulation. 'GDPR awareness-building' was the process chosen to direct this education. The ultimate goal was to foster and nurture compliance with the new Regulation, and to develop a culture of compliance over time. The general public was to be made aware of rights, and organisations were to be made aware of obligations and responsibilities. A thorough understanding of the core principles of transparency and accountability was highlighted as a mandatory requirement for competent data controllers and processors.

What has been happening since the introduction of GDPR?

In a previous blog, we examined some stats from the DPC's first annual report post-GDPR. Most noteworthy was the number of data breaches, totalling 3,452. Perhaps this should not be surprising. What was surprising was that out of this total, the largest category was 'Unauthorised Disclosures', yet the fines did not seem to follow.

Mathias Moulin, of the French regulator CNIL, emphasised that the first year of the GDPR should be considered a 'transition year'. Transition year or not, early numbers for the GDPR make clear that the policy has been a success as a breach notification law, but largely a failure when it comes to imposing fines on companies that fail to adequately protect their customers' data. Stephen Eckersley, the head of enforcement at the U.K. Information Commissioner's Office, said the U.K. had seen a “massive increase” in reports of data breaches since the GDPR's implementation. In June 2018, companies self-reported 1,700 data breaches, and Eckersley estimated that around 36,000 breaches would be reported in 2019, a significant increase from the previous annual reporting rate of between 18,000 and 20,000 breaches.

Some Stats

Other reports give more information on the GDPR to date. There were 89,000 data breaches recorded, of which 37% are still pending investigation or penalties. 65,000 data breaches were reported to the European Data Protection Board, and in the first 8 months of the GDPR nearly 60,000 breaches were reported across Europe (law firm DLA Piper). Google was hit with a €50 million fine for not making it clear to users how it was harvesting data from its own search engine, YouTube and Google Maps. This penalty, the largest to date, was issued by the French Data Protection Authority (CNIL) in January. It related to a lack of transparency, inadequate information and a lack of valid consent regarding the personalisation of ads.

The French authorities had received complaints about Google's handling of personal data. CNIL, the relevant authority, found that the structure of Google's privacy policy and terms and conditions was too complicated for users, and that the use of pre-ticked boxes as a consent mechanism did not establish a legal basis for data processing to deliver targeted advertising. For a better understanding of the fines regime, it is helpful to look at the broader context of the Google fine more closely.


Some Fines

The French regulator cited Google's failure to centralise essential information on one page, and its process requiring users to go through “up to five or six actions.” Google's penalty accounts for nearly 90% of the total value of fines levied to date, but it had the potential to be much larger. In 2018 Google reported nearly $136.2 billion in revenues, so the 50 million euro fine represented approximately 0.04% of revenue, far from the potential 4% penalty. Compared to the Google fine, other fines levied by European national data protection authorities (DPAs) have been considerably smaller. For example, in March 2019 the Polish DPA announced that it had fined a company approximately 219,000 euros for failure to inform six million individuals that their personal data were being processed. Also in March 2019, the Danish DPA fined a company approximately 161,000 euros for holding on to personal data longer than allowed under the GDPR.
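The roughly 0.04% figure is easy to verify; the quick check below treats the euro fine and dollar revenue as directly comparable (i.e. it ignores the exchange rate, which does not change the order of magnitude):

```python
fine = 50_000_000      # CNIL fine against Google, January 2019 (EUR)
revenue = 136.2e9      # Google's reported 2018 revenues (USD)

# Rough share of annual revenue, ignoring the EUR/USD exchange rate.
share = fine / revenue
print(f"{share:.2%}")  # 0.04% -- two orders of magnitude below the 4% cap
```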

Outside of the Google fine, the penalties thus far have been so small that many are anxiously awaiting the next whopper of a fine. Irish and UK authorities have hinted that a large fine is coming (Todd Ehret, Thomson Reuters, May 22, 2019). GDPRXpert has previously reported on Facebook's difficulties with the office of the DPC here in Ireland, and ongoing investigations seem likely to conclude with large fines being meted out.

European Data Protection Board Survey

It is somewhat surprising that, despite the awareness-building campaign by all SAs, a European Data Protection Board survey found in May 2019 that:

  • only 67% of people across Europe had heard of GDPR;
  • 36% claimed to be ‘well aware’ of what GDPR entails;
  • 57% of EU citizens polled indicated they are aware of the existence of a public authority in their country responsible for their data protection rights.


This result represents an increase of about 20% on a 2015 Eurobarometer survey. It is a disappointment that it is not higher, considering the cost, scale and scope of the campaigns in all Member States to educate citizens prior to the GDPR. Many people question whether the message of the GDPR was communicated in a properly measured manner before the introduction of the legislation.

 ‘Privacy Sweep’.

Some stats from the 6th annual Privacy Sweep conducted by the Global Privacy Enforcement Network (GPEN) are consistent with the results above. Data protection consultants GDPRXpert have included elements of this in a published blog post. Data protection authorities from around the world participated, and in the last sweep GPEN aimed to assess how well organisations have implemented accountability in their own internal privacy programmes and policies. One goal was to establish a sort of baseline of an organisation's compliance with data protection. This was the brief for the DPC, whose input was targeted at randomly selected organisations in Ireland. Thirty organisations across a range of sectors completed a suite of pre-set questions relating to privacy accountability. Because the sweep was done in the last quarter of 2018, only preliminary or provisional results were available at the date of the report (DPC Report, 2018).



Some Stats from ‘Privacy Sweep

Preliminary results include the following:

  • 75% appear to have adequate data breach policies in place;
  • All organisations seem to have some kind of data protection training for staff;
  • However, only 38% could provide evidence of training for all staff including new entrants and refresher training;
  • In most cases, organisations appear to undertake data protection monitoring/self-assessment but not to a sufficiently high level. In this category, 3 out of 29 scored ‘poor’, while 13 could only reach a ‘satisfactory’ level;
  • 1/3 of organisations were unable to show any documented processes in place to assess risks associated with new technology and products;
  • 30% of organisations failed to show they had an adequate inventory of personal data, while close to 50% failed to keep a record of data flows.


Is There Still a Wait and See Approach?

Businesses and organisations have reacted to the GDPR in their own way, depending on what they view as their individual exposure. There is no doubting the cost-benefit analyses done by some companies to weigh the potential fines against the cost of compliance measures. This is especially so for companies that have no presence in the EU but fall under the GDPR by virtue of Art. 3. Most companies are taking a proactive approach to dealing with the new realities of personal data protection. Possibly because many of the fines to date have been nominal compared to what they could have been, some companies are waiting to see what the supervisory authorities in each EU member country are going to do. The prevailing wisdom is that fines will go up as regulatory actions play out.

At the moment, businesses and organisations are looking at the possibility of larger fines being imposed. Many are waiting to see whether any trends emerge, or whether particular types of businesses are being targeted. So far, it seems only the very big companies have been targeted by the office of the DPC here in Ireland. Anecdotal evidence suggests smaller businesses do not see themselves as being on the DPC radar; the perception is that the DPC has too much on its plate and 'bigger fish to fry'. Smaller businesses may very well fly under the radar for a short while longer. However, one can assume, partly because of the prolonged bedding-in period for the Regulation, that by the time the DPC gets around to some smaller fish, a high level of compliance will be expected. At that stage, no excuses are going to be accepted in any defence of non-compliance.

What is Likely to Happen in the Next Phase?

The DPC Annual Report 2018 recorded 2,864 complaints. Of these, the largest single category was 'Access Rights', with 977 complaints, or a little over 34% of the total. Here is a warning flag for businesses: every one of these complaints has the potential to trigger an associated investigation by the DPC into an organisation's compliance with the GDPR. At least initially, it is believed supervisory authorities will take a more cautious approach to levying the harshest penalties, says Peter Milla, the data protection officer at Cint, a provider of consumer data sets to market researchers around the world, with global corporate headquarters in London.

“What’s going to happen is the regulators are going to come in to see if you have a compliance program but they’re going to be very lenient,” he says. “They’re obviously not going to put small companies out of business because there’s a political component here, but they will fine. They’re going to be commercially reasonable.” The Germans are probably going to be the harshest, Milla says.

One forecast is that there will be greater enforcement. Regulators across the EU have significantly increased staffing levels, and it is logical to expect greater enforcement as a result. Another expected development is that increasingly educated and 'GDPR-conscious' consumers will drive data protection and privacy by design. The same consumers will be attracted to businesses and organisations that are seen to respect their data protection and privacy rights. A lack of either a Privacy Notice or a Privacy Statement is a clear indicator of an organisation's disregard for the core principle of transparency. What is such an organisation doing with your personal data? That is anyone's guess.

Patrick Rowland, GDPRXpert, with bases in Carlow/Kilkenny and Mayo, offering a nationwide GDPR and data protection consultancy service.

Visit our website to learn more.


Transfers of Personal Data outside the EU/EEA

In the most recent blog post we attempted to capture the context of some of the channels for transfers of data outside the EU/EEA. The Schrems case provided some of this context through its scrutiny of the Standard Contractual Clauses mechanism. Since the introduction of the GDPR, the channels for transfer of personal data to a third country or international organisation have undergone changes.

Transfers of personal data to third countries or international organisations.   

Following the inception of the GDPR, the law on transfers of personal data to third countries or international organisations ('transfers') is more settled. A caveat is that the exact interpretation of express terms in the GDPR relating to transfers may yet come before the Court of Justice for ultimate clarification.

Art. 44 GDPR provides that transfers may only take place subject to the other provisions of the Regulation, and where the conditions laid down in Chapter V are complied with by the controller and processor. A plethora of conditions is laid out in Chapter V. These conditions can be grouped as transfers subject to:

Adequacy Decisions;

Appropriate safeguards; or

Specific derogations.
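The order of these groupings matters: an adequacy decision is checked first, appropriate safeguards second, and the derogations only as a fallback. A minimal sketch of that ordering (the function and parameter names are our own illustrative shorthand, not terms from the Regulation):

```python
def transfer_ground(adequacy_decision: bool,
                    appropriate_safeguards: bool,
                    derogation_applies: bool) -> str:
    """Return the first Chapter V GDPR ground that permits a transfer."""
    if adequacy_decision:          # Art. 45 -- no specific authorisation needed
        return "adequacy decision (Art. 45)"
    if appropriate_safeguards:     # Art. 46 -- e.g. standard clauses, BCRs
        return "appropriate safeguards (Art. 46)"
    if derogation_applies:         # Art. 49 -- strictly interpreted exceptions
        return "specific derogation (Art. 49)"
    return "no ground: transfer may not take place (save the last-resort case)"

print(transfer_ground(False, True, False))   # appropriate safeguards (Art. 46)
```

Each of these grounds is examined in turn below.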


Adequacy Decisions

Art. 45 allows transfers where the European Commission has decided that the third country or international organisation ensures an adequate level of protection. Under this scenario, no specific authorisation is required. In practice, this confers a broad discretion on the European Commission in assessing adequacy, which has the potential to be viewed subjectively and to be politically influenced. It was this discretion in declaring that an adequate level of protection existed that led to Schrems (Case C-362/14, 6 Oct. 2015) ending up before the CJEU. As a means to counterbalance the discretion of the Commission, Art. 45(2) sets out three elements that the Commission must 'in particular' take into account when assessing the adequacy of data protection in the third country. A list of countries with an adequacy decision is found here.


Elements to be taken into account to assess Adequacy

  • ‘the rule of law, respect for human rights and fundamental freedoms…’ Legislation, both general and sectoral, is examined. Are adequate protections available when assessed in the light of legislation concerning public security, defence, national security and criminal law? What about access of public authorities to personal data, and the implementation of that legislation? What about data protection rules, professional rules, security measures and rules for the onward transfer of personal data to another third country? Can data subjects obtain effective administrative and judicial redress where they have complaints about how their data are being transferred?
  • ‘the existence and effective functioning of one or more independent supervisory authorities in the third country…’ The Commission should expect to see a supervisory authority with responsibility for ensuring and enforcing compliance with data protection rules, including adequate enforcement powers. It is not enough to have responsibility for enforcement, but it must have the powers to deliver on enforcement. Toothless tigers are not wanted.
  • ‘the international commitments the third country or international organisation has entered into…’ Something like this can act as an accurate gauge of the value placed on international norms and rules. Part of this element of assessment can include scrutiny of international obligations the third country may have as a result of some legally binding convention or instrument. Does the third country or international organisation participate in multilateral or regional systems, especially in the data protection sphere?

 The Goal

In essence, the goal is to have similar, if not identical, means of protection of personal data operating in the third country as are available to data subjects in the EU/EEA. As noted in the Schrems case, an appropriate balance must be struck between the powers assigned to authorities in a third country and the protections provided for the persons whose personal data are being transferred. If the Commission is satisfied with the integrity and substance of data protection in the third country, it may then issue an Adequacy Decision. Any Adequacy Decision must be monitored and reviewed over time (Art. 45(4)) and can also be repealed, amended or suspended (Art. 45(5)).


Transfers Subject to Appropriate Safeguards

In the absence of an Adequacy Decision, a controller or processor may transfer personal data to a third country or international organisation only where:

  • the controller or processor has provided appropriate safeguards; and
  • on condition that enforceable data subject rights and effective legal remedies for data subjects are available.

These appropriate safeguards can be provided in a number of ways and some need no specific authorisation from the Supervisory Authority (SA). Art. 46 (2) sets down the list of those not needing SA authorisation:

  • a legally binding and enforceable agreement between public authorities or bodies;
  • binding corporate rules in accordance with Art. 47 (more below);
  • standard data protection clauses adopted by the Commission (in accordance with the examination procedure laid out in Art. 93(2));
  • standard data protection clauses adopted by the SA and approved by the Commission;
  • an approved code of conduct pursuant to Art. 40, together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards;
  • an approved certification mechanism pursuant to Art. 42, together with the same binding and enforceable commitments as above.

Of those listed above, the most common mechanisms are Binding Corporate Rules (BCRs) and Standard Data Protection Clauses. BCRs are “personal data protection policies which are adhered to by a controller or processor established on the territory of a Member State for transfers or a set of transfers of personal data to a controller or processor in one or more third countries within a group of undertakings, or group of enterprises engaged in a joint economic activity” (Art. 4(20)).

Recital 110 advises that a group of undertakings engaged in a joint economic activity should be able to make use of approved BCRs for its international transfers from the Union to organisations within the same group, provided the BCRs include all essential principles and enforceable rights to ensure appropriate safeguards for the transfers of personal data. Competent SAs approve BCRs, but the Commission may specify the exact format and procedures for the exchange of information between controllers, processors and SAs for those BCRs.

Art. 47(1) sets three pre-conditions for any approval of BCRs. First, they must be legally binding, and apply to and be enforced by every member of the group of undertakings engaged in the joint economic activity. Second, they must expressly confer enforceable rights on data subjects with regard to the processing of their personal data. Third, they must fulfil the requirements set out in Art. 47(2).


The Content of the Binding Corporate Rules

Art. 47(2) lays down a comprehensive list of specific requirements for the content of the BCRs. It is not within the scope of this blog to enumerate all of them, but they should be examined carefully in the text of Art. 47(2). There is no hierarchy of requirements, though some on their face seem more important than others. A detailed analysis of the requirements for Binding Corporate Rules is laid out in the Art. 29 Working Party documentation on the subject. It is a very comprehensive examination of the requirements and an excellent reference for any query.

Some of the Requirements

To be valid and acceptable, the BCRs must contain the structure and contact details of the group of undertakings/enterprises engaged in the joint economic activity and of its members. All data transfers or sets of transfers, including the categories of personal data, the type of processing and its purposes, the types of data subjects and the identification of any third countries, must be clearly enumerated in the BCRs. The data protection principles apply, and the rights of data subjects must be expressly recognised, including a right to obtain redress and, where appropriate, compensation for a breach of the BCRs. Controllers or processors must accept liability for any breaches of the BCRs by any member not established in the Union. Other requirements are laid out in Art. 47(2).

Standard Data Protection Clauses

For many organisations these clauses are the most usual mechanism to transfer personal data to a third country or international organisation. These are more common than adequacy decisions but they represent a minimum standard for data protection and for this reason it is envisaged (See Recital 109) that controllers and processors will add additional safeguards. The clauses must contain the contractual obligations of the ‘Data Importer’ and the ‘Data Exporter’ and confirm the data protection rights of the individuals whose data are being transferred.  Individuals can directly enforce those rights against either of the parties.

Standard clauses have been issued under the old Directive and these remain valid. However, the European Commission has advised the European Data Protection Board (EDPB) that it is planning to update the Clauses for the new GDPR. The Commission has made available the sets of Standard Contractual Clauses issued up to now.

Safeguard Mechanisms Requiring Specific Authorisation.

 Other mechanisms allow for the transfer of personal data to a third country or international organisation but these need prior specific authorisation from the SA. Safeguards in these cases may be provided by a) contractual clauses between the controller or processor and the controller, processor or recipient of the personal data in the third country and b) provisions to be inserted into administrative arrangements between public authorities or bodies which include enforceable and effective data subject rights. The consistency mechanism referred to in Art.63 is to apply to such authorisations. For example, where the SA aims to authorise contractual clauses it shall communicate that draft decision to the Board (i.e., the EDPB).

Specific Derogations

Where there is neither an adequacy decision available under Art. 45 nor appropriate safeguards pursuant to Art. 46, a transfer of personal data can still take place if one of the conditions set out in Art. 49 is fulfilled. These conditions include: the data subject has explicitly consented to the transfer, having been informed of the possible risks; the transfer is necessary for the performance of a contract between the data subject and the controller, or for the implementation of pre-contractual measures taken at the data subject’s request; the transfer is necessary for the performance of a contract concluded in the interest of the data subject between the controller and another natural or legal person (the foregoing do not apply to activities carried out by public authorities in the exercise of their public powers);

Specific Derogations contd.

… the transfer is necessary for important reasons of public interest; the transfer is necessary in order to protect the vital interests of the data subject or of other persons, where the data subject is physically or legally incapable of giving consent; the transfer is necessary for the establishment, exercise or defence of legal claims; or the transfer is made from a register which, in accordance with Union or Member State law, is intended to provide information to the public and which is open to consultation either by the public in general or by any person who can demonstrate a legitimate interest. Apart from explicit consent, every one of these conditions depends on the transfer being deemed ‘necessary’. In practice the conditions are strictly applied and strictly interpreted, with the result that it is usually preferable to use some other mechanism to transfer data to third countries or international organisations. There is one final option if all other mechanisms or conditions are unavailable.

‘Last Resort’ Transfers of Personal Data

Where a transfer cannot be based on an adequacy decision or appropriate safeguards, including binding corporate rules, and none of the derogations for specific situations apply, a transfer of personal data may still take place only if:

  • the transfer is not repetitive;
  • the transfer concerns only a limited number of data subjects;
  • the transfer is necessary for the purposes of compelling legitimate interests pursued by the controller, provided they are not overridden by the interests or rights and freedoms of the data subject; and
  • the controller has assessed all the circumstances surrounding the data transfer and, on the basis of that assessment, has put in place suitable safeguards for the protection of personal data. In addition, the controller must inform the SA and the data subject of the transfer. Any compelling legitimate interest pursued must be communicated to the data subject, together with all the information required by Arts. 13 and 14.

The Recitals regard this last basis as one to be relied on ‘in residual cases where none of the other grounds for transfer are applicable…’ (Recital 113).
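The order in which these mechanisms are considered — an adequacy decision first, then appropriate safeguards, then the Art. 49 derogations, with the ‘last resort’ basis only in residual cases — can be sketched as a simple decision routine. This is an illustrative sketch only, not legal advice: the function name and flags are invented for the example, and each flag stands in for a legal assessment that would need to be made properly in practice.

```python
# Illustrative sketch of the hierarchy of GDPR transfer mechanisms
# (Arts. 45-49). All names and flags are invented for this example;
# each boolean stands in for a full legal assessment.

def lawful_transfer_basis(adequacy_decision: bool,
                          appropriate_safeguards: bool,
                          art49_derogation: bool,
                          last_resort_conditions_met: bool) -> str:
    """Return the first available legal basis for a third-country transfer."""
    if adequacy_decision:              # Art. 45: Commission adequacy decision
        return "adequacy decision (Art. 45)"
    if appropriate_safeguards:         # Art. 46: SCCs, BCRs, approved codes, etc.
        return "appropriate safeguards (Art. 46)"
    if art49_derogation:               # Art. 49(1): consent, contract, etc.
        return "specific derogation (Art. 49)"
    if last_resort_conditions_met:     # Residual cases only (Recital 113)
        return "compelling legitimate interests (last resort)"
    return "no lawful basis - transfer must not take place"

# Example: no adequacy decision, but Standard Contractual Clauses in place.
print(lawful_transfer_basis(False, True, False, False))
```

The point of the ordering is that the derogations and the last-resort basis are never reached while a more robust mechanism is available.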

In most cases personal data transfers to third countries or international organisations are routine and uncomplicated. The complicated part is knowing whether those transfers are legally sound. The prudent route is to follow the text of Arts. 45-49 and to stay aware of changes, such as CJEU decisions. Should the UK leave the EU without an agreed deal, the UK will become a ‘third country’ for the purposes of the GDPR and data transfers. In the event of a ‘no deal Brexit’, data transfers to the UK will have to follow one of the routes described in this blog.

Patrick Rowland,
