DPC Decision in the TikTok Case

The DPC decision in the TikTok case has been welcomed by data protection practitioners. It is one we have been meaning to discuss but have not got around to until now. The DPC Annual Report for 2023 has also just been released, and we will examine it in detail in our upcoming July/August 2024 blog.
As the report states, “2023 was a busy year in personal data rights protection. The year saw a significant increase in complaints dealt with by the Data Protection Commission (“DPC”) with record fines issued and corrective orders imposed following cross-border and national inquiries. More generally, there were a large number of data protection-related judgments from the Court of Justice of the European Union and continued domestic focus before the Irish courts”. Perhaps not coincidentally, 2023 was also a busy year for GDPRXpert.ie and others operating a data protection advisory service. In particular, there was high demand for the outsourced data protection officer service provided by GDPRXpert.ie.

Naturally, many of the high-profile cases taken and concluded by the DPC in 2023 feature prominently in the report. One of these is the TikTok case, concluded in September 2023 with a fine of €345 million for TikTok. TikTok is a video-focused social media platform that allows registered users to create and share videos of varying duration and to communicate with others through messages. TikTok Technology Limited (“TTL”) states that TikTok is not a social network but rather a global entertainment platform that, at its core, was designed to enable users to create and share video content, enjoy videos from a variety of creators and otherwise express their creativity, such as by interacting with videos to express new perspectives and ideas.

The TikTok platform is accessible via a standalone mobile phone application and can also be viewed as a web page from a web browser.
This case is worthy of some further analysis and comment because it is a transparent example of the whole process involved, and that process is long and resource-intensive. The decision has been welcomed by GDPRXpert.ie and fellow GDPR and data protection law experts across the EU. In previous blogs, GDPRXpert.ie has emphasised the length of time it can take to bring a case to a conclusion. That length of time has been the subject of much criticism from some groups who are themselves familiar with the process and its complexities. They should know better, and they most likely do, but they persist in rushing to premature judgment and have done so very notably in one specific case.


These same groups have their own underlying agenda, and their criticism of the DPC may not diminish despite clear, robust decisions such as the TikTok case. One particular group we have criticised before seems to have happily deluded itself into believing it is best placed to monitor and defend the data protection rights of all data subjects. It does this at the cost of neglecting areas within its own more direct remit. Its misguided forays into areas already well served by the DPC have seen it succumb to the desire to take frivolous, and arguably vexatious, actions paid for by donors and the Irish taxpayer. In its most recent opportunistic legal venture challenging the DPC in the context of Google’s online bidding, the judge sent a clear message by ordering the group to pay the DPC’s costs. In that case it was seen to be totally outside its intellectual comfort zone, showing signs of a dearth of understanding of data protection law.

The TikTok case stands as a testament to the quality of the investigations carried out by the DPC. One must emphasise that the TikTok investigation was an own-volition inquiry, but it nevertheless involved the coordinated deployment of huge assets and resources. It may be that the nature of an own-volition inquiry makes additional resources necessary because of the extra scrutiny such inquiries attract by virtue of the GDPR. These types of cases take time; simple as that. This case demonstrates the multitude of resources that must be strategically expended to bring such complex cases to the desired legal conclusion (the case was the subject of Binding Decision 2/2023 on the dispute submitted by the Irish SA regarding TikTok Technology Limited (Art. 65 GDPR), adopted on 2 August 2023). These cases are complicated even for data protection experts such as GDPRXpert.ie.

Preliminary Issues
Before the case proper, a few mainly procedural issues had to be clarified. The first was whether the DPC was competent to act as the Lead Supervisory Authority at all. Under Article 4(23) GDPR, ‘cross-border processing’ is defined as meaning either:
(a) processing of personal data which takes place in the context of the activities of establishments in more than one Member State of a controller or processor in the Union where the controller or processor is established in more than one Member State;
or
(b) processing of personal data which takes place in the context of the activities of a single establishment of a controller or processor in the Union but which substantially affects or is likely to affect data subjects in more than one Member State.
During the period 29 July 2020 to 31 December 2020 (the ‘Relevant Period’), TTL processed personal data in the context of the activities of a single establishment of a controller or processor in the Union, processing which substantially affected or was likely to substantially affect data subjects in more than one Member State. TTL’s single establishment in Ireland is supported by affiliated entities in Germany, France, Poland, Italy, Spain and Sweden.
There was no doubt that the processing was cross-border.


Turning to the question of whether the DPC was competent to act as lead supervisory authority in respect of the processing under examination, the DPC noted that Article 56(1) GDPR provides that a supervisory authority of the main establishment of a controller or processor shall be competent to act as lead supervisory authority pursuant to Article 60 GDPR.
Having considered all of the above and the nature of the processing at issue, the DPC was satisfied that TTL is a data controller within the meaning of Article 4(7) GDPR regarding the processing which is the subject of the inquiry. The DPC was further satisfied that TTL has its main establishment in Ireland for the purposes of the GDPR. As such, the DPC was satisfied that the requirements of Article 56 GDPR had been met in relation to the processing at issue, such that the DPC is competent to act as the lead supervisory authority in respect of the cross-border processing under examination.
So the DPC was competent to act as Lead Supervisory Authority (LSA).


The next hurdle to be crossed concerned TTL’s argument that the standards of compliance to which it was being held post-dated the relevant period of the inquiry. The argument was that the “Fundamentals for a Child-Oriented Approach to Data Processing” (published in December 2021) were not in effect at the time of the processing giving rise to the inquiry and therefore constituted “an impermissible retrospective application of regulatory standards and a clear breach of fair procedures.”
The DPC dismissed this argument by relying on the plain fact that the GDPR was in force at the relevant time: the Fundamentals represented ancillary guidance to the GDPR, with which TTL had been obliged to comply since May 2018. The Fundamentals referenced GDPR principles that had been in effect since 2018 and, although the Fundamentals were not in effect contemporaneously, they did not constitute any form of retrospective law-making. They are post-GDPR guidance principles only, so the date of their release was immaterial. TTL’s compliance fell to be assessed in the light of the GDPR and any guidance notes and material available during the relevant period.
Time for Substantive Issues.
So then, what was the DPC actually investigating in the case?

Material Scope.


This inquiry concerned the processing by TTL of personal data of registered child users of the TikTok platform and whether or not TTL had complied with its obligations under the GDPR as data controller. The 2018 Act provides that the term ‘child’ in the GDPR is to be taken as a reference to a person under the age of 18 years. TTL provides the TikTok platform to persons over the age of 13. As a result, the term ‘Child Users’ in this decision should be taken as a reference to registered TikTok users aged between 13 and 17. As set out below, the inquiry also examined certain issues regarding TTL’s processing of personal data relating to children under the age of 13.
In particular, the inquiry concerned two distinct sets of processing operations by TTL in the context of the TikTok platform, both of which constituted the processing of personal data as defined by Article 4(2) GDPR. The inquiry also examined the extent to which TTL complied with its transparency obligations under the GDPR. As highly experienced data protection law consultants, GDPRXpert.ie can testify to the challenges organisations face in meeting the transparency standards set by the GDPR. The standards are high and will not be met without a strategy being devised and followed. Our GDPR audits often discover that organisations have no strategy at all.
Broadly, the first type of processing examined related to the processing of Child Users’ personal data in the context of the platform settings of the TikTok platform, both mobile application and website based, in particular the public-by-default processing of such platform settings in relation to Child Users’ accounts, videos, comments, ‘Duet’ and ‘Stitch’, downloading and ‘Family Pairing’.
The second type of processing to be examined related to the processing by TTL of the personal data of children under the age of 13 in the context of the TikTok platform, both mobile application and website based, in particular for the purposes of age verification.
Finally, with regard to the processing of personal data of persons under the age of 18 in the context of the TikTok platform (including any such processing in connection with websites or applications which provide access to the TikTok platform), the inquiry also examined whether TTL had complied with its obligations to provide information to data subjects in the form and manner required by Articles 12(1), 13(1)(e), 13(2)(a), 13(2)(b) and 13(2)(f) GDPR.

Assessment of TTL’s Compliance with the GDPR and Corrective Powers.

The Statement of Issues identified the matters for determination as part of the inquiry. Those issues concerned TTL’s compliance with the GDPR (and consideration of corrective powers), as follows:
Firstly, in relation to platform settings:
• A. Whether, having regard to the default public settings applied to Child Users’ accounts, TTL implemented appropriate technical and organisational measures pursuant to Article 24 GDPR to ensure, and to be able to demonstrate, that its processing of Child Users’ personal data was performed in accordance with the GDPR;
• B. Whether, having regard to the default public settings applied to Child Users’ accounts, TTL complied with its obligations under Articles 5(1)(c) and 25(1) GDPR to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what was necessary in relation to the purposes for which the data were processed; and to implement appropriate technical and organisational measures designed to implement the data minimisation principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the Regulation and protect the rights of data subjects;
• C. Whether, having regard to the public default settings applied to Child Users’ accounts, TTL complied with its obligations under Article 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing were processed; and
• D. Whether, in circumstances where TTL’s platform settings allowed an unverified non-Child User to access and control a Child User’s platform settings, TTL complied with its obligations under Articles 5(1)(f) and 25(1) GDPR to ensure that Child Users’ personal data were processed in a manner that ensured appropriate security of the data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical and organisational measures; and to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the Regulation and protect the rights of data subjects.

Secondly, in relation to age verification:
• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Article 24 GDPR to implement appropriate technical and organisational measures to ensure, and be able to demonstrate, that its processing of Child Users’ data was performed in accordance with the GDPR, including by implementing measures to guard against children under 13 years of age accessing the platform;

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Articles 5(1)(b), 5(1)(c) and 25(1) GDPR to ensure that it collected Child Users’ personal data for specified, explicit and legitimate purposes and did not further process that data in a manner incompatible with those purposes; to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what was necessary in relation to the purposes for which the data were processed; and to implement appropriate technical and organisational measures designed to implement the purpose limitation and data minimisation principles in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, including by implementing measures to guard against children aged under 13 having access to the platform;

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and over, TTL complied with its obligations under Article 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which were necessary for each specific purpose of the processing were processed, including by implementing measures to guard against children aged under 13 accessing the platform.
Thirdly, in relation to transparency:
• Whether Child Users were appropriately made aware, as users of TikTok, of the various public and private account settings in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61, and whether Child Users were able to determine the scope and consequences of registering as a user, and specifically that their profile would be defaulted to public.
These were the complicated issues that the DPC had to assess in light of the legal regime. The first substantive issue arose in regard to platform default settings, in the specific context of child users.
Issue 1: Assessment and consideration of matters concerning TTL’s compliance with Articles 5, 24 and 25 GDPR concerning its platform settings for users under the age of 18.
The root question was whether TTL had complied with its obligations under Articles 5(1)(c), 5(1)(f), 24 and 25 GDPR. Article 5(1)(c) GDPR provides that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Per Recital 39, this requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum. Personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means. In order to ensure that the personal data are not kept longer than necessary, time limits should be established by the controller for erasure or for a periodic review.
Article 5(1)(f) provides that personal data shall be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures.” Per Recital 39, personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to, or use of, personal data and the equipment used for the processing.
Further, Article 24(1) provides: Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.
The GDPR and data protection advisory service provided by GDPRXpert.ie always emphasises the importance attached to the technical and organisational measures that are implemented. It is not enough to implement measures; they must be demonstrably ‘effective’. To quote Recital 74 GDPR: “The responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established. In particular, the controller should be obliged to implement appropriate and effective measures and be able to demonstrate the compliance of processing activities with this Regulation, including the effectiveness of the measures. Those measures should take into account the nature, scope, context and purposes of the processing and the risk to the rights and freedoms of natural persons.” GDPR audits conducted by GDPRXpert.ie will never overlook this potential pitfall. Unfortunately, it is missed by many.

In regard to Article 25, the European Data Protection Board (EDPB) has published Guidelines on Data Protection by Design and Default, which summarise Article 25 GDPR as follows: “The core of the provision is to ensure appropriate and effective data protection both by design and by default, which means that controllers should be able to demonstrate that they have the appropriate measures and safeguards in the processing to ensure that the data protection principles and the rights and freedoms of data subjects are effective.” (European Data Protection Board, ‘Guidelines 4/2019 on Article 25 Data Protection by Design and by Default’ (20 October 2020) at [2].)
As with the technical and organisational measures referred to above, it is the effectiveness of the implemented measures that is crucial. Each implemented measure should produce the intended results for the processing foreseen by the controller and this has two consequences as laid out in the EDPB Guidelines:
“First, it means that Article 25 does not require the implementation of any specific technical and organisational measures, rather that the chosen measures and safeguards should be specific to the implementation of data protection principles into the particular processing in question. In doing so, the measures and safeguards should be designed to be robust and the controller should be able to implement further measures in order to scale to any increase in risk. Whether or not measures are effective will therefore depend on the context of the processing in question and an assessment of certain elements that should be taken into account when determining the means of processing. …Second, controllers should be able to demonstrate that the principles have been maintained.”

On a daily basis, our data protection law consultants at GDPRXpert.ie are reminded of the necessary congruence between the technical and organisational measures incorporated into processing operations and the data protection principles, especially data minimisation.
Article 25(2) GDPR requires data controllers to implement measures to ensure that, by default, the principle of data minimisation is respected: “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.” The obligation to implement the measures described in Article 25(2) GDPR is referred to as data protection by default.
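To make the idea concrete, here is a minimal sketch in Python of what privacy-by-default account settings might look like. The setting names are our own illustration, not TTL’s actual configuration:

```python
from dataclasses import dataclass


@dataclass
class AccountSettings:
    """Illustrative account settings honouring data protection by default
    (Article 25(2) GDPR): the most privacy-protective values are the
    defaults, and wider sharing requires the individual's own intervention."""
    profile_public: bool = False        # posts not visible to an indefinite audience
    comments_enabled: bool = False      # no comments from strangers by default
    direct_messages_enabled: bool = False
    video_downloads_enabled: bool = False


def widen_visibility(settings: AccountSettings) -> AccountSettings:
    # Any move away from the default must be the user's deliberate choice,
    # recorded so the controller can demonstrate it (Article 24(1) GDPR).
    settings.profile_public = True
    return settings


settings = AccountSettings()            # defaults are private
assert settings.profile_public is False
```

The design point is simply that the protective value is what the user gets without doing anything; anything broader must be an opt-in.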
As we often remind clients, no specific measures are prescribed by the GDPR. If you have a toothache you want to take something to kill the pain, but experience will tell you the same medication may not be effective against pain in another location. It is, therefore, controllers who best know the measures that have historically worked; they should assess the risk and implement what will work against any increased risk in the future.

TTL responded on Article 5(1)(c) and submitted that the data minimisation principle did not mandate processing the absolute minimum of data but rather what was adequate, relevant and limited to what was necessary in relation to the purposes of the processing, and which respected the principle of proportionality.
TTL’s submission on Article 24 was detailed and focused mainly on a few core points. The first was that Articles 24(1) and 24(2) did not impose prescriptive obligations, as that would be inconsistent with the objective of the provisions themselves, which is to embed privacy compliance practices into the internal practices of an organisation in a manner appropriate to the processing activities undertaken by that particular organisation. TTL cited the Article 29 Working Party as support for the argument that a one-size-fits-all approach would force controllers into structures that are ‘unfitting and ultimately fail’. TikTok took the view that the only option was ‘custom-built solutions’. On a reasonable interpretation, the accountability obligations under Article 24(1) could only be open-ended and non-prescriptive.


TTL also argued that the GDPR does not mandate the exact means of achieving or demonstrating compliance with its requirements. Taking such a prescriptive approach would be inconsistent with the core objective of Article 24 which, as we have just stated, is to embed privacy compliance into the internal practice of organisations in a manner that works for each organisation while remaining aligned with GDPR principles. Data protection consultants GDPRXpert.ie would agree with TTL that Article 24 takes a much more holistic approach to data protection compliance, and indeed it has to. If it did not, there would have to be a formulaic, universal manner in which an organisation had to display compliance.
We know that under Article 24 GDPR controllers are required to implement technical and organisational measures to ensure that processing is carried out in accordance with the GDPR and to be able to demonstrate such compliance. In order to meet this requirement, controllers must make an assessment of (1) the nature, scope, context and purposes of the processing and (2) the risks of varying likelihood and severity for the rights and freedoms of natural persons.
TTL contended, therefore, that there was sufficient leeway within the confines of the Article to allow some discretion as to what was an appropriate measure. It further argued that the appropriateness of the measures adopted must be informed by an assessment of the context and purposes of processing, as well as the risks (if any) which may result from the processing. As explained above, TikTok’s stated mission is to inspire creativity and bring joy, and the core purpose of the platform during the Relevant Period was to enable users to disseminate their own content and to show users content they were likely to find of interest.
TikTok then moved to its position in relation to Article 25, stating: “Article 25(1) GDPR does not solely focus on user controlled settings as a technical measure but also addresses technical measures more broadly (including ones that are not user controlled) and organisational measures. As such, TikTok as a data controller is afforded autonomy and appropriate latitude in determining the specific designs of its product.” The measures to be adopted under Article 25(1) GDPR should be commensurate with the risks posed by the processing, and those risks should be weighed by their likelihood and severity.


TTL then looked to opinions from the EDPB to bolster its argument: “The European Data Protection Board (“EDPB”) Article 25 Data Protection by Design and Default Guidelines (“Article 25 Guidelines”) recognise that Article 25(1) GDPR does not envisage a one-size fits all approach to data protection.” The EDPB Article 25 Guidelines further state “[w]hen performing the risk analysis for compliance with Article 25, the controller has to identify the risks to the rights of data subjects that a violation of the principles presents and determine their likelihood and severity in order to implement measures to effectively mitigate the identified risks.”
Article 25(2) states, among other things that “the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed”. Article 25(2) requires that, by default, only the personal data that is necessary for each specific purpose is processed. It is the responsibility of the controller to define the purpose of the processing, and by doing so, the controller also determines the scope of the processing required for that particular purpose. Article 25(2), therefore, requires implementing default settings to processing that is necessary to achieve the controller’s purpose. Article 25(2) is not prescriptive as to the type of technical and organisational measures that must be implemented to ensure data protection by default. The EDPB has recognised that a range of different measures, including enabling data subjects to intervene in the processing, could be involved “depending on the context and risks associated with the processing in question”.

The context of the processing is central to the consideration as to what measures are appropriate in the given circumstances and to what extent they will implement data protection principles effectively. In particular, Article 25(2) does not require controllers to choose default settings which would subvert the core functionalities of their service. In order to comply with Article 25(1) GDPR, controllers are asked to weigh a multitude of broad and abstract concepts, assess the risks, and then determine “appropriate” measures. Each of these elements is opaque and open to interpretation, and as a result, no two assessments performed in accordance with Article 25 will look the same. Article 25(1) requires “appropriate” measures, which when applied to age verification would mean that a controller is required to implement measures to determine the age of users with an appropriate, rather than absolute, level of certainty.


Article 25(1) GDPR does not prescribe the appropriate technical and organisational measures designed to implement the data protection principles (including the data minimisation principle) that organisations are required to put in place. Controllers are similarly afforded autonomy and appropriate latitude under Article 25(2) GDPR in determining the appropriate measures for ensuring privacy by default. The European Data Protection Board (“EDPB”) in its Article 25 Data Protection by Design and by Default Guidelines (“Article 25 Guidelines”) explains that being appropriate means that “the measures and necessary safeguards should be suited to achieve the intended purpose, i.e. they must implement the data protection principles effectively” and that “the controller must verify the appropriateness of the measures for the particular processing in question”.
Further, in considering whether the measures put in place by TikTok complied with Article 25(1) GDPR, account must be taken, in particular, of the “context and purposes of processing”. In this regard, full consideration must be given to the benefits of the relevant features to Users and their importance to the core purpose of TikTok during the Relevant Period as described above, which would have informed younger Users’ expectations, and the measures and privacy settings designed to safeguard younger Users.
All of these submissions then had to be taken into account by the DPC.
The first finding by the DPC:
During the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. In this regard, the DPC was of the view that TTL failed to implement appropriate technical and organisational measures to ensure that, by default, only personal data which were necessary for TTL’s purposes of processing were processed. In particular, this processing was performed to a global extent, and in circumstances where TTL did not implement measures to ensure that, by default, the social media content of Child Users was not made accessible (without the user’s intervention) to an indefinite number of natural persons. The DPC held that this processing by TTL was contrary to the principle of data protection by design and by default under Articles 25(1) and 25(2) GDPR, and contrary to the data minimisation principle under Article 5(1)(c) GDPR.

The second finding by the DPC:
During the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. The above processing posed severe possible risks to the rights and freedoms of Child Users. In circumstances where TTL did not properly take into account the risks posed by the above processing, the DPC took the position that TTL did not implement appropriate technical and organisational measures to ensure that the above processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR.
The third finding by the DPC:
During the Relevant Period, TTL implemented a platform setting, called ‘Family Pairing’, whereby a non-Child User could pair their account to that of a Child User. This setting allowed the non-Child User to enable direct messages for Child Users above the age of 16, and the processing involved posed severe possible risks to the rights and freedoms of Child Users. In circumstances where the processing did not ensure appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures, and where TTL failed to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, the DPC took the view that this processing was not performed in accordance with the GDPR, contrary to Articles 5(1)(f) and 25(1) GDPR (at p. 42).
The DPC also made an assessment and consideration of matters concerning age verification pursuant to Articles 24 and 25 GDPR.
The fourth finding by the DPC:
During the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. This processing posed severe possible risks to the rights and freedoms of Child Users, and also to the rights and freedoms of children under the age of 13 who gained access to the platform. In circumstances where TTL did not properly take into account the risks posed by the processing to children under the age of 13, the DPC was of the view that TTL did not implement appropriate technical and organisational measures to ensure, and to be able to demonstrate, that the processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR.
The DPC also examined the transparency requirements under Articles 5, 12 and 13 GDPR. We will not go through all of TTL’s submissions on these here; instead, we will conclude with the transparency obligations and the remaining DPC findings.
The first transparency obligation for consideration was whether Child Users were appropriately made aware (in a concise, transparent, intelligible and easily accessible form, using clear and plain language) by TTL, as users of the TikTok platform, of the various public and private account settings in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, whether public or private.
The second transparency obligation for consideration was whether Child Users were appropriately made aware by TTL, as users of the TikTok platform, of the public default setting in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, and specifically that their profile would be defaulted to public.
Finding 5:
In circumstances where TTL did not provide Child Users with information on the recipients or categories of recipients of their personal data, the DPC found that TTL had not complied with its obligations under Article 13(1)(e) GDPR.
In circumstances where TTL did not provide Child Users with information on the scope and consequences of the public-by-default processing (that is, operating a social media network which, by default, allows the social media posts of Child Users to be seen by anyone) in a concise, transparent and intelligible manner and in a form that is easily accessible, using clear and plain language, in particular insofar as the very limited information provided did not make it clear at all that this would occur, the DPC found that TTL had not complied with its obligations under Article 12(1) GDPR.
Finding 6:
For the reasons established by the EDPB in the Article 65 Decision, TTL infringed the principle of fairness pursuant to Article 5(1)(a) GDPR.
TTL was ordered to bring its processing into compliance, received a reprimand and was fined a total of €345 million.
We have provided a synopsis of the main issues in this case, but it is impossible to do full justice to the effort on the part of the DPC to uphold and vindicate the rights of data subjects, often achieved despite the criticism levelled at the office from the usual suspects. It should, however, give readers a glimpse into the complexity of some of the cases that land on the desk of the DPC.

GDPRXpert.ie offers a comprehensive data protection consultancy service with particular emphasis on the onerous responsibilities placed on organisations under the GDPR.

Patrick Rowland for GDPRXpert.ie

www.gdprxpert.ie

Time to get security of processing up to GDPR standard

There is no doubt that the office of the DPC has moved from a GDPR guidance mode to a GDPR enforcement mode. This is hardly surprising, considering the GDPR has now been in effect for over four years. The shift in focus is partly related to the reality that many high-profile investigations are largely complete, notwithstanding the likelihood of future appeals that will take up more time. GDPR and data protection law experts GDPRXpert.ie are aware of increased contact from the office of the DPC with organisations of varying size. In general, these contacts are the result of complaints from members of the public, who now seem more knowledgeable about their rights under the GDPR and general data protection law. Undoubtedly, there has been a consequential increase in complaints to the DPC, most especially emanating from individuals fearing breaches of their rights. An intense and sustained GDPR awareness-building programme by the DPC has been very successful, with the result that individuals are very knowledgeable concerning their GDPR rights and the responsibilities and obligations of data controllers. Anecdotal evidence suggests that security of personal data processing has been a consistent element of complaints.

The very essence of any right to data protection is that a mechanism must be provided to ensure a data subject’s personal data are adequately protected. The GDPR takes a proactive, risk-based approach, forcing the data controller to implement measures in order to minimise risks to personal data. Article 24(1) GDPR requires the controller to implement appropriate organisational and technical measures to ensure, and to demonstrate, compliance with the GDPR; it also stipulates constant review and updating. Data breaches rank high on the risk scale. Article 32(2) GDPR stipulates that, in assessing the appropriate level of security, account shall be taken in particular of the risks presented by processing, especially “from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed”. These risks mandate a level of security appropriate to the harm that might result.

Recital 75 GDPR lays out a comprehensive summation of the risks that the GDPR hopes to manage. Such a list cannot be exhaustive, as data processing technology is changing and evolving all the time. We know the onus is on the controller (from Article 24 GDPR) to determine the risks of varying likelihood and severity for the rights and freedoms of natural persons. Various risks to rights and freedoms may result from data processing that leads to physical, material or non-material damage. As referred to earlier in the text, Recital 75 sets out a mixture of potential ultimate consequences. These include, in particular, processing that may give rise to discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage. There is the potential for deprivation of many rights and freedoms and a loss of control over one’s own data, and processing may reveal all sorts of sensitive data, including health, religion, political opinions, racial or ethnic origin, genetic data and more.

Determining these risks is aided by reference to the nature, scope, context and purposes of the processing. These criteria, in conjunction with knowledge of the available technology and the cost of implementing up-to-date technology, aid the controller in deciding what is ‘an appropriate level’ of security. The level of security must also be balanced, in the sense that it is proportionate to the perceived and assessed risks to the rights and freedoms of natural persons. As the risk assessment determines the appropriate level of security to be incorporated into the process, it must be of high quality and leave nothing to chance. Two very basic requirements must be met:

  • Any assessment must be objective; and it must
  • Take account of the likelihood and severity of the risk; this has to include an assessment of the risks that arise from the personal data processing itself, and any risks that would arise in the case of an actual data breach (a minimal scoring sketch follows).
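As a minimal sketch of that second requirement, a simple likelihood-times-severity scoring might look like the following. The three-point scales and thresholds are our own illustration; the GDPR prescribes no particular scale:

```python
# Illustrative risk scoring: likelihood x severity on simple 1-3 scales.
# The GDPR prescribes no particular scale; these values are our own example.
LIKELIHOOD = {"remote": 1, "possible": 2, "likely": 3}
SEVERITY = {"minor": 1, "significant": 2, "severe": 3}


def risk_level(likelihood: str, severity: str) -> str:
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"


# A breach that is likely and severe demands the strongest measures.
print(risk_level("likely", "severe"))   # high
print(risk_level("remote", "minor"))    # low
```

The point of scoring is only to make the assessment repeatable and defensible; the chosen measures must then be proportionate to the resulting level.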

Any objective assessment must at the start incorporate an analysis of the characteristics of the processing activity. Account needs to be taken of, for example:

  • The origin of the data;
  • The nature of the processing activity;
  • The scope of the processing activity;
  • The context and purpose of the processing;
  • The likelihood and severity of the risks; and
  • The identification of the best technical and organisational measures to mitigate those risks.

Looking at all these aspects will help the data controller establish the level of risk involved in all data processing operations. Best practice dictates that a prudent controller maintains a record detailing processing operations, associated risks (risk levels will be assessed later) and the measures taken to address the risks identified. The objective risk assessment also forms part of the required information for the record of processing activities under Article 30 GDPR. Take the example below:

Processing operation: external payroll for employees’ wages.
Associated risk: the payroll service might have lax data protection procedures.
Measures taken to mitigate risk: an assessment of the payroll service found high data protection security in place.
Conclusion: no apparent risk.
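A controller might keep such a record in structured form. The following sketch mirrors the example above; the field names are our own and purely illustrative:

```python
from dataclasses import dataclass


@dataclass
class RiskRegisterEntry:
    """One row of an illustrative processing/risk record, kept alongside the
    Article 30 GDPR record of processing activities."""
    processing_operation: str
    associated_risk: str
    mitigating_measures: str
    conclusion: str


entry = RiskRegisterEntry(
    processing_operation="External payroll for employees' wages",
    associated_risk="Payroll service might have lax data protection procedures",
    mitigating_measures="Assessment of payroll service found high security in place",
    conclusion="No apparent risk",
)
print(entry)
```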

Somewhat ironically, the best way of assessing risks is to look at the actual causes of data breach incidents. Data controllers strive to avoid data breaches, but the fact is that breaches provide the best evidence of the risks inherent in some processing operations. An assessment of these characteristics should help establish whether a particular data processing operation involves any risk and, if there is a risk, whether it is high or low in nature.

The Regulation recognises these risks and, in Article 32(1), places responsibility on the controller and the processor to implement appropriate technical and organisational measures to secure personal data. The GDPR deliberately does not define which specific technical and organisational measures are considered suitable in each case, in order to accommodate individual factors. However, it gives the controller a catalogue of criteria to be considered when choosing methods to secure personal data: the state of the art, implementation costs, and the nature, scope, context and purposes of the processing. In addition to these criteria, one always has to consider the severity of the risks to the rights and freedoms of the data subject and how likely those risks are to manifest. This boils down to the following: the higher the risks involved in the data processing, and the more likely they are to manifest, the stronger and more numerous the security measures that have to be taken.

Data controllers and data processors are also obliged to ensure that their staff and “other persons at the place of work” are aware of security measures and comply with them. The legal obligation to keep personal data secure applies to every data controller and data processor, regardless of size. While most of the information below applies to any such organisation, some aspects will only apply to a larger organisation using an IT network.

Below are issues that data controllers and data processors should consider when developing their security policies. (Much of the information in the next few sections is drawn from the DPC guidance note ‘Guidance for Controllers on Data Security’, available at www.dataprotection.ie.)

Access control

A data controller has a duty to limit access to personal data on a “need to know” basis. Greater access limitations or controls should apply to more sensitive data. A data controller must be aware of the different users who access their systems/records and their requirements. The different types of users could include:

  • staff at various seniority, operational or responsibility levels;
  • third party contractors/data processors;
  • customers; and
  • business partners.

The different requirements of each of these types of users have to be considered. Rather than a ‘one size fits all’ approach, access privileges should be tailored directly to each user’s requirements. The nature of the access allowed to an individual user should be set and reviewed on a regular basis. It should go without saying that individual staff members should, among other things, only have access to data which they require in order to perform their duties. Shared credentials (multiple individuals using a single username and password) should not be tolerated. Specific procedures, sometimes referred to as a “movers, leavers and joiners” policy, are required in all organisations with access to personal data, to decide when to maintain, increase or restrict previous access where a user’s role changes. Access control must be supported by regular reviews to ensure that all authorised access to personal data is strictly necessary and justifiable for the performance of a function. A minimal sketch of such a ‘need to know’ check follows.
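By way of illustration only, the sketch below assumes a simple role-to-permission mapping of our own invention; real systems would typically delegate this to a directory or identity-management service:

```python
# Illustrative role-based access control: each role sees only what it needs.
ROLE_PERMISSIONS = {
    "hr_staff": {"employee_records"},
    "contractor": {"project_files"},
    "customer_support": {"customer_contact_details"},
}


def can_access(role: str, data_category: str) -> bool:
    """Deny by default: access is granted only if the role's duties require it."""
    return data_category in ROLE_PERMISSIONS.get(role, set())


assert can_access("hr_staff", "employee_records")
assert not can_access("contractor", "employee_records")  # need-to-know enforced
```

The key design choice is deny-by-default: an unknown role or an unlisted data category yields no access, which mirrors the “strictly necessary and justifiable” standard above.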

IT administrator accounts with unrestricted access to personal data warrant special attention. Policies should be in place regarding the vetting and oversight of the staff members allocated these accounts. A staff member with such responsibilities should have separate user and administrator accounts. Multiple independent levels of authentication may be appropriate where administrators have advanced or extra access to personal data, or where they have access to or control of others’ account or security data.

All organisations, big and small, must guard against the potential downloading of personal data from the organisation’s own systems; this has to be strictly controlled. Such downloading can be blocked by technical means (disabling drives, isolating network areas or segments, etc.). Many organisations have taken the decision to block access to USB ports, having examined the inherent risks involved in leaving such ports open by default for all users.

Access authentication

Users should have a unique identifier, such as a password, passphrase, smart card, or other token, to allow access to personal data. These are just examples, not an exhaustive list; for example, a biometric (e.g. a fingerprint, voice or retina scan) can also be used as a unique identifier. However, as biometrics in themselves raise serious data protection and privacy issues, their use should only be considered where other authentication methods are demonstrably insufficient.

Passwords

A password is a word or string of characters. A strong password should include a minimum of twelve characters (the longer the password, the harder it is for a computer to guess) and may contain one or more of the following:

  • letters (upper and lower case);
  • symbols (e.g. &, *, @, €, $, etc.);
  • numbers (0–9); and
  • punctuation (?, “, !).

However, users should not be required to use a mix of many types of character, as a strong password can be created using only one type of character (e.g. letters) once it is sufficiently long and hard to guess (for computers as well as people). Passwords should be reasonably easy for the user to remember but very difficult for anyone else to guess. Examples might include:

  • M1_s?n, “The_^v1at#r”! (based on ‘My son, “the aviator”!’ with random characters replacing certain vowels or other letters)
  • Te@m5Rb@dp@55word5 (based on ‘Teams are bad passwords’ with numbers and symbols replacing certain letters). Please do not use these examples as actual passwords!

Passwords should not contain values known to be commonly used or expected in passwords, or values which have been compromised. For example, users might be prevented from using passwords which include, but are not limited to, the following (a minimal password-check sketch follows this list):
  • Passwords obtained from previous breaches;
  • Dictionary words;
  • Repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’);
  • Context-specific words, such as the name of the service, the username, or derivatives thereof.
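A minimal sketch of such checks in code might look like this; the deny-list here is a tiny stand-in for a real breach-derived list, and the service name is a placeholder:

```python
# Illustrative password checks along the lines described above.
DENYLIST = {"password", "123456", "qwerty"}  # stand-in for a breach-derived list


def is_acceptable(password: str, username: str, service: str = "example") -> bool:
    p = password.lower()
    if len(password) < 12:                      # length matters most
        return False
    if p in DENYLIST:                           # known-compromised or common values
        return False
    if username.lower() in p or service in p:   # context-specific words
        return False
    if len(set(password)) <= 2:                 # repetitive characters, e.g. 'aaaaaa'
        return False
    return True


assert not is_acceptable("123456", "alice")
assert is_acceptable('My son, "the aviator"!', "alice")
```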

Passphrases

Passphrases are similar to passwords, but represent a sentence or sequence of words. They should include twenty characters or more and may also include symbols, numbers and punctuation marks, e.g.

  • “I Love the musical, The Sound of Music 2!”
  • Ilike2swim@thelocalswimingpool

Data controllers should enforce password complexity and length, for example through rules that ensure that weak passwords and reused passwords are rejected. Users should not be required to change their password or passphrase arbitrarily (e.g. too frequently), as this can actually reduce password security (for example, by increasing reliance on simple passwords or on password reuse). However, users should be required to change their password or passphrase if there is evidence it has been compromised or revealed, or when there is some other change in risk.

Data controllers should never store users’ passwords as plain text, but should use strong and irreversible cryptographic hashing and salting to protect them and to allow secure checking for login purposes. Data controllers should ensure that users are made aware that their password/passphrase is unique to them and must not be disclosed to anyone else. Shared credentials (where multiple users use the same login and password) should never be permitted. Vendor-supplied defaults for system passwords and other security parameters should never be left in place. Data controllers must ensure that partner organisations with access to their systems or personal data respect these controls. Where possible, data controllers should promote password diversity by reminding users of the risks associated with password reuse across other internet services.
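As a sketch of salted, irreversible hashing using only Python’s standard library (the iteration count is our own choice; it should be set to suit current guidance):

```python
import hashlib
import hmac
import os


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash). The salt is random per user; the hash is one-way."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest


def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(digest, expected)  # constant-time comparison


salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("wrong guess", salt, stored)
```

Because the stored value cannot be reversed, a leak of the password table does not directly reveal users’ passwords, and the per-user salt defeats precomputed lookup tables.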

Multi-Factor Authentication

Multi-factor authentication (MFA) refers to there being more than one identity factor employed for access authentication. A commonly used option in many services is ‘2FA’, which means that two factors for authentication are used. For example, instead of just using a password of their choosing, a user may have a second factor such as a biometric (e.g. a fingerprint scanner), or an “out-of-band” or alternative communication channel sending a passcode to a secondary email address, phone number or device. It should be noted, however, that some of these secondary channels are more secure than others. Devices such as smart cards or tokens, as well as standalone mobile apps, can be used as part of MFA to provide authentication, either by generating a code to be entered or by containing a chip that authenticates with the system being accessed. They may generate a PIN that is valid for a very short period of time; this is used in conjunction with a username and password to authenticate the user, and can reduce the risk of ‘brute force’ password attacks or attacks where passwords have been stolen.
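As an illustration of how such a short-lived code can work, the widely used time-based one-time password (TOTP) scheme (RFC 6238) can be sketched with the standard library alone; the shared secret below is a dummy value:

```python
import base64
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): a short-lived code derived
    from a shared secret and the current 30-second time step."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    mac = hmac.new(key, msg, "sha1").digest()
    offset = mac[-1] & 0x0F                                     # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# Server and authenticator app share the secret once, then compute codes
# independently; a stolen password alone is no longer enough to log in.
print(totp("JBSWY3DPEHPK3PXP"))
```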

Automatic Screen Savers

Most systems allow for screensavers to activate after a period of inactivity on a computer, requiring a password to re-establish access. This automatic lock activation is useful as the alternative manual locking of a workstation requires positive action by the user every time he/she leaves the computer unattended. Regardless of which method an organisation employs, computers should be locked when unattended. This applies not just to computers in public areas, but to all computers. It is pointless having an access control system in place if unattended computers may be accessed by any staff member, or where a shared password is used.

Encryption

Encryption is explicitly mentioned as one possible technical and organisational measure to secure data in the (non-exhaustive) list in Article 32(1) GDPR. Again, the GDPR does not mandate specific encryption methods, in order to accommodate fast-paced technological progress. When choosing a method, one must also apply the criteria catalogue above. To answer the question of what is currently considered “state of the art”, data protection officers usually rely on the definitions set out in information security standards like ISO/IEC 27001 or other national IT security guidelines.

Companies can reduce the probability of a data breach, and thus reduce the risk of fines in the future, if they choose to encrypt personal data. The processing of personal data is naturally associated with a certain degree of risk, especially nowadays, when cyber-attacks are nearly unavoidable for companies above a given size. Risk management therefore plays an ever larger role in IT security, and data encryption is suited, among other means, to these companies. In general, encryption refers to the procedure that converts clear text into unreadable ciphertext using a key, where the information only becomes readable again by using the correct key. This minimises the risk of an incident during data processing, as encrypted content is essentially unreadable to third parties who do not have the correct key. Encryption is the best way to protect data during transfer and one way to secure stored personal data. It also reduces the risk of abuse within a company, as access is limited to authorised people with the right key. As with passwords, this measure is pointless unless the key to decrypt the data is kept secure.
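A minimal sketch of symmetric encryption using the third-party Python ‘cryptography’ package (one choice among many; key management is deliberately simplified here):

```python
# pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # in practice, store this in a key vault,
f = Fernet(key)               # never alongside the encrypted data

token = f.encrypt(b"employee payroll record")    # ciphertext, unreadable without key
assert f.decrypt(token) == b"employee payroll record"

try:
    Fernet(Fernet.generate_key()).decrypt(token)  # wrong key: decryption fails
except InvalidToken:
    print("unreadable to third parties without the correct key")
```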

Encryption of personal data has additional benefits for controllers and processors. For example, the loss of a state-of-the-art encrypted mobile storage medium holding personal data is not necessarily considered a data breach which must be reported to the data protection authorities. In addition, if there is a data breach, the authorities must positively consider the use of encryption in deciding whether, and in what amount, a fine is imposed, as per Article 83(2)(c) GDPR.

Anti-Virus Software

Anti-virus software is required not only to prevent infection from the internet (whether email or web-sourced) but also to prevent viruses that may be introduced from portable devices, such as memory sticks (the use of which should be strictly limited). No anti-virus package will prevent all infections, as packages are only updated in response to known threats. It is essential that such software is updated on a regular basis and that policies support vigilance in regard to potential threats. A policy of not opening email attachments from unexpected sources can be a useful way of preventing infection.

Firewalls

A firewall is essential where there is any external connectivity, either to other networks or to the internet. It is important that firewalls are properly configured, as they are a key weapon in combating unauthorised access attempts. The importance of firewalls has increased as organisations and individuals avail of “always-on” internet connections, exposing themselves to a greater possibility of attack.

Software Patching

Patches are the latest updates from the creator of your operating system or application software. They usually contain fixes for potential security concerns and can be an important tool in preventing hacking or malware attacks. Organisations should ensure that they have regular, consistent and comprehensive patch management procedures in place. Where possible, it is good practice to install the latest patches in a test environment first, to ensure that they do not create other issues with your systems. A record should also be kept of the date of installation and the patch installed on each system.

Remote Access

Where a staff member or contractor is allowed to access the network from a remote location (e.g. from home or from an off-site visit), such access creates a potential weakness in the system, not least when access is via a wireless network. For this reason, the need for such access should be properly assessed, and security measures reassessed, before remote access is granted. If feasible, access should be limited to specific IP addresses. Security should be the first consideration in granting access to partner organisations.

Technical security measures, security assessments, contractual agreements in line with the requirements of the GDPR and the Data Protection Act 2018, and agreed standards for the management of shared assets are all important aspects of managing this risk. It is the responsibility of the data controller to ensure that, regardless of the means by which a user remotely accesses its system, the security of the system cannot be compromised. Multi-factor authentication for such access should be considered in this context.
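Where access is limited to specific IP addresses, a minimal allow-list check can be sketched with Python’s standard library; the network ranges below are illustrative documentation addresses, not real ones:

```python
import ipaddress

# Illustrative allow-list: the office network and one vetted partner address.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # documentation/example range
    ipaddress.ip_network("198.51.100.7/32"),
]


def remote_access_permitted(client_ip: str) -> bool:
    """Deny by default; permit only connections from pre-approved addresses."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)


assert remote_access_permitted("203.0.113.42")
assert not remote_access_permitted("192.0.2.1")
```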

Wireless Networks

Access to a server by means of a wireless connection can expose a network to attack. The physical environment in which such systems are operated may also be a factor in determining whether weaknesses in the system security exist. As with remote access, wireless networks should be assessed on security grounds rather than solely on apparent ease of use. Data controllers must ensure that adequate security is in place on the network through, for example, appropriate encryption measures or specification of authorised devices.

Particular vulnerabilities are associated with the use of third party unsecured WiFi networks (e.g. those provided in airports, hotels, etc.). A device using such a network may be open to attacks from other machines on the network. A good firewall should be installed on the portable device to prevent such attacks. The device should only connect to the network when necessary. When using unsecured WiFi to transmit personal or sensitive data, a secure web session should be in place to protect the data.

Portable Devices

Laptops, USB keys, smartphones, and other forms of portable device are especially vulnerable to theft and accidental loss. Where a data controller considers it essential to store personal data on a portable device, the device should be encrypted. Whole-disk encryption should be used to guard against files being stored outside an encrypted segment of the disk.

In the case of smartphones, a strong password should be required at start-up and also after several minutes of inactivity. When such a device is lost, steps should be taken immediately to ensure that the remote memory-wipe facility is activated. Staff allocated such devices should be familiar with the relevant procedures.

Logs and Audit Trails

Access control systems and security policies are undermined if the system cannot identify abuses. Consequently, a system should be able to identify the user name that accessed a file and the time of the access. A log of alterations made, along with the author/editor, should also be created.

Logs and audit trails can help in the effective administration of the security system and can deter staff members tempted to abuse the system. Staff should be informed that logging is in place and that user logs are regularly reviewed. Monitoring processes should focus not only on networks, operating systems, intrusion detection systems and firewalls, but should also include remote access services, web applications and databases. Logging systems can generate large volumes of information, so an automated means, such as a Security Information and Event Management (SIEM) system, to filter audit trail entries and alert security staff to irregular ones may assist in their effective use.
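By way of illustration, here is a minimal, hypothetical sketch in Python of an audit-trail entry recording who accessed a file and when, together with the kind of crude rule a SIEM might apply to flag irregular entries. The file name, fields and office-hours threshold are all illustrative assumptions:

    import datetime
    import json

    def record_access(user, filename, action, log_path="audit.log"):
        # One JSON line per event: who did what to which file, and when.
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user,
            "file": filename,
            "action": action,  # e.g. "read", "edit", "delete"
        }
        with open(log_path, "a") as fh:
            fh.write(json.dumps(entry) + "\n")

    def flag_out_of_hours(entries, start_hour=8, end_hour=18):
        # A crude SIEM-style rule: surface any access outside office hours.
        return [
            e for e in entries
            if not start_hour
            <= datetime.datetime.fromisoformat(e["timestamp"]).hour
            < end_hour
        ]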

An intrusion detection system (IDS) acts as an internal alarm system that monitors and reports on malicious activities on a network or system. Such systems also aim to detect attacks that originate from within the system. Any organisation processing large volumes of personal data should have an IDS deployed and activated. Where alerts/events are generated by any such system, there must be a meaningful process in place to examine them in a timely fashion, so as to identify unusual activity and take immediate corrective action if there is an ongoing breach of security.

Back-Up Systems

A back-up system is an essential means of recovering from the loss or destruction of data. While some form of back-up system should be in place, the frequency and nature of back-ups will depend, amongst other factors, on the type of organisation and the nature of the data being processed. The security standards for back-up data are the same as for live data.

Incident Response Plans

Even with the best designed systems, mistakes can happen. As part of a data security policy, an organisation should anticipate what it would do if there were a data breach so that it can be ready to respond. Some questions you might ask yourself:

What would your organisation do if it had a data breach incident?

  • Have you a policy in place that specifies what a data breach is? (It is not just lost USB keys/disks/laptops. It may include any loss of control over personal data entrusted to organisations, including inappropriate access to personal data on your systems, or the sending of personal data to the wrong individuals).
  • How would you know that your organisation had suffered a data breach? Does the organisation’s staff (at all levels) understand the implications of losing personal data?
  • Has your organisation specified whom staff tell if they have lost control of personal data?
  • Does your policy make clear who is responsible for dealing with an incident?
  • Does your policy cover the requirements of mandatory breach reporting (where applicable) under the Data Protection Act 2018, the GDPR, and/or the ePrivacy Regulations (SI 336/2011) (including new availability and resilience requirements)?

 

The Human Factor

No matter what technical or physical controls are placed on a system, the most important security measure is to ensure that staff are aware of their responsibilities. Passwords should not be written down and left in convenient places; passwords should not be shared amongst colleagues; and unexpected email attachments should not be opened unless first screened by anti-virus software. Effective employee training about the risks of data compromise, their role in preventing it and how to respond in the event of problems can be a very effective line of defence. Many organisations set security policies and procedures but fail to implement them consistently. Running scenario-based training sessions may assist in effective training.

 

Controls focused on individual and organisational accountability and ensuring that policies are carried out are an important part of any system designed to protect personal data. Identify essential controls first and ensure that these controls are implemented across the organisation without exception. Once this is in place, move on to more advanced controls designed to mitigate the risks specific to the organisation and the type(s) of data processed.

Data controllers must have procedures in place to manage staff turnover, including retrieval of data storage devices and quick removal of access permissions.

Many data breaches have a very avoidable cause. It is wise to start by looking at the simple things before focusing on the more complex.

GDPRXpert.ie are GDPR and data protection law consultants offering expert advice on all aspects of data protection law compliance. Remember what we have just stated: the DPC has moved from a guidance-and-education mode to enforcement.

Patrick Rowland at www.gdprxpert.ie

Personal Data Breaches and the data controller.

There are very few organisations that will not, at some stage in the business relationship, encounter some form of personal data breach to which the data controller will have to respond. Preparing for, and anticipating, a breach are the proactive parts. Encountering an actual breach is only the start. An active response must be diligent and prudent, and must include an integral and strategic risk assessment leading to mitigation of those risks in a timely manner. GDPR and data protection consultants, GDPRXpert.ie, have advised extensively on data breaches.

 

Art. 4(12) GDPR defines a personal data breach as “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. Recital 85 GDPR warns that a breach may, if not addressed in an appropriate and timely fashion, result in physical, material or non-material damage.

Examples abound, but include:

  • financial loss (high risk);
  • identity theft (high risk);
  • damage to reputation (high risk);
  • loss of confidentiality of personal data protected by professional secrecy;
  • fraud and other economic or social disadvantages.

There are three broad types of breach, first outlined by the Art. 29 Working Party. The Working Party was the European Union’s data protection advisory group prior to the GDPR and the establishment of the European Data Protection Board (EDPB).

  1. The ‘confidentiality breach’, where personal data are disclosed or accessed in an unauthorised or unlawful manner;
  2. The ‘integrity breach’, where personal data are altered in an unauthorised or unlawful manner;
  3. The ‘availability breach’, where personal data are lost or destroyed in an unauthorised or accidental manner.

Looking at something like “accidental or unlawful destruction” of personal data, one discovers that the data no longer exist or, if they exist at all, no longer exist in a form accessible to the data controller. This is consistent with an ‘availability breach’.

In a “loss” scenario the data controller lacks control of, or no longer has access to, the data. Think of ransomware with data encrypted, or the loss of an encryption key. The personal data are no longer in the possession of the controller at all, and so there is an ‘availability breach’.

With “alteration” of data the integrity of the data has been compromised, hence an ‘integrity’ breach.

“Unauthorised disclosure of, or access to, personal data” is most commonly seen where data are disclosed to recipients not authorised to receive such data and the result is a clear ‘confidentiality’ breach.

Whatever the type or form of breach, action is required in a timely fashion.

Under the GDPR, two primary obligations are placed upon the controller:

(a) notification of any personal data breach to the DPC, unless the data controller can demonstrate the breach is unlikely to result in a risk to data subjects; and

(b) communication of that breach to data subjects, where the breach is likely to result in a high risk to data subjects.

GDPRXpert.ie, acting as outsourced DPO, have conducted many data breach analyses. This experience leads to the solid conclusion that very few breaches are the same. However, one aspect that does not change is the breach notification procedure, which is clearly set out in the GDPR and the Data Protection Act 2018.

First of all, let’s look at the procedure where the breach is to be notified to the DPC.

Data Breach Notification to the DPC

The controller is compelled to notify the DPC of a personal data breach unless the breach is unlikely to result in a risk to the rights and freedoms of a natural person. GDPR and data protection consultants GDPRxpert.ie can attest to the tricky, subjective nature of the assessment of what is ‘unlikely’. Should the decision be that a risk is not unlikely, then the controller has to notify the DPC.

Notification to the DPC has to take place without undue delay and, where feasible, no later than 72 hours after the controller has become aware of the breach. Where the DPC is not notified within the 72-hour time frame, reasons for the delay must be given.
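The 72-hour clock is simple arithmetic once the moment of awareness is fixed. A trivial sketch in Python (the awareness timestamp is, of course, hypothetical):

    from datetime import datetime, timedelta, timezone

    # Hypothetical moment the controller became "aware" of the breach.
    aware_at = datetime(2024, 6, 3, 14, 30, tzinfo=timezone.utc)

    # Art. 33(1): notify the DPC without undue delay and, where feasible,
    # within 72 hours of becoming aware.
    deadline = aware_at + timedelta(hours=72)

    if datetime.now(timezone.utc) > deadline:
        print("Deadline passed: reasons for the delay must accompany the notification.")
    else:
        print(f"Notify the DPC by {deadline.isoformat()}.")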

Accountability requirements under Art. 5(2) GDPR will kick in, meaning that the controller, in the context of a data breach, will have to demonstrate compliance with the other principles of data processing, including Art. 5(1)(f), ‘integrity and confidentiality’, i.e. that the data have been “processed in a manner that ensures appropriate security of the personal data”.

On top of this, under Art. 33(5), controllers must document all information relating to the breach so that the DPC can have evidence of their compliance with the notification obligations under Art. 33.

A controller should be regarded as having become ‘aware’ of the breach when they have a reasonable degree of certainty that a security incident has occurred and compromised personal data. Don’t forget that, in order to comply with their obligations under Article 5(2) (the principle of accountability), as well as the requirement to record relevant information under Article 33(5), controllers should be able to demonstrate to the DPC when and how they became aware of a personal data breach.

Controllers, as part of their internal breach procedures, should have a system in place for recording how and when they become aware of a breach. Allied to this is the necessity of being able to show their methodology in assessing the potential risks posed by the breach: controllers need a coherent methodology to explain their decision-making.

N.B. This does not mean the DPC will accept the assessment as sound or reasonable, but the controller will, in all likelihood, be seen to have at least been acting in good faith.

The default position for controllers is that all data breaches should be notified to the DPC, except for those where the controller has assessed the breach as being unlikely to present any risk to data subjects.

The controller MUST show why they reached this conclusion.

 

Documentation should include the details of how the controller assessed the likelihood of risk and severity of risk to the rights and freedoms of the data subject.

In all situations of recognised breaches, even ones that do not require notification to the DPC, the legal onus is always on the controller to record at least the basic details of the breach, the assessment thereof, its effects, and the steps taken in response, as required by Article 33(5) GDPR.

(This is often forgotten by controllers and missed by many more. Be careful!)

To state the patently obvious: to know whether or not a breach is one that should be notified to the DPC, the controller must first be aware of the breach itself. Once aware of the breach, the clock is ticking. As we just touched on, before deciding whether there is a need to notify the DPC, the controller must make an adequate assessment of the risks posed by the breach.

This is not an exact science, but more a judgement process.

In this process there are some factors and particular aspects that demand scrutiny.

The assessment has to be made in the knowledge that there are risks that impact negatively not just on the rights to data protection and privacy, but often on many other rights, such as free speech and freedom of movement.

Factors that controllers should take into account when engaging in such an assessment include, but are not limited to:

  • the type and nature of the personal data (including whether it contains sensitive, or ‘special category’ personal data);
  • the circumstances of the personal data breach;
  • whether or not personal data had been protected by appropriate technical protection measures, such as encryption or pseudonymisation;
  • the ease of direct or indirect identification of the affected data subjects;
  • the likelihood of reversal of pseudonymisation or loss of confidentiality;
  • the likelihood of identity fraud, financial loss, or other forms of misuse of the personal data;
  • whether the personal data could be, or are likely to be, used maliciously;
  • the likelihood that the breach could result in, and the severity of, physical, material or non-material damage to data subjects;
  • and whether the breach could result in discrimination, damage to reputation or harm to data subjects’ other fundamental rights.

Once the controller has made the risk assessment and concluded there is a need to notify the DPC, the notification must at least do the following (a minimal data-structure sketch follows the list):

  1. describe the nature of the personal data breach, including, where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned;
  2. communicate the name and contact details of the data protection officer (DPO) or other contact point where more information can be obtained;
  3. describe the likely consequences of the personal data breach; and
  4. describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.
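Treated as a data structure, the minimum content of an Art. 33(3) notification might be sketched as follows. The field names are our own illustrative choices, not an official schema (the DPC’s notification form has its own fields):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class BreachNotification:
        # (1) Nature of the breach, with categories and approximate numbers.
        nature: str
        categories_of_data_subjects: list
        approx_data_subjects: Optional[int]
        approx_records: Optional[int]
        # (2) Contact point where more information can be obtained.
        dpo_contact: str
        # (3) Likely consequences of the breach.
        likely_consequences: str
        # (4) Measures taken or proposed, including mitigation.
        measures_taken_or_proposed: str
        # Recommended by the DPC: how and when awareness arose, and any delay.
        how_and_when_aware: str = ""
        reason_for_delay: str = ""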

To assist the DPC in assessing compliance with the requirement to notify ‘without undue delay’, as well as the principle of accountability, the DPC recommends that controllers include, in their initial notification, information on how and when they became aware of the personal data breach, along with an explanation for any delay, if applicable.

Where, and in so far as, it is not possible to provide all of the foregoing information at the same time, the information may be provided in phases without undue further delay.

 

Data Breach Notification to Data Subjects

As referenced earlier, there is also an obligation placed on controllers to notify the data subject of a data breach:

“where that personal data breach is likely to result in a high risk to the rights and freedoms of natural persons” (Art. 34(1) GDPR).

Where the risk is immediate and needs to be mitigated, prompt action is required in communicating with the data subject (see Recital 86).

The need to implement appropriate measures against continuing or similar personal data breaches may justify more time (Recital 86).

Where there is a need for notification to the data subject, Art. 34(2) mandates that the communication must describe in clear and plain language the nature of the personal data breach and contain at least (i.e. at a minimum) the information referred to in points (b), (c) and (d) of Art. 33(3).

Where a controller has not notified the data subject, the supervisory authority, having considered the likelihood of a high risk resulting from the breach, may either require the controller to communicate the breach or decide that any of the conditions (a), (b) or (c) of Art. 34(3), outlined below, has been met.

  1. The controller has implemented appropriate technical and organisational protection measures, such as rendering the data unintelligible to any person not authorised to access it, e.g. encryption;
  2. The controller has taken subsequent measures so that the high risk is no longer likely to materialise; and
  3. It would involve disproportionate effort to communicate directly with every data subject. Here a public communication suffices.

In a case where the controller deems it necessary to communicate the breach to the data subject, the controller will also be communicating it to the DPC.

This is on the logical basis that if a breach is ‘likely to result in a high risk to the rights and freedoms of natural persons’ (notify the data subject), then by implication the same breach cannot be ‘unlikely to result in a risk to rights and freedoms’ (so also notify the DPC).

If a breach is likely to pose a high risk, it can hardly be unlikely to pose a risk, which is a lower threshold than a high risk. Any ‘risk’, or ‘risk simpliciter’ as some like to call it, must be of a type that is lower than a ‘high risk’.

There is clearly a higher threshold for notification to the data subject.

Whilst there is no obligation on controllers to communicate a personal data breach to affected data subjects where it is not likely to result in a high risk to them, controllers are nevertheless free to communicate a breach to data subjects where it may still be in the data subjects’ interests, or otherwise appropriate, to do so in the context of that particular breach.

While the notification should be made to the data subject as soon as reasonably feasible, sometimes it may be advisable to delay notification.

A common example is where a controller is made aware that a criminal investigation may be pending and early notification may prejudice such an investigation. In this scenario, a delay on the advice of law enforcement authorities would be justifiable.

Once it becomes clear that notification is no longer prejudicial to an investigation, the data subject should be promptly informed.

We saw earlier that once the breach has been detected and the risks assessed, the controller may be obliged to notify the DPC and the data subject. This depends most of all on the conclusion reached after the risk assessment. We also looked at some factors to be taken into account when conducting that assessment. The risk assessment has to be an objective one, judging both the severity and the likelihood of the risks.

In the context of the ePrivacy Directive, the EU Agency for Network and Information Security (ENISA) produced recommendations for a methodology for assessing the severity of a data breach.

Within this methodology, three different factors are to be considered and scored.

Assessing the severity of the risk.

Factor A: the type of data that was breached, which can have a value of 1 to 4;

Factor B: the ease with which a data subject can be identified, which is assigned a value of 1 or lower;

Factor C: the specific circumstances of the breach, which can have a value of 0.5 or lower.

No assessment modality has yet been adopted for the GDPR, but this method is a useful guide to help quantify the severity of a risk.
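ENISA’s working paper combines the three factors into a single severity score: SE = A × B + C (in ENISA’s own labels, data processing context × ease of identification + circumstances of breach). A minimal sketch in Python; the banding thresholds are our reading of the ENISA methodology and should be verified against the original document:

    def breach_severity(data_type, ease_of_id, circumstances):
        # ENISA working formula: SE = DPC x EI + CB, where
        #   data_type     (factor A) scores 1-4,
        #   ease_of_id    (factor B) scores 1 or lower,
        #   circumstances (factor C) scores 0.5 or lower.
        se = data_type * ease_of_id + circumstances
        # Severity bands as we read them in the ENISA paper (verify before use).
        if se < 2:
            band = "low"
        elif se < 3:
            band = "medium"
        elif se < 4:
            band = "high"
        else:
            band = "very high"
        return f"SE = {se:.2f} ({band})"

    # Example: sensitive data (4), easily identified subject (1),
    # aggravating circumstances (0.5).
    print(breach_severity(4, 1, 0.5))  # SE = 4.50 (very high)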

Despite all this, any risk assessment remains more of an art than a science.

(Consequently, a key element of any data security policy is being able, where possible, to prevent a breach and, where it nevertheless occurs, to react to it in a timely manner).

 

Record Keeping Obligations.

Regardless of whether or not a breach needs to be notified to the supervisory authority, the controller must keep documentation of all breaches, as Article 33(5) explains: “The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.” Therefore, controllers must bear in mind that the onus is on them to ensure that they continue to document how any personal data breaches that arise are dealt with.

This is linked to the accountability principle of the GDPR, contained in Article 5(2). The purpose of recording non-notifiable breaches, as well as notifiable breaches, also relates to the controller’s obligations under Article 24, and the supervisory authority can request to see these records. Controllers are therefore encouraged to establish an internal register of breaches, regardless of whether they are required to notify or not.
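An internal register need not be elaborate. A minimal, hypothetical sketch of an append-only register capturing the Article 33(5) elements (the file name and columns are our own choices, not a prescribed format):

    import csv
    import datetime
    import os

    REGISTER = "breach_register.csv"  # hypothetical file name
    COLUMNS = ["recorded_at", "facts", "effects", "remedial_action",
               "notified_dpc", "reason_if_not_notified"]

    def record_breach(facts, effects, remedial_action,
                      notified_dpc, reason_if_not_notified=""):
        # Append one row per breach, notifiable or not (Art. 33(5)).
        new_file = not os.path.exists(REGISTER)
        with open(REGISTER, "a", newline="") as fh:
            writer = csv.writer(fh)
            if new_file:
                writer.writerow(COLUMNS)
            writer.writerow([
                datetime.datetime.now(datetime.timezone.utc).isoformat(),
                facts, effects, remedial_action,
                "yes" if notified_dpc else "no",
                reason_if_not_notified,
            ])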

Whilst it is up to the controller to determine what method and structure to use when documenting a breach, in terms of recordable information there are key elements that should be included in all cases. As is required by Article 33(5), the controller needs to record details concerning the breach, which should include its causes, what took place and the personal data affected. It should also include the effects and consequences of the breach, along with the remedial action taken by the controller.

The old Art. 29 WP guidelines recommend that the controller also document its reasoning for the decisions taken in response to a breach. In particular, if a breach is not notified, a justification for that decision should be documented. This should include the reasons why the controller considers the breach unlikely to result in a risk to the rights and freedoms of individuals. Alternatively, if the controller considers that any of the conditions in Article 34(3) are met, then it should be able to provide appropriate evidence that this is the case.

(The conditions under Art. 34(3) are those that make notification to data subjects unnecessary, as seen earlier.)

Where the controller does notify a breach to the supervisory authority, but the notification is delayed, the controller must be able to provide reasons for that delay; documentation relating to this could help to demonstrate that the delay in reporting is justified and not excessive.

Where the controller communicates a breach to the affected individuals, it should be transparent about the breach and communicate in an effective and timely manner. Accordingly, it would help the controller to demonstrate accountability and compliance by retaining evidence of such communication.

To aid compliance with Articles 33 and 34, it would be advantageous to both controllers and processors to have a documented notification procedure in place, setting out the process to follow once a breach has been detected, including how to contain, manage and recover from the incident, as well as how to assess the risk and notify the breach.

 

In this regard, to show compliance with the GDPR it might also be useful to demonstrate that employees have been informed about the existence of such procedures and mechanisms and that they know how to react to breaches. It should be noted that failure to properly document a breach can lead to the supervisory authority exercising its powers under Article 58 and/or imposing an administrative fine in accordance with Article 83.

Much of the foregoing information is also available on this link at the DPC website.

The DPC has also recently updated its data breach notification form.

At the same site you will find useful tips on avoiding data breaches.

 

Here at GDPRXpert.ie we are GDPR and data protection consultants with vast expertise in helping businesses first fully recognise, and then properly react to, data breaches.

GDPRXpert.ie are located in Carlow/Kilkenny and Mayo, offering a nationwide service.

Call 0858754526 or 0599134259 to discuss your particular need.

Patrick Rowland, GDPRXpert.ie

Data Protection Impact Assessments: yes or no?

A Data Protection Impact Assessment (DPIA) is one of the most important tasks that, in certain circumstances, is prescribed under the GDPR. Non-compliance with DPIA requirements can lead to the imposition of fines by the DPC. Any reputable data protection consultancy should have qualified, certified and experienced data protection professionals available to carry out DPIAs on your behalf. At GDPRXpert.ie we routinely undertake DPIAs as part of our services. This service is available nationwide. Data protection consultants GDPRXpert.ie have found that even in cases where a DPIA is not mandatory, it is always an advisable course of action.

What is a data protection impact assessment?

A Data Protection Impact Assessment is a process specifically designed to identify, quantify and mitigate the risks involved in the processing operation. It does this primarily by assessing the necessity and proportionality of the processing and putting a strong emphasis on managing the risks to the rights and freedoms of all natural persons resulting from the processing of personal data. Therefore, an essential ingredient in any DPIA mix is a measured assessment of the risks to those rights and freedoms, and a determination of the appropriate measures to address them.

At the heart of the DPIA is its role as a conduit of accountability, one that enables controllers to comply with their requirements under the GDPR. By using this accountability tool a controller can demonstrate that all appropriate measures have been taken to ensure compliance with the Regulation. In essence, the DPIA is the building block used to construct and demonstrate compliance. Data protection consultants GDPRXpert.ie will provide the foundation for you to build a compliant business structure.

DPIA Content

The Article 29 Working Party elaborates on the details. The GDPR does not formally define the concept of a DPIA as such, but its minimum content is specified by Article 35(7) as follows:

“(a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;

(c) an assessment of the risks to the rights and freedoms of data subjects;

and (d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data, and to demonstrate compliance with this Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned”.

Recital 84 goes on to clarify the role in the following terms: “In order to enhance compliance with this Regulation where processing activities are likely to result in a high risk to the rights and freedoms of natural persons, the controller should be responsible for the carrying-out of a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk”.

The same Recital continues; “The outcome of the assessment should be taken into account when determining the appropriate measures to be taken in order to demonstrate that the processing of personal data complies with this Regulation”.

 

Is a Data Protection Impact Assessment Mandatory?

 

A DPIA is not mandatory for every personal data processing operation. Indeed, the risk-based approach inherent in the GDPR requires only that a DPIA be carried out when the processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)). Certainty is not required, but inherently high-risk processing should attract more scrutiny. Article 35(3) states a DPIA “shall in particular be required in the case of:

“(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;

(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10;

or (c) a systematic monitoring of a publicly accessible area on a large scale”.

The words ‘in particular’ in Art. 35(3) signify that the list is deliberately non-exhaustive. One practical consequence is that there will be cases that do not fall neatly into any ‘high risk’ category, yet still pose a quantifiably high risk. Making the assessment of whether a DPIA is mandatory in itself involves a risk assessment, a sort of mini-DPIA. What is ‘likely to result in high risks…’? How is the ‘high risk’ to be assessed?

Recital 84 places emphasis on evaluating the ‘origin, nature, particularity and severity of the risk.’ A general backdrop to the high risk potential includes aspects such as the nature, the context, the scope and the purposes of the processing. Prudent advice from the Art. 29 WP Guidelines is that where it is not clear whether a DPIA is required, a DPIA should nonetheless be carried out to help data controllers comply with data protection law.

Some Other Criteria for a DPIA.

There is then, what might be called, ‘an assessment before an assessment’. Art. 35(4) envisages the establishment of a list of processing operations that would guide controllers in their scrutiny of operations that may require a DPIA. Art. 29 WP lays out the relevant criteria to be considered in this regard:

  1. Evaluation or scoring, including profiling and predicting, especially from “aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements” (Recitals 71 and 91). Examples of this could include a bank that screens its customers against a credit reference database, or a biotechnology company offering genetic tests directly to consumers in order to assess and predict the disease/health risks, or a company building behavioural or marketing profiles based on usage or navigation on its website;
  2. Automated-decision making with legal or similar significant effect: processing that aims at taking decisions on data subjects producing “legal effects concerning the natural person” or which “similarly significantly affects the natural person” (Article 35(3)(a)). For example, the processing may lead to the exclusion or discrimination against individuals. Processing with little or no effect on individuals does not match this specific criterion;
  3. Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through “a systematic monitoring of a publicly accessible area” (Article 35(3) (c)). This type of monitoring is a criterion because the personal data may be collected in circumstances where data subjects may not be aware of who is collecting their data and how they will be used. Additionally, it may be impossible for individuals to avoid being subject to such processing in frequent public (or publicly accessible) space(s);
  4. Sensitive data: this includes special categories of data as defined in Article 9 (for example information about individuals’ political opinions), as well as personal data relating to criminal convictions or offences. An example would be a general hospital keeping patients’ medical records or a private investigator keeping offenders’ details. This criterion also includes data which may more generally be considered as increasing the possible risk to the rights and freedoms of individuals, such as electronic communication data, location data, financial data (that might be used for payment fraud). In this regard, whether the data has already been made publicly available by the data subject or by third parties may be relevant. Where personal data is publicly available, this aspect may be considered as a factor in the assessment if the data was expected to be further used for certain purposes. This criterion may also include information processed by a natural person in the course of purely personal or household activity (such as cloud computing services for personal document management, email services, diaries, e-readers equipped with note-taking features, and various life-logging applications that may contain very personal information), whose disclosure or processing for any other purpose than household activities can be perceived as very intrusive;
  5. Data processed on a large scale: the GDPR does not define what constitutes large-scale, though Recital 91 provides some guidance. In any event, the WP29 recommends that the following factors, in particular, be considered when determining whether the processing is carried out on a large scale:

(a) the number of data subjects concerned, either as a specific number or as a proportion of the relevant population;

(b) the volume of data and/or the range of different data items being processed;

(c) the duration, or permanence, of the data processing activity;

(d) the geographical extent of the processing activity.

  6. Datasets that have been matched or combined, for example originating from two or more data processing operations performed for different purposes and/or by different data controllers, in a way that would exceed the reasonable expectations of the data subject;

  7. Data concerning vulnerable data subjects (Recital 75): the processing of this type of data can require a DPIA because of the increased power imbalance between the data subject and the data controller, meaning the individual may be unable to consent to, or oppose, the processing of his or her data. For example, employees would often face serious difficulties in opposing processing performed by their employer when it is linked to human resources management. Similarly, children can be considered as not able to knowingly and thoughtfully oppose or consent to the processing of their data. This also concerns more vulnerable segments of the population requiring special protection, such as the mentally ill, asylum seekers, the elderly or patients, and, in any case, situations where an imbalance between the position of the data subject and the controller can be identified;

  8. Innovative use or application of new technological or organisational solutions, like combining the use of fingerprint and face recognition for improved physical access control. The GDPR makes it clear (Article 35(1) and Recitals 89 and 91) that the use of a new technology can trigger the need to carry out a DPIA. This is because the use of such technology can involve novel forms of data collection and usage, possibly with a high risk to individuals’ rights and freedoms. Indeed, the personal and social consequences of the deployment of a new technology may be unknown. A DPIA will help the data controller to understand and to treat such risks. For example, certain “Internet of Things” applications could have a significant impact on individuals’ daily lives and privacy, and therefore require a DPIA;

  9. Data transfer across borders outside the European Union (Recital 116), taking into consideration, amongst others, the envisaged country or countries of destination, the possibility of further transfers, or the likelihood of transfers based on derogations for specific situations set forth by the GDPR;

  10. When the processing in itself “prevents data subjects from exercising a right or using a service or a contract” (Article 22 and Recital 91). This includes processing performed in a public area that people passing by cannot avoid, or processing that aims at allowing, modifying or refusing data subjects’ access to a service or entry into a contract. An example of this is where a bank screens its customers against a credit reference database in order to decide whether to offer them a loan. The WP29 considers that the more criteria are met by the processing, the more likely it is to present a high risk to the rights and freedoms of data subjects, and therefore to require a DPIA.

As a rule of thumb, a processing operation meeting fewer than two criteria may not require a DPIA, due to the lower level of risk, while processing operations meeting at least two of these criteria will require one. However, in some cases, processing meeting only one criterion will require a DPIA. Conversely, if the controller believes that, despite the processing meeting at least two criteria, it is not “likely high risk”, it must thoroughly document the reasons for not carrying out a DPIA. In addition, a data controller subject to the obligation to carry out a DPIA “shall maintain a record of processing activities under its responsibility” (Art. 30(1)), including, inter alia, the purposes of processing, a description of the categories of data and recipients of the data and, “where possible, a general description of the technical and organisational security measures referred to in Article 32(1)”, and must assess whether a high risk is likely, even if it ultimately decides not to carry out a DPIA. (A simple screening sketch follows.)
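A simple sketch in Python of the WP29 rule of thumb as a screening checklist; the criterion names paraphrase the list above, and the two-criteria threshold is the WP29 guidance, not a legal bright line:

    WP29_CRITERIA = {
        "evaluation_or_scoring",
        "automated_decisions_with_legal_effect",
        "systematic_monitoring",
        "sensitive_data",
        "large_scale_processing",
        "matched_or_combined_datasets",
        "vulnerable_data_subjects",
        "innovative_technology",
        "cross_border_transfer",
        "blocks_right_or_service",
    }

    def dpia_screening(criteria_met):
        # Count how many WP29 criteria the processing operation meets.
        hits = criteria_met & WP29_CRITERIA
        if len(hits) >= 2:
            return f"{len(hits)} criteria met: a DPIA is required."
        if len(hits) == 1:
            return "One criterion met: a DPIA may still be required; document the assessment."
        return "No criteria met: a DPIA is likely unnecessary, but record the screening."

    print(dpia_screening({"large_scale_processing", "sensitive_data"}))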

Note: supervisory authorities are required to establish, make public and communicate a list of the processing operations that require a DPIA to the European Data Protection Board (EDPB) (Article 35(4)). The criteria set out above can help supervisory authorities to constitute such a list, potentially with more specific content added in time if appropriate. For example, the processing of any type of biometric data or that of children could also be considered as relevant for the development of a list pursuant to Article 35(4).

The DPC has issued guidelines on processing operations that require a DPIA. Where a documented screening or preliminary risk assessment indicates the processing operation is likely to result in a high risk to the rights and freedoms of individuals pursuant to Art. 35(1), the DPC has determined a DPIA will also be mandatory for the following types of processing operations:

1) Use of personal data on a large scale for a purpose(s) other than that for which it was initially collected pursuant to GDPR Article 6(4);

2) Profiling vulnerable persons including children to target marketing or online services at such persons;

3) Use of profiling or algorithmic means or special category data as an element to determine access to services or that results in legal or similarly significant effects;

4) Systematically monitoring, tracking or observing individuals’ location or behaviour;

5) Profiling individuals on a large scale;

6) Processing biometric data to uniquely identify an individual or individuals or enable or allow the identification or authentication of an individual or individuals in combination with any of the other criteria set out in WP29 DPIA Guidelines;

7) Processing genetic data in combination with any of the other criteria set out in WP29 DPIA Guidelines;

8) Indirectly sourcing personal data where GDPR transparency requirements are not being met, including when relying on exemptions based on impossibility or disproportionate effort;

9) Combining, linking or cross-referencing separate datasets where such linking significantly contributes to, or is used for, profiling or behavioural analysis of individuals, particularly where the datasets are combined from different sources where processing was/is carried out for different purposes or by different controllers;

10) Large-scale processing of personal data where the Data Protection Act 2018 requires “suitable and specific measures” to be taken in order to safeguard the fundamental rights and freedoms of individuals.

This list does not remove the general requirement to carry out proper and effective risk assessment and risk management of proposed data processing operations, nor does it exempt the controller from the obligation to ensure compliance with any other obligation of the GDPR or other applicable legislation. Furthermore, it is good practice to carry out a DPIA for any major new project involving the use of personal data, even if there is no specific indication of likely high risk. (From the DPC Guidelines, available here.)

Ultimate responsibility rests with the controller, as it is the controller who must decide whether or not a ‘high risk’ exists.  Such a decision must take a host of factors into account. When two or more of these factors combine in the processing operation, the risk is sure to increase. For example, a processing operation could involve new technology, the processing of sensitive data and profiling/evaluation. The factors are not prescriptive but the office of the DPC has identified some that warrant special attention.

These factors include:

  • Uses of new or novel technologies;
  • Data processing on a large scale;
  • Profiling/Evaluation – Evaluating, scoring, predicting of individuals’ behaviours, activities, attributes including location, health, movement, interests, preferences;
  • Any systematic monitoring, observation or control of individuals including that taking place in a public area or where the individual may not be aware of the processing or the identity of the data controller;
  • Processing of sensitive data including that as defined in GDPR Article 9, but also other personally intimate data such as location and financial data or processing of electronic communications data;
  • Processing of combined data sets that goes beyond the expectations of an individual, such as when combined from two or more sources where processing was carried out for different purposes or by different data controllers;
  • Processing of personal data related to vulnerable individuals or audiences that may have particular or special considerations related to their inherent nature, context or environment. This will likely include minors, employees, mentally ill, asylum seekers, the aged, those suffering incapacitation;
  • Automated decision making with legal or significant effects (see below). This includes automatic decision making where there is no effective human involvement in the process; and
  • Insufficient protection against unauthorised reversal of pseudonymisation.

Under Art. 35(5) it is open to any supervisory authority, in our case the DPC, to set out a list of the kinds of processing operations for which no data protection impact assessment is required. A definitive list pursuant to Art. 35(5) has not been issued by the DPC. A general rule is that any processing that is not ‘likely to result in a high risk to the rights and freedoms of natural persons’ will be exempt from a DPIA. However, deciding what is ‘likely to result in a high risk…’ demands the carrying out of a ‘mini-DPIA’. Despite the absence of a comprehensive definitive list, the office of the DPC, in a publication on DPIAs, lays out some examples of processing operations not requiring a DPIA:

  • A previous DPIA was carried out and found no risk;
  • Processing has been authorised by the DPC;
  • Processing is being done pursuant to point (c) or (e) of Art. 6(1) GDPR. Point (c) refers to processing necessary for compliance with a legal obligation. Point (e) refers to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. In both cases there must be a clear legal basis under EU or Member State law AND a DPIA must have already been conducted under Art. 35(10).

On balance, it is advisable to have a Data Protection Impact Assessment carried out. In many cases, the minimum content of the assessment, as set out under Art. 35(7)(a) to (d) GDPR, will be sufficient to ensure compliance and bring peace of mind to an organisation conducting the processing operations.

Here at GDPRXpert.ie we are GDPR and data protection consultants with vast expertise in conducting DPIAs.

GDPRXpert.ie are located in Carlow/Kilkenny and Mayo, offering a nationwide service.

Call 0858754526 or 0599134259 to discuss your particular need.

Patrick Rowland, GDPRXpert.ie

 

ePrivacy, GDPR and cookies.

Most people have had concerns other than ePrivacy, GDPR and cookies over these past few months. Over this same period and longer, this data protection blogger’s waistline has increasingly hinted at an affection for cookies of the baked variety. To be more accurate, it is evidence of such affection: an affection that, whilst not a direct consequence of the pandemic, has nevertheless been encouraged, facilitated and maximised by it. We all need some kind of excuse!

During the last few months, thoughts around the architecture of other cookie types have also taken on a more reflective mode. These are the cookies we encounter when we browse the internet. Some announce themselves immediately; others seek to delay announcing their presence, while others still adopt a strategy of hide and seek. In the context of data protection generally, and the GDPR more specifically, this raises many concerns. As data protection consultants, we receive many queries relating to uncertainties and doubts about cookies that have made their way onto user devices.
This blog will outline the data protection ramifications of the proliferation of the various cookie types. It will also outline the associated compliance requirements for their use under the GDPR and the old ePrivacy Directive, which was given effect in Ireland by the European Communities Privacy and Electronic Communications Regulations, S.I. 336 of 2011. There is still no definitive answer as to when the long-awaited EU ePrivacy Regulation will finally be agreed upon and become law.

What are Cookies?
Many will be all too familiar with the question above, and with the ubiquitous presence of cookies on the internet, but how many truly understand them or bother to learn more? How many times have we all, as internet surfers, just ignored cookie notices and clicked ‘accept’ boxes in order to quickly access the information we are seeking? Definitions of cookies abound and vary. The aim of this blog post is to offer clear information in a manner that avoids strictly technical descriptions, especially those that often confuse more than they enlighten.

Cookies are small pieces of information, stored in simple text files, placed on your computer by a website. These cookies can then be read by the website on your subsequent visits. Some of the information stored in a cookie may relate to your browsing habits on the web page, or a unique identification number so that the website can ‘remember’ you on your return visit. In general, cookies do not contain personal information from which you can be identified, unless you have specifically furnished such information to the website.
Historically, cookies were conceived to make up for the web protocols’ inability to record preferences (e.g. languages) or actions already performed on a website (such as the articles already in the shopping basket of an e-commerce website). Later on, their use was extended to enabling user authentication during a session, recording browsing behaviour for web service improvement purposes, and tracking and profiling users, e.g. to serve targeted advertising.

For example, Google Analytics is Google’s analytics tool that helps website and app owners to understand how their visitors engage with their properties. It may use a set of cookies to collect information and report site usage statistics without personally identifying individual visitors to Google.
Most commonly, cookies store user preferences and other information. We will see later how much of the information is aggregated and anonymised so as not to readily identify individual users. However, it is not quite as simple as that, and with some cookie types there are hidden dangers for a person’s privacy and data protection rights.
Cookie Classifications.
Duration
Some cookies are defined by the length of time they remain active. For example, cookies are most often referred to as ‘session’ or ‘persistent’. ‘Session’ cookies are stored temporarily during a browsing session and are deleted from the user’s device when the browser is closed. ‘Persistent’ cookies are saved on your computer for a fixed period and are not deleted when the browser is closed, allowing them to conveniently and quickly manage return visits to a website. (Consent is not required by law when cookies are used to enable communication on the web, or when they are strictly necessary for a service requested by the user.)
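The difference between the two shows up in the Set-Cookie headers a server sends. A small sketch using Python’s standard http.cookies module; the cookie names and lifetime are illustrative:

    from http.cookies import SimpleCookie

    jar = SimpleCookie()

    # Session cookie: no Expires/Max-Age, so it dies with the browser session.
    jar["session_id"] = "abc123"

    # Persistent cookie: Max-Age keeps it on the device for a fixed period.
    jar["lang"] = "en-IE"
    jar["lang"]["max-age"] = 60 * 60 * 24 * 365  # one year, in seconds
    jar["lang"]["secure"] = True                 # only sent over HTTPS

    print(jar.output())
    # Set-Cookie: lang=en-IE; Max-Age=31536000; Secure
    # Set-Cookie: session_id=abc123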
Source/Origin
Third Party cookies
These cookies may be set through a site by that site’s advertising partners. They may be used by those companies to build a profile of your interests and show you relevant adverts on other sites.
They do not directly store personal data, but act to uniquely identify your browser and internet device. If you do not allow these cookies, you will experience less targeted advertising. For example, many website owners use social media platforms, such as Facebook, Twitter or Instagram, as tools to enhance their own website, but the associated cookies are set by the social media companies.


Primarily, such cookies are set by third parties for their own specific purposes, and such purposes are predominantly of a commercial nature, e.g. advertising. Third-party advertising cookies may be used by those third parties to anonymously target advertising to you on other websites, based on your previous visit to an entirely unrelated website. These cookies cannot be used to identify an individual; they can only be used for statistical purposes, for example in providing you with advertisements that are more relevant to your interests.
One of the reasons Facebook’s ads are so successful is that they track and target the user across websites. If you’re an advertiser, third-party cookie data allows you to learn about your web visitors’ overall online behaviours, such as websites they frequently visit, purchases, and interests that they’ve shown on various websites. Users are being tracked across the entire web within a specific browser, and not just on the site on which cookies might have been installed. With this detailed data, you can build robust visitor profiles. Armed with this, you can then create a retargeting list that can be used to send ads to your past visitors or people with similar web profiles.

First party cookies

These are the opposite of third-party cookies. First-party cookies are directly stored by the website or domain you visit. They are set and controlled by the website itself, with the purpose of giving the website information about the usage of its site. The usual goal is to allow the website owner to provide as good a user experience as possible. With that in mind, the cookies can collect analytics data, remember language settings, and perform a myriad of useful functions. The architecture and design of modern websites is focused on optimising the ability to gain as much (relevant) information as possible about the operation of specific websites. Functional, performance and targeting cookies may be first-party cookies, but not in all cases.
A first-party cookie is generated and stored on a website visitor’s computer by default when they visit the website. This cookie is often used for user experience, as it is responsible for remembering passwords, basic data about the visitor, and other preferences. With a first-party cookie, you can learn about what a user did while visiting your website, see how often they visit it, and gain other basic analytics that can help you develop or automate an effective marketing strategy around them. However, you can’t see data related to your visitor’s behaviour on other websites that aren’t affiliated with your domain.

Purpose

Strictly necessary cookies — These cookies are essential for you to browse the website and use its features, such as accessing secure areas of the site. Cookies that allow web shops to hold your items in your cart while you are shopping online are an example of strictly necessary cookies. These cookies will generally be first-party session cookies. No opt-out is offered for these cookies; as they are necessary for the proper functioning of the website, the user does not have to consent to them. It is a ‘take them or leave them’ type of choice. While it is not required to obtain consent for these cookies, what they do and why they are necessary should be explained to the user.

Preferences cookies — Also known as ‘functionality cookies’, these cookies allow a website to remember choices you have made in the past, like what language you prefer, what region you would like weather reports for, or what your username or password is so you can automatically log in. These cookies, therefore, enable a website to provide enhanced functionality and personalisation. They may be set by a website or by third party providers whose services a website adds to its pages. No information is gathered or stored unless you interact with these features.

Statistics cookies — Also known as ‘performance cookies’, these cookies collect information about how you use a website, like which pages you visited and which links you clicked on. None of this information can be used to identify you. It is all aggregated and, therefore, anonymised. Their sole purpose is to improve website functions. This includes cookies from third-party analytics services, as long as the cookies are for the exclusive use of the owner of the website visited. Blocking these cookies means the site will not know when you have visited it, and will not be able to monitor its performance.

Marketing cookies — These cookies track your online activity to help advertisers deliver more relevant advertising or to limit how many times you see an ad. These cookies can share that information with other organisations or advertisers. These are persistent cookies and almost always of third-party provenance. When people complain about the privacy risks presented by cookies, they are generally speaking about third-party, persistent marketing cookies.
These are the main ways of classifying cookies, although there are cookies that will not fit neatly into these categories or may qualify for multiple categories. Such cookies can contain significant amounts of information about your online activity, preferences, and location. The chain of responsibility for a third-party cookie (i.e. who can access its data) can get complicated as well, only heightening the potential for abuse.

Related/Used in conjunction with cookies
Browser web storage: Browser web storage enables websites to store data in a browser on a device. When used in ‘local storage’ mode, it enables data to be stored across sessions. This makes data retrievable even after a browser has been closed and reopened. One technology that facilitates web storage is HTML5.
IP address
Every device connected to the Internet is assigned a number known as an Internet protocol (IP) address. These numbers are usually assigned in geographic blocks. An IP address can often be used to identify the location from which a device is connecting to the Internet.
Pixel tag
A pixel tag is a type of technology placed on a website or within the body of an email for the purpose of tracking certain activity, such as views of a website or when an email is opened. Pixel tags are often used in combination with cookies.
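A minimal TypeScript sketch of how a pixel tag works: a tiny (often 1x1) image whose URL carries tracking parameters back to a server when it loads. The tracker URL and parameters here are illustrative, not a real service.

function firePixel(event: string): void {
  const img = new Image(1, 1);
  // The mere request for this image tells the server the page (or email) was opened.
  img.src = "https://tracker.example.com/pixel.gif?event=" + encodeURIComponent(event) + "&t=" + Date.now();
}

firePixel("newsletter_open");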
Referrer URL
A Referrer URL (Uniform Resource Locator) is information transmitted to a destination webpage by a web browser, typically when you click a link to that page. The Referrer URL contains the URL of the last webpage the browser visited.
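A minimal sketch: the destination page can read the Referrer URL the browser transmitted, which is exposed in the DOM as document.referrer.

const referrer: string = document.referrer;
if (referrer) {
  console.log("Visitor arrived from: " + referrer);
} else {
  console.log("No referrer transmitted (direct visit, or a referrer policy stripped it).");
}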
Server logs
Most websites automatically record the page requests made when you visit those particular sites. These ‘server logs’ typically include your web request, Internet Protocol address, browser type, browser language, the date and time of your request, and one or more cookies that may uniquely identify your browser.
Google gives an example of how a simple search for "cars" might appear in a server log file (a short sketch parsing the entry follows the breakdown below). A typical log entry for such a search looks like this:
123.45.67.89 – 25/Mar/2003 10:15:32 –
http://www.google.com/search?q=cars –
Firefox 1.0.7; Windows NT 5.1 –
740674ce2123e969
• 123.45.67.89 is the Internet Protocol address assigned to the user by the user’s ISP. Depending on the user’s service, a different address may be assigned to the user by their service provider each time they connect to the Internet.
• 25/Mar/2003 10:15:32 is the date and time of the query.
• http://www.google.com/search?q=cars is the requested URL, including the search query.
• Firefox 1.0.7; Windows NT 5.1 is the browser and operating system being used.
• 740674ce2123e969 is the unique cookie ID assigned to this particular computer the first time it visited Google. (Cookies can be deleted by users. If the user has deleted the cookie from the computer since the last time they visited Google, then it will be the unique cookie ID assigned to their device the next time they visit Google from that particular device.)
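As flagged above, here is a minimal TypeScript sketch that splits the sample entry into its parts. The field layout follows Google's illustrative example, not any real log standard.

const entry =
  "123.45.67.89 - 25/Mar/2003 10:15:32 - http://www.google.com/search?q=cars - Firefox 1.0.7; Windows NT 5.1 - 740674ce2123e969";

// Destructure the five fields separated by " - ".
const [ip, timestamp, url, userAgent, cookieId] = entry.split(" - ");
console.log({ ip, timestamp, url, userAgent, cookieId });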

Unique identifiers

A unique identifier is a string of characters that can be used to uniquely identify a browser, app, or device. Different identifiers vary in how permanent they are, whether they can be reset by users, and how they can be accessed.

Unique identifiers can be used for various purposes, including security and fraud detection, syncing services such as an email inbox, remembering your preferences, and providing personalized advertising. For example, unique identifiers stored in cookies help sites display content in your browser in your preferred language. You can configure your browser to refuse all cookies or to indicate when a cookie is being sent.

Unique identifiers may also be incorporated into a device by its manufacturer (sometimes called a universally unique ID or UUID), such as the IMEI number of a mobile phone. For example, a device's unique identifier can be used to customise a website's service to your device or analyse device issues related to that service.
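A minimal TypeScript sketch of the difference in permanence: a resettable identifier is just a random value kept in web storage, which the user can clear, unlike a manufacturer-assigned ID such as an IMEI. This assumes a modern browser that supports crypto.randomUUID().

function getOrCreateBrowserId(): string {
  let id = localStorage.getItem("browser_id");
  if (!id) {
    id = crypto.randomUUID(); // random, not derived from the device hardware
    localStorage.setItem("browser_id", id);
  }
  return id; // persists until the user clears site data, after which a new one is created
}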

Some Cookie Problems.
There is a close relationship between ePrivacy and cookies. Cookies do not exist in a vacuum, but rather interact in both negative and positive manners with internet users on a daily basis. This interaction is regulated by both the ePCR and the GDPR. The ePCR supplements (and in some cases, overrides) the GDPR, addressing crucial aspects about the confidentiality of electronic communications and the tracking of internet users more broadly.
One aspect that overrides all others in the day-to-day interactions is the concept of consent. It is the absence of any cookie consent mechanism, or the dilution of the consent mechanism, that primarily causes problems. While cookies are specifically regulated under the auspices of the ePCR, consent in relation to cookies is governed by the very strict standards of the GDPR. Under the ePCR, consent is required either to store information, or to gain access to information stored, on an individual's device. This is the essence of the cookie rules. The two exceptions to this general rule are where:
1. The sole purpose is for carrying out the transmission of a communication, or
2. It is strictly necessary in order to provide an online service explicitly requested by that individual.
These rules apply regardless of whether personal data is processed. EU law aims to protect the user from any interference with his or her private life, in particular, from the risk that hidden identifiers and other similar devices enter those users’ terminal equipment without their knowledge.
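A minimal TypeScript sketch of the consent gate these rules imply: strictly necessary cookies may be set without consent, while everything else requires a prior opt-in. The category names and functions are illustrative.

type CookieCategory = "strictly-necessary" | "preferences" | "statistics" | "marketing";

function maySetCookie(category: CookieCategory, consentedTo: Set<CookieCategory>): boolean {
  // Exemption: cookies needed to deliver the service the user explicitly requested.
  if (category === "strictly-necessary") return true;
  // Everything else requires a prior, recorded opt-in.
  return consentedTo.has(category);
}

const consent = new Set<CookieCategory>(["preferences"]); // user accepted preference cookies only
maySetCookie("marketing", consent); // false — no marketing cookie may be set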

In a report in 2019, the DPC noted that many controllers "categorised the cookies deployed on their websites as having a 'necessary' or 'strictly necessary' function, where the stated function of the cookie appeared to meet neither of the two consent exemption criteria set down in the ePrivacy Regulations/ePrivacy Directive. These included cookies used to establish chatbot sessions that were set prior to any request by the user to initiate a chatbot function. In some cases, it was noted that the chatbot function on the websites concerned did not work at all".
The GDPR speaks more to the nature of the concept of consent. It is specific as to the quality of consent, and it is that quality which sets the GDPR standard apart.
The Notion of Consent under the GDPR.
One of the most common problems alluded to above is the absence of, or the strategic manipulation of, the nature and essence of consent. The GDPR's drafters always foresaw that the realisation of many of its objectives would be inextricably linked to a full understanding of the attendant concept of consent. Trample upon or weaken the nature of true consent and you simultaneously trample upon data subject rights.
Article 4 (11) GDPR
The article defines consent of the data subject as 'any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her'. We know, in the context of the ePCR, that the data being processed do not necessarily have to be personal data. Nevertheless, the standard of consent is the GDPR standard. Art. 7 GDPR expands on the concept and gives the general conditions to be met to validate consent.
At the outset, where a controller is using consent as a basis for processing, that controller has to be able to demonstrate that the data subject has indeed consented (Art. 7(1) GDPR).

Where consent is given in the context of a written declaration and is juxtaposed or mixed in with other items, any request for consent has to be clearly distinguishable from those other items (Art. 7 (2) GDPR).
Consent may be withdrawn at any time. Before giving consent, the data subject must be informed that it shall be as easy to withdraw consent as to give it. Any withdrawal of consent subsequent to the processing of personal data shall not affect the lawfulness of processing carried out prior to the withdrawal (Art. 7(3) GDPR).
In regard to the concept of 'freely given' consent, and especially in relation to consent given as a contractual term for the provision of a service, account will be taken of whether that provision is conditional on consent to processing of personal data that is not necessary for the performance of that contract (Art. 7(4) GDPR).

Recital 32 is most instructive on the consent concept and relates more directly to consent in the cookie sphere. It emphasises consent 'by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her…' Common examples are written statements, including by electronic means, or oral statements. (Note: a controller needs to be careful with oral statements, as he/she must be able to demonstrate that the data subject has consented.)
Other examples could include ticking a box when visiting an internet website, choosing technical settings for information society services, or another statement or conduct which clearly indicates in this context the data subject's acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not, therefore, constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes. If multiple purposes are involved, then consent has to be given for all of them.

Of most relevance in the cookie consent context is that any consent given following a request by electronic means must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided (consider 'cookie walls'). The standard of consent that controllers must obtain from users or subscribers for the use of cookies must now be read in light of the GDPR standard of consent: i.e. it must be obtained by means of a clear, affirmative act and be freely given, specific, informed and unambiguous. (See the DPC cookie sweep report here.)

Planet49 case
In this case, a lottery website had required users to consent to the storage of cookies in exchange for access to play a promotional game. The Court (CJEU) decided that the consent which a website user must give to the storage of and access to cookies on his or her equipment is not validly constituted by way of a pre-checked checkbox which that user must deselect to refuse his or her consent.
That decision is unaffected by whether or not the information stored or accessed on the user's equipment is personal data. As noted earlier, EU law aims to protect the user from any interference with his or her private life, and in particular from the risk that hidden identifiers and other similar devices enter users' terminal equipment without their knowledge.


The Court notes that consent must be specific. Merely selecting the button to participate in a promotional lottery is not sufficient for it to be concluded that the user validly gave his or her consent to the storage of cookies.
Furthermore, according to the Court, the information that the service provider must give to a user includes the duration of the operation of cookies and whether or not third parties may have access to those cookies.
So, to sum up, pre-checked consent boxes (or cookie banners that tell you a cookie has already been dropped and pointlessly invite you to click ‘ok’) aren’t valid under EU law.
Furthermore, cookie consent can’t be bundled with another purpose (in the Planet49 case the promotional lottery) — at least if that fuzzy signal is being used to stand for consent.
There’s also an interesting new requirement which looks set to shrink the ability of service operators to obfuscate how persistently they’re tracking Internet users.
For consent to cookies to be legally valid, the Court now says the user must be provided with specific information on the tracking, namely how long the cookie will operate and with whom their data will be shared.
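A minimal TypeScript sketch of what Planet49 requires of a consent control: unchecked by default, tied to a specific purpose, and accompanied by duration and third-party information. The field names are illustrative.

interface CookieConsentRequest {
  purpose: string;        // a specific purpose, not bundled with the lottery entry
  duration: string;       // how long the cookie will operate, e.g. "12 months"
  thirdParties: string[]; // who else may access the cookie
}

function renderConsentCheckbox(req: CookieConsentRequest): HTMLInputElement {
  const box = document.createElement("input");
  box.type = "checkbox";
  box.checked = false; // a pre-checked box is not valid consent (Planet49)
  box.title = req.purpose + "; duration: " + req.duration + "; shared with: " + req.thirdParties.join(", ");
  return box;
}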

The proposed new ePrivacy Regulation.

Earlier we mentioned that the debate over the new Regulation is still continuing. Remember, a regulation is a more powerful legislative instrument for EU lawmakers, as it is binding across all EU Member States. It comes into legal force on a pre-set date, without any need to be transposed into national laws; a regulation is self-executing. On the other hand, Member States have more discretion and flexibility with a directive, because it is up to them exactly how they implement the substance of any directive. For example, they could adapt an existing law or create a new one.

With a regulation, all deliberation happens among the EU institutions and, once that discussion and negotiation process has concluded, the agreed text becomes law across the bloc. As a rule, it is more difficult to get agreement and consensus on a regulation. In the case of the GDPR, some articles specifically made provision for Member States to be able to vary the substantive application of parts of the regulation.

In Ireland, for example, the age of a child under the DPA 2018 for the purposes of the application of the GDPR is 18 years (DPA 2018, s.29). However, GDPR Art. 8 allows Member States to set a lower age in relation to the offering of information society services to a child. Accordingly, s.31 DPA 2018 sets this at sixteen. Nevertheless, the GDPR has the final say in stipulating that no Member State can lower the age to less than 13 years in the same context (Art. 8(1) GDPR). Because a regulation is usually so precisely and uniformly applied once enacted, the debate beforehand may be long, technical and arduous. Member States and lobby groups attempt to bring influence to bear on the final draft, because it will be too late once the final draft is agreed and becomes law. It is hardly surprising, then, that the debate on the ePrivacy Regulation has taken so long.

There are many contested issues, and what view one has is predicated upon individual interests. Media, and the publishing industry associations in general, remain entrenched in opposition to the new ePrivacy Regulation. Their fears centre on the potential for the regulation to wreak financial havoc on their ad-supported business models. Such models rely heavily on cookies and tracking technologies, strategically utilising them to try to monetise free content via targeted ads. Empowering ordinary 'users' to opt in to being tracked represents a step too far!

Key content of the ePrivacy Regulation

The ePrivacy Regulation regulates the use of electronic communications services within the European Union and is intended to replace the Directive on Privacy and Electronic Communications (Directive 2002/58/EC). The ePrivacy Regulation is primarily aimed at companies operating in the digital economy and specifies additional requirements they need to meet in relation to the processing of personal data.

Originally, the ePrivacy Regulation was intended to apply from 25 May 2018 together with the General Data Protection Regulation (GDPR). Unlike with the GDPR, however, the EU Member States have not yet been able to agree on the draft legislation. The negotiations of the ePrivacy Regulation are still ongoing now in 2021.

The Portuguese presidency has recently been pushing ePrivacy forward. In view of the fact that there are some points of contention regarding the current text of the Regulation, however, negotiations may not progress quickly. The ePrivacy Regulation is certainly not expected to enter into force before 2023, and a potential transitional period of 24 months means that any new rules would then not come into effect before 2025.

Techcrunch has outlined some of the sources of disagreement:

“There are many contested issues, depending on the interests of the group you’re talking to. Media and publishing industry associations are terrified about what they say ePrivacy could do to their ad-supported business models, given their reliance on cookies and tracking technologies to try to monetize free content via targeted ads — and so claim it could destroy journalism as we know it if consumers need to opt-in to being tracked.

The ad industry is also of course screaming about ePrivacy as if its hair’s on fire. Big tech included, though it has generally preferred to lobby via proxies on this issue.

Anything that could impede adtech’s ability to track and thus behaviourally target ads at web users is clearly enemy number one, given the current modus operandi. So ePrivacy is a major lobbying target for the likes of the IAB who don’t want it to upend their existing business models.

Even telcos aren’t happy, despite the potential of the regulation to even the playing field somewhat with tech giants — suggesting they will end up with double the regulatory burden, as well as moaning it will make it harder for them to make the necessary investments to roll out 5G networks.

Plus, as I say, there also seems to be some efforts to try to use ePrivacy as a vector to attack and weaken GDPR itself.”

Google announces suspension of third party cookies.

The way we use cookies could change dramatically with Google's announcement that it will phase out third-party cookies on its Chrome browser by 2022.

A Google blog post announcing the phase-out explains, “Users are demanding greater privacy–including transparency, choice, and control over how their data is used–and it’s clear the web ecosystem needs to evolve to meet these increasing demands.” What is also clear is that users are becoming increasingly frustrated with cookie banners designed to pressure them into accepting ad-tracking cookies. Firefox and Safari have already phased out the third-party cookie, but Google has decided to wait until 2022. It has done this on the basis of wishing to work with advertisers ‘to ensure this pivot does not destroy the online advertising business’.

Google also takes the view that ”by undermining the business model of many ad-supported websites, blunt approaches to cookies encourage the use of opaque techniques such as fingerprinting (an invasive workaround to replace cookies), which can actually reduce user privacy and control. We believe that we as a community can, and must, do better.” This is also something the new ePrivacy Regulation should address.

Patrick Rowland, GDPRXpert.ie.

We are GDPR and Data Protection consultants based in Carlow/Kilkenny and Mayo.

Visit www.gdprxpert.ie for more details and any queries.

The GDPR and the DPC 2018-2020

A recently published report shines light on the GDPR and the work of the DPC from 2018 to 2020. In a previous blog we looked at the GDPR and the work of the DPC a year into the operation of the regulation. Following that, we discussed in another blog how, although the early awareness of the new regulation had waned, GDPR had not gone away. Now it has been two years since the regulation came into effect, and it is time to re-examine some aspects of this novel regulation. Unfortunately, the Covid-19 pandemic continues. We will return to data protection issues within this context in an upcoming blog post, and a focus will be some recent developments and updates from the DPC in relation to transfers of data to the US.

The DPC has released a report which sheds light on trends and patterns that have emerged since the introduction of the GDPR. “Given its role as Lead Supervisory Authority to the various multinational organisations that are headquartered here, much attention is naturally given to Ireland’s regulatory activities in the realm of ‘big tech’” (DPC 2018-2020 report, p.5). So, what has been the focus of the work of the DPC under the new regulatory regime? Have there been issues that have predominated?

“Though the same themes frequently re-occur – access issues, for example, being a consistent area of contention – there are nuances within each case that impact greatly on timescales and the resolution process”. The same is true of breach notifications, which the DPC also receives in consistently high numbers month-on-month. In the two years since the GDPR came into effect, the DPC has received almost 12,500 breach notifications, of which 93% were found to be in scope of the GDPR. The DPC has processed and closed out almost 95% of these breach notifications. Despite the high volumes, the cases that have been assessed give no indication that organisations are over-reporting. Rather, they suggest that many of the breaches that the DPC examines could have been prevented by more stringent technical and organisational measures at source…

It is important to bear in mind that “The DPC’s remit is not limited to regulation of the GDPR. It encompasses all data protection legislation currently in force in Ireland, which includes a significant but declining volume of legacy work falling under the 1988 and 2003 Data Protection Acts”. The rate of old “Act cases” coming before the DPC is diminishing, relative to the rates seen in May 2018, and the expectation is that this natural decline will continue with the passage of time.

This DPC report is intended to assess the range of regulatory tasks of the Data Protection Commission for the period 25 May 2018 to 25 May 2020. It is distinguishable from the Commission’s Annual Reports in that it does not focus on the administration of the office. The report takes stock of the DPC’s experience of its mandated functions under the GDPR, its legal activities and the allocation of its resources in support of Article 57(1)(b) and (d). To note, while the report refers in shorthand to “the GDPR”, it is in fact intended to cover the substantive roles of the DPC under the three main pieces of data protection legislation – the GDPR, the e-Privacy Directive and the Law Enforcement Directive as transposed in the Data Protection Act 2018. Since 25 May 2018, the most frequent GDPR topics for queries and complaints have consistently been: access requests; fair processing; disclosure; right to be forgotten (delisting and/or removal requests); direct marketing; and data security.

  • Total breach notifications received between 25 May 2018 and 25 May 2020: 12,437.
  • 93% classified as relating to GDPR (11,567 notifications).
  • Of the 12,437 total recorded breach cases, 94.88% concluded (11,800 cases). The most frequent cause of breaches reported to the DPC is unauthorised disclosure (80%).

The purpose of this two-year assessment is to provide a wider-angled lens through which to assess the work of the DPC since the implementation of the General Data Protection Regulation; in particular, to examine wider datasets and annual trends to see what patterns can be identified.

While the DPC – as is the case for many other stakeholders – could already make some observations about aspects of the GDPR and the one-stop-shop procedures that work less well, the purpose of the document is not to offer a critique at this juncture but rather to showcase what has – and is – being delivered.

Regulating.

Since May 2018, the DPC has opened 24 cross-border inquiries and 53 national inquiries. In May 2020 the DPC issued its first fines under the GDPR, levying two separate fines against an Irish state agency. Also in May 2020, the DPC issued a reprimand to the agency and ordered it to bring its processing into compliance. In the same month, the DPC sent its first major-scale Article 60 Draft Decision to the EDPB. The DPC has concluded nine litigation cases since GDPR came into effect. Through Supervision action, the DPC has brought about the postponement or revision of six planned big tech projects with implications for the rights and freedoms of individuals.

Enforcing

  • An Garda Síochána – reprimand and corrective powers applied in accordance with the Data Protection Act, 2018.
  • Tusla, the Child and Family Agency – reprimands and fines applied in two separate inquiries in accordance with the Data Protection Act, 2018.
  • Twitter – inquiry completed and draft decision forwarded to concerned EU data protection authorities in accordance with Article 60 of the GDPR.
  • DEASP – Enforcement notice issued regarding the use of the Public Services Card (currently under appeal).
  • 59 Section 10 decisions issued.
  • 15,000 breach notifications assessed and concluded.
  • 9 litigation cases concluded in the Irish Courts.
  • Hearing in CJEU Standard Contractual Clauses case brought by DPC to Irish High Court.
  • 80% of cases received under the GDPR have been concluded.

Mainstreaming Data Protection

Staff of the DPC have presented at over 330 stakeholder events since 25 May 2018. Since the Coronavirus restrictions have been in effect, the DPC has continued to support stakeholder events through online participation. The DPC has committed to driving awareness of data protection rights and responsibilities, publishing over 40 guidance notes covering technological advice, GDPR compliance and direct marketing/electoral constraints.

Other Activity

Since May 2018:

  • The DPC has opened 282 new direct marketing complaints and concluded 247.
  • There have been 11 successful prosecutions against companies for a combination of 42 offences under S.I. No. 336/2011.
  • The office handled 66 Law Enforcement Directive complaints.
  • The DPC has successfully completed the EDPB consistency opinion process for both Code of Conduct monitoring bodies and for the additional requirements for INAB.
  • A Data Protection Officer Network has been established.
  • The Irish DPC has partnered with the Croatian Data Protection Authority and Vrije University on an EU-funded project specifically targeting SMEs.

Most Frequently Queried GDPR Topics

Since 25 May 2018, the most frequently raised GDPR topics for queries and complaints have consistently been:

  • Queries relating specifically to Access Requests;
  • General queries (unclassified);
  • Fair processing (including fair obtaining and further processing);
  • Disclosure (data shared with a third party);
  • Right to be Forgotten (delisting and/or removal requests);
  • Direct marketing; and
  • Data Security.

The single most cited data protection issue was access requests, with a total of 3,398 cases, or 22.62% of all cases. Not far behind was the field of ‘general queries’, which accounted for 3,245 cases, or 22% of the total. Issues of fair processing and disclosure followed, with 15% and 12% of the total respectively.

Breaches

Any organisation or body which makes use of personal data as part of its business – regardless of whether the data pertains to customers or staff – is deemed to be a data controller and is ultimately accountable for the safeguarding of the personal information in its possession. Article 33 of the GDPR introduced several obligatory actions for data controllers, including mandatory notification of breaches to the appropriate data protection authority within 72 hours. In the two years since the introduction of this provision, the DPC has seen an exponential increase in the breaches being notified to it.
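As a minimal worked illustration of the Article 33 clock (the dates here are hypothetical), the notification deadline can be computed from the moment the controller becomes aware of the breach:

// Notification to the supervisory authority is due within 72 hours of awareness.
const becameAware = new Date("2020-03-02T09:30:00Z");
const deadline = new Date(becameAware.getTime() + 72 * 60 * 60 * 1000);
deadline.toISOString(); // "2020-03-05T09:30:00.000Z"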

  • Total breach notifications received between 25 May 2018 and 25 May 2020: 12,437
  • 93% have been classified as relating to GDPR (11,567 notifications).
  • Of the 12,437 total recorded breach cases, 94.88% have been concluded (11,800 cases) and 5.12% are currently active (637 cases).

With the exception of a seasonal decline in December 2018, the number of breaches being reported to the DPC remained broadly consistent over the first 18 months of GDPR implementation. Q2 of 2020 shows an overall trend towards reduced breach notifications. It is not possible to attribute this decline to a particular cause, though it is likely that the number of breach notifications has been impacted by the Coronavirus crisis.

We saw earlier that by far the most frequent cause of breaches reported to the DPC is unauthorised disclosure (80%), whether by digital, verbal or other manual means. Manual processing – and consequently an inferred lack of robust processing procedures – is at the root of far more reported breaches than phishing, hacking or lost devices (5.6% collectively). As with the trends observed earlier in the queries and complaints that the DPC receives, the patterns within the recorded breach notifications indicate that a significant volume of the work that falls to the DPC could be mitigated by more robust technical and organisational measures being introduced by data controllers, with the processes for testing, assessing and evaluating these measures being overseen by the data protection officer going forward.

At present, the DPC workload in the breach area is heavily influenced by the need to engage with organisations to address elementary processing liabilities, which are occurring at a very basic level. As we move forward in time, the DPC expects to see changed behaviours amongst its regulated entities, resulting in a reduction in the volume of breach notifications that can be attributed to a lack of due care and attention. Some examples are instructive.

Insufficient organisational and technical measures in place to secure data.

An organisation responsible for providing care to both children and adults with a range of support requirements notified the DPC of a breach in which it outlined that a wheelie bin containing the personal data of residents and staff of the facility had been removed from its premises and discarded on a neighbouring property. The individual who discovered the contents of the wheelie bin fly-tipped on their property contacted the organisation after first inspecting the records to establish their origin. Following contact from the individual, the organisation arranged to retrieve the records and disposed of them in an appropriate manner. Based on the information provided by the organisation, the DPC raised a number of queries focusing on whether the organisation had policies and procedures for confidential disposal, and whether they were in place at the time of the incident.

The organisation advised that it did not have a specific confidential disposal policy in place; however, it did advise that the premises had shredding facilities in place to assist with the confidential disposal of records. On this occasion, these facilities were not utilised. The DPC highlighted that – as a data controller – it was the organisation’s responsibility to ensure that both appropriate organisational and technical measures are employed to ensure that the processing of personal data is done in a secure manner. The DPC also highlighted that the processing of personal data also encompasses both its erasure and/or destruction. The DPC recommended that the data controller undertake the following actions:

  • Complete a GDPR self-assessment to identify areas where immediate remedial actions are required in order to ensure compliance with GDPR.
  • Review their obligations as a data controller, in particular their obligation centering on the security of data.
  • Undertake an exercise to produce adequate policies and procedures in relation to the appropriate disposal of personal/sensitive records, both in hard and soft copy.

Based on the recommendations of the DPC, the data controller has initiated a data protection compliance project to address the areas highlighted. The data controller committed to providing the DPC with updates in relation to the progress of this project and to making available the necessary evidence of actions undertaken based on the recommendations provided. This is being monitored on an ongoing basis.

Data Processor Accounts Compromised

In October 2019, the DPC was notified by an Irish public sector body of a personal data breach which had occurred as a result of a compromised email account being used by a data processor. This exposed the public sector body to the risk that personal data – including data subjects’ names, addresses, dates of birth, details of family relationships and biometric data – could be accessed by a malicious third party while being sent to, or held in, the compromised account. The data processor was located outside the European Union and was using a locally hosted email provider.

The DPC engaged with the public body in order to determine what measures it had in place at the time of the breach to ensure that the processor took all precautions required, pursuant to Article 32 of the GDPR (security of processing). The DPC also sought to determine whether the arrangement between the public sector body and the processor was such as to require the processor to assist it in ensuring compliance with data security and personal data breach notification obligations, and to make available to the controller all information necessary to demonstrate compliance with data security obligations, as required by Article 28 of the GDPR. Following extensive engagement between the DPC and the public sector body in question, the DPC issued specific recommendations to the entity, including recommendations for technical measures to be implemented by third-party processors engaged by the public sector body.

In response to these recommendations, the public sector body informed the DPC that it is providing secure email addresses to relevant processors to replace locally hosted email accounts and is revising its conditions for the engagement of data processors, including specific requirements on data security and training. They have also provided the DPC with regular updates on the implementation of the DPC’s recommendations, including providing copies of relevant documentation. The DPC continues to engage on a regular basis with the relevant public sector body in order to monitor its implementation of these recommendations.

Unsecured Data Storage

In November 2019 the DPC, largely through media reports, was made aware of a potential data breach at an Irish university. The potential breach could be traced to the manner in which large amounts of personal data, including payroll details, bank details and PPS numbers, were kept in a location that facilitated easy access by a very large number of people. The university was made aware of its obligations under Art. 33 GDPR and, following this, quickly notified the breach to the DPC.

The DPC engaged with the university to determine who had access to the data, the level of supervision of those who had access, the nature and sensitivity of the data and finally what the university had done to respond to the breach. To prevent a repetition and to ensure that data was processed appropriately in the future, the DPC made some specific recommendations. In particular the DPC advised the data controller:

  • To review the level of physical security applied in respect of the personal data security facilities;
  • To ensure adequate access controls are put in place and, in particular, that access to personal data is placed on a strict ‘need to know’ basis, with extra care given depending on the nature and sensitivity of the personal data;
  • To review its data retention policies taking care not to collect or retain unnecessary data and at the same time ensuring that the controller can record and track any archived data;
  • To provide regular and up to date training on data security.

The data controller took heed of all the recommendations and continues to engage with the DPC on a periodic basis. (DPC report, p.28)

The point evidenced by the foregoing examples is that the DPC has been busy since the introduction of the GDPR, but it seems only the higher-profile investigations attract attention. Behind the scenes, and away from public view, many investigations commence, and most are brought to an appropriate conclusion in a timely manner. Results vary, and not all investigations lead to sanctions. The case examples above display a willingness by the DPC to take a proportionate response.

Patrick Rowland, GDPRXpert.ie.

We are GDPR and Data Protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie

Covid-19 pandemic creates difficulties for many.

The Covid-19 pandemic has created difficulties for many, especially employees and employers. Many business owners have not been able to continue paying their employees, and this has resulted in many employees being laid off. For employees, apart from anxiety over their own health, and despite mortgage moratoriums et cetera, this has created financial difficulties. For employers, and especially SMEs, the pandemic has the potential to deal a death blow to a business that took years to build up.

As noted in a previous blog, when set against this backdrop, data protection concerns can seem trivial. Nevertheless, just as fundamental rights and freedoms cannot be trampled on in a health crisis, neither can data protection rights. Indeed, because more sensitive categories of personal data are now being processed (health data, particularly), more care should be taken to ensure that data protection rights under the GDPR are being respected and enforced.

There must be at least one legal basis to process data, and all the principles must be abided by. What is often forgotten is that even where derogations from the GDPR apply, the principles must still be respected and applied in any personal data processing operation. While the rules should be obeyed even under extreme circumstances, these same data protection rules (such as the GDPR) do not hinder measures taken in the fight against the coronavirus pandemic. It is conceivable that in times of emergency such as now some data protection rules may be relaxed, but it is unlikely they will ever be suspended or waived. Still, there have been many questions to GDPRXpert from clients unsure of aspects of the GDPR, especially in the specific context of this pandemic. At this time, we will take a look at some of the most common questions we have been asked.

Question 1.

I have many of my employees working from home at least temporarily. Are there any special precautions employers need to take in relation to personal data?

Answer.

Many people work from home, but clearly these numbers have increased since the pandemic. The first thing that those working from home must do from the outset is create the mindset that they are still working in the office. Remember, it is not feasible for employers to go and assess the suitability, or otherwise, of all ‘work from home locations’ (WFHL), so some basic and normal ground rules need to be emphasised.

Employees must secure their data just as they would in the office. To do this they must take the normal precautions and act as if present at their place of employment. It is paramount that they do not allow family members, or anyone else, to just walk in to where they have set themselves up. For example, they should never leave personal data on view on a computer screen. Data protection consultants GDPRXpert frequently remind clients that it is often the small oversight or lack of attention that leads to data being compromised. Employees should log off when leaving their workstation, or lock an area if too many people are coming and going. Working from a laptop on a couch is not a good idea if sharing an apartment or house with others! There should be strict controls on the ability to download personal data from an organisation’s system files.

If no relevant data protection policies are in place, now is the opportune time to enact some to govern how company assets and information can be accessed, where information can be stored, and how information can be transmitted. Employees must be quickly made aware of, and become competent about, the types of information considered to be confidential, trade secret, or otherwise protected. There is much anecdotal evidence of an upsurge in phishing attacks.

In the US there has been a huge rise in fraud schemes related to Covid-19, with many businesses receiving fake emails purportedly from the Centers for Disease Control and Prevention (CDC). These emails contain malicious attachments, so employees at WFHL need to be extra vigilant. In all cases these fraudsters are attempting to have their targets access and verify personal information or credentials. Employers must train their employees on how to detect and handle such scams and keep them informed about the latest threats. It is a good idea to have regular video conferencing with staff to facilitate Q&A sessions and update everyone on the latest threats. It also helps staff morale.

Only those whose essential job duties place them in the ‘need to know’ employee classification should have access to ‘special category data’, which includes health data. It is best practice to carefully review any Bring Your Own Device (BYOD) agreements, if any are in place between you and employees. In this scenario, and where special category data are being processed, it is vital that all information is encrypted in transit and while at rest on the device. For example, many in the healthcare field are now working remotely and collecting health data. In the absence of special arrangements, these remote employees should be utilising company-issued equipment and not saving company data to personal laptops, flash drives, or personal cloud storage services such as Google Drive.

It is true to say that the risks for the employer are numerous, so all care should be taken in relation to BYOD agreements. Employers should seek to ensure that BYOD practices do not compromise the security of, or their right of access to, company information and data, and that their policies comply with all attendant legal obligations.

In the conventional office setting it is easy to have a quick word in an employee’s ear if an employer becomes aware of any breach of, or indiscretion concerning, a BYOD agreement. It is more complicated when employees are working remotely. Best and safe practice is for employers to consider periodic reminders of the BYOD policy and to offer training sessions, as well as ongoing education regarding the importance of protecting the employer’s trade secrets, confidential and proprietary information and data.

“There is no questioning the advantages of BYOD agreements. It is a growing trend, one that may already be occurring at your company. Employers are implementing policies and practices that permit, or even require, their employees to use their personal electronic devices (e.g., laptops and smart phones) and data services (e.g., backup and file-sharing software) for work-related purposes. The appeal of such Bring-Your-Own-Device (BYOD) practices for both employers and employees is undeniable. Employers avoid the up-front costs and administrative hassle of purchasing laptops and smart phones as well as employees’ demands for the latest and greatest gadgets, and employees do not have to carry around multiple devices. Overall, this is a much simpler and more efficient way of doing business, right?” (Elaine Harwell, Senior Counsel, Procopio). There are, nevertheless, security considerations, and here are some aspects that demand careful attention.

Your BYOD policy should cover a broad range of topics, including the following (a rough sketch of how such a policy might be captured as a structured record follows the list):

  • Which employees are permitted to use personal devices for work purposes;
  • Acceptable and unacceptable use of personal devices for work purposes;
  • Your ownership of and right of access to all employer data on employees’ personal devices and employees’ lack of privacy rights in that data;
  • Your security and data protection protocols;
  • Your employees’ obligations with respect to maintaining the security of employer data (e.g., a provision requiring employees to protect all devices that contain employer data with a password or PIN);
  • A disclaimer that the employer is not responsible for the security of the employee’s personal data;
  • Reimbursement for the employee’s use of his or her personal devices; and
  • Rules and/or restrictions regarding work-related use of personal devices outside of working hours.
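As flagged above, here is a minimal, illustrative TypeScript sketch of the list captured as a structured record, so a policy’s coverage can be checked and audited. All field names and values are hypothetical, not drawn from any statute or template.

interface ByodPolicy {
  permittedRoles: string[];        // which employees may use personal devices
  acceptableUse: string[];         // acceptable work purposes for personal devices
  employerDataAccess: boolean;     // employer ownership of and access to work data
  securityProtocols: string[];     // security and data protection protocols
  deviceLockRequired: boolean;     // password/PIN on any device holding work data
  personalDataDisclaimer: boolean; // employer not responsible for employee's personal data
  reimbursement: string;           // reimbursement for use of personal devices
  outOfHoursRules: string;         // work-related use outside working hours
}

const policy: ByodPolicy = {
  permittedRoles: ["finance", "HR"],
  acceptableUse: ["work email", "approved file-sharing"],
  employerDataAccess: true,
  securityProtocols: ["device encryption", "VPN for remote access"],
  deviceLockRequired: true,
  personalDataDisclaimer: true,
  reimbursement: "monthly flat allowance",
  outOfHoursRules: "no processing of special category data outside working hours",
};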

Question 2.

Can an employer let employees know the identity of a co-worker who has contracted Covid19?

Answer.

We know that personal data includes an identifier such as a name. Processing includes, inter alia, “…disclosure by transmission, dissemination or otherwise making available…” Therefore, sharing the name of an employee who has contracted Covid-19 constitutes personal data processing. ‘Data concerning health’ under Art. 4 GDPR includes any personal data related to the physical or mental health of a natural person which reveal information about his/her health status. In this instance we have an employee’s name, which is ‘ordinary’ personal data, and data concerning health, which falls under ‘special category data’ under Art. 9 GDPR. Processing rules vary depending on the categorisation of the data involved. The legal bases for processing also differ, again depending on the category of the data.

In line with the confidentiality principle, the general rule is that the identity of an affected employee should not be disclosed to his/her colleagues or any other third parties without some legal basis or very strong justification. Experience tells us that the smaller the business, the more easily the identity of the co-worker will become known. Even in larger companies a person’s absence will be noticed and lead to unhelpful speculation, much of it on social media, as to who exactly has the virus. This speculation would be upsetting for those wrongly identified as having Covid-19. It is usually not necessary, and often will not serve a legitimate purpose, to disclose the identity of an employee with Covid-19. Employers are under a legal obligation to ensure the health and safety of employees (Safety, Health and Welfare at Work Act 2005). Informing employees of an infectious disease in the workplace would be a statutory duty (also a common law duty with an attached duty of care). Indeed, employers should carry out a risk assessment to identify the risks of a coronavirus outbreak at work, and implement steps to minimise that risk. That said, even in the absence of obligations under health and safety legislation, it would be expected that employees would be informed of any case of Covid-19 in a work setting in order that staff could self-isolate or work from home.

Any information disclosed should always be limited to the minimum necessary for a specific purpose. Someone’s identity, normally and generally, should be disclosed only where absolutely necessary and on a strict need-to-know basis. As evident from a notice by the DPC, the key word may be ‘generally’: “Any data that is processed must be treated in a confidential manner i.e. any communications to staff about the possible presence of coronavirus in the workplace should not generally identify any individual employees.” The DPC also states that “the identity of affected individuals should not be disclosed to any third parties or to their colleagues without a clear justification.” We note it does not state ‘without a clear legal basis under GDPR’. There is a world of difference between the two. Any test of what is ‘clear justification’ either does not exist, or is a subjective test. Who decides what a ‘clear justification’ is? Does a justification have to be set within a legal basis? The ultimate arbiter on this is the CJEU. It is a facile exercise to set out a justification for an action, rather than ground it on a legal basis.

From a practical perspective, to allay fears amongst all employees who are wondering how close their contact was with the infected employee, a common-sense approach would be to ascertain whether the infected employee would consent to his/her identity being made known to co-workers, with the aim of more effectively safeguarding those co-workers. For example, if a worker in a very large manufacturing plant became infected, it would cause undue stress to many employees if no other information was forthcoming from the employer. Employees will worry and wonder about how close they were to the infected individual. If an employer is too specific about the area of the plant where the infected employee worked, it may be tantamount to naming the individual. The circumstances and details of any particular case will determine the nature and quality of the dilemma facing the employer.

There is no avoiding the reality that not knowing who exactly in your place of employment has contracted Covid-19 will cause undue stress to that person’s co-workers. As noted many times, data protection rights under the GDPR, and data protection and privacy rights under the Charter and the European Convention on Human Rights respectively, involve a balancing exercise with other rights. In cases like the present one, the unprecedented circumstances suggest to us that a common-sense approach is an option that many will consider. It is an approach that carries some risk. In normal circumstances a person’s identity should not be disclosed, but in very extreme situations, such as the present one, a justifiable case could be made for releasing a person’s identity.

This action is still fraught with danger, and if an employee files a complaint it will be up to the DPC at first instance to give a decision. An employer’s justification in releasing the identity of the coronavirus victim may not withstand scrutiny by the DPC. The best advice is not to release a person’s identity unless you have obtained explicit written consent from the employee. Where explicit consent is not forthcoming, our advice would be to state that a co-worker, who cannot be named at this time, has contracted Covid-19. How much more information is conveyed to co-workers is dependent upon the particular, and possibly unique, circumstances of an individual situation.

There will be cases where, for example, an employer will conclude that the health and safety of all employees is best served by disclosing the identity of the employee with Covid-19. In such a situation, and because of the statutory duty on the employer by virtue of health and safety legislation, there is at least an arguable case. Remember, although set in a different work context, ‘the indications of impending harm to health arising from stress at work must be plain enough for any reasonable employer to realise he/she should do something about it’. (Hatton v Sutherland [2002] 2 All E.R. 1)

Ultimately, the roadblock may be formed by the twin concepts of ‘necessity’ and ‘proportionality’ that permeate the GDPR and EU law. Views on the issue are by no means unanimous across the EU. A recent guidance note from the European Data Protection Board says ‘employers should inform staff that colleagues may be infected, but they should only reveal their names if national law allows it; if they can justify that such a step is necessary; and only after the affected workers have been informed/consulted beforehand.’ Earlier we saw the slightly differing view in the DPC guidance. The UK ICO also takes a slightly different view: “You should keep staff informed about cases in your organisation. Remember, you probably don’t need to name individuals and you shouldn’t provide more information than necessary. You have an obligation to ensure the health and safety of your employees, as well as a duty of care. Data protection doesn’t prevent you doing this.” Again, the common thread is that the identity of affected individuals must not be disclosed to their colleagues or third parties without a clear justification.

The Appropriate Lawful Bases.

The HSE and other public health authorities will be seeking details concerning any Covid-19 case in any context. Certain information is always needed so that authorities can effectively carry out their functions. Covid-19 was recently declared a ‘notifiable’ infectious disease, and medical doctors are mandated to report cases to the Medical Officer under the Infectious Diseases (Amendment) Regulations 2020. There is no equivalent legislation covering employers; strangely, employers are not mandated to report infectious diseases to the Health and Safety Authority. Employees under the 2005 Act are mandated to report to their employer, or the employer’s nominated registered medical practitioner, if they become aware of any disease which affects their performance of work activities and could give rise to risks for the health, safety and welfare of others at work. A clear duty is imposed on all employees to protect themselves and others. Employers, for their part, are under a legal obligation under the 2005 Act to protect employees from issues that negatively affect their health and safety. Clearly, this could easily be construed to include the novel coronavirus, and it could act as a lawful basis for processing personal data.

Processing could also be justified on the basis of Art. 6(1)(d), namely that it is ‘necessary to protect the vital interests’ of the individual data subject (the employee) or of other persons (other employees or other people). An employer could also find a legal basis for processing the personal data under Art. 6(1)(f) GDPR, where “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party…” Where an employer relies on this legal basis, he/she should document the ‘legitimate interests assessment’ that has been made.

In certain cases the person’s identity will be needed. For example, authorities may need to interview the employee who has contracted the disease. Recital 46 GDPR states that “some types of processing may serve both important grounds of public interest (lawful under Art. 6(1)(e)) and the vital interests of the data subject (Art. 6(1)(d)), as for instance where processing is necessary for humanitarian purposes, including for monitoring epidemics and their spread…” Where the employer shares information, the sharing should be in compliance with the GDPR and, most especially, the principles. In many cases employees themselves may fully consent to having their identities made known, or they will make it known themselves. If so, in those cases the personal data will have been ‘manifestly made public’.

It is questionable whether the consent of an employee to processing of his/her own personal data would constitute valid consent. The point has not been definitively settled in the context of the employer/employee relationship, but Recital 43 makes it clear consent is not a valid legal ground “where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority…” GDPRXpert has not found any case law to support the view that an employer/employee relationship would satisfy the ‘clear imbalance’ test. Undoubtedly, the average employee could feel pressurised into giving consent. It is something that will fall for future decision on a case-by-case basis. What is noteworthy is that a reference to a clear imbalance in the context of an employment relationship, which had been included in an earlier draft of the GDPR, was deleted in the enacted regulation.

Health Data Processing

Where data concerning health are involved, the situation changes. As we know there is a general prohibition on the processing of ‘special category’ data, which includes data concerning health. There are a number of exceptions to this broad prohibition, including under Art.9 (2) GDPR and sections of the DPA 2018. These provide potential legal bases for processing health data for the purposes of Covid-19 containment. S.46 DPA 2018 and Art.9(2)(b) permit the processing of health data where necessary and proportionate for the purposes of exercising or performing any right or obligation under Irish employment law – employers are legally obliged to ensure the safety, health and welfare at work of their employees. Specific measures to safeguard the fundamental rights and interests of the data subject (employee) must be taken.

Perhaps the most appropriate legal basis for processing health data is found under Art.9(2)(i) GDPR and s.53 DPA 2018, both of which provide exceptions to the general rule. Here the processing is deemed necessary for reasons of public interest in the area of public health such as protecting against cross border threats to health. Both must be underpinned by law (EU/Member State) providing suitable and specific measures to safeguard rights and freedoms of the data subject (employee). Examples of suitable safeguards would be limitation on access to the data, strict time limits for erasure, and other measures such as adequate staff training to protect the data protection rights of individuals.

S.52 DPA 2018 and Art.9(2)(h) GDPR also offer a sound legal basis as both provide, inter alia, for processing for the purposes of preventative or occupational medicine, and for assessment of the working capacity of an employee. Necessity and proportionality are always underlying considerations.

Question 3.

Can employers ask employees, and visitors coming to the workplace, for travel and medical information?

Answer.

As noted earlier, employers are under a legal obligation to protect the health of their employees and to maintain a safe place of work (Safety, Health and Welfare at Work Act 2005). In their efforts to prevent or contain the spread of Covid-19 in the workplace, employers would therefore be justified in asking employees and visitors about recent travel, particularly where there is concern about possible travel to Covid-19 hotspots, and in asking them to disclose whether they have visited an affected area and/or are experiencing symptoms. If travel has taken place as part of an employee’s duties, those details are already known to the employer; the question then becomes one of asking about personal travel destinations and the presence of any Covid-19 symptoms.

In Ireland the DPC has issued recommendations on Covid-19, and these support the view that it is reasonable to ask an employee such questions. Implementation of more stringent requirements, such as a questionnaire, would require a strong justification based on necessity and proportionality and on an assessment of risk. It is advisable to be sensible when asking employees to provide personal information about their likelihood of risk, and not to ask for more than you genuinely need.

Of the 28 national data protection authorities of the European Union member states, some 20 have so far issued specific guidance on Covid-19 and data protection. Several core principles are beginning to emerge from this guidance:

  1. COVID-19 sensitive personal data, such as medical symptoms and diagnosis, travel history, and contacts with those who have been diagnosed, can be processed on the basis of safeguarding public health.
  2. The fact that an employee has tested positive for COVID-19 can be disclosed, but identifying information about the individual, in particular the individual’s name, should not be disclosed (a point illustrated in the sketch after this list).
  3. European DPAs have scrutinised, and in some cases discouraged or prohibited, mass surveillance techniques by data controllers, such as the use of questionnaires or temperature checks, other than those performed by health authorities.
  4. Security measures must still be implemented to protect COVID-19 personal data.
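Principle 2 above is, in effect, a redaction rule. A minimal sketch of what it might look like in practice, assuming hypothetical field names for the internal notice (nothing here is prescribed by the guidance itself):

# Fields assumed, purely for illustration, to be direct identifiers.
IDENTIFIERS = {"name", "employee_id", "email"}

def disclosure_notice(case):
    # Strip direct identifiers before informing staff that a colleague has
    # tested positive. Indirect fields (e.g. department) may still identify
    # someone in a small team, so what remains should also be reviewed.
    return {k: v for k, v in case.items() if k not in IDENTIFIERS}

case = {"name": "J. Murphy", "employee_id": "E042", "department": "Accounts",
        "last_on_site": "2020-03-10", "status": "tested positive"}
print(disclosure_notice(case))
# -> {'department': 'Accounts', 'last_on_site': '2020-03-10', 'status': 'tested positive'}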

The foregoing shows that some data protection issues in the context of the Covid-19 pandemic are complicated. The pandemic has exposed how interpretations of some articles of the GDPR vary across jurisdictions. Member states (MS) have been given some latitude in making changes and additions to the GDPR, but Covid-19 has revealed a lack of consistency in how portions of the GDPR are interpreted across the EU. This is something we will look at closely in the future as the pandemic spreads, with potentially lethal consequences, around the globe.

Patrick Rowland, GDPRXpert.ie.

We are GDPR and Data Protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie

The ongoing Covid-19 pandemic raises life and death questions.

The ongoing Covid-19 pandemic raises life and death questions all over the globe; against that backdrop, data protection concerns can appear trivial and insignificant. As the DPC has stated, “data protection law does not stand in the way of the provision of healthcare and the management of public health issues; nevertheless there are important considerations which should be taken into account when handling personal data in these contexts, particularly health and other sensitive data”. Some of the questions now being raised will remain contentious long after Covid-19 has been clinically controlled.
Identified and Identifiable


There was much debate and controversy surrounding the lack of specific geographic detail provided by health officials in relation to confirmed cases of Covid-19. One view was that the GDPR was being cited as a reason not to provide more precise details, as these could lead to someone’s identity being disclosed. Remember that under Art. 4(1), “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name…location data…or to one or more factors specific to the physical…cultural or social identity of that natural person;”. It is well established that non-disclosure of a person’s name, in and of itself, is not always a guarantee of anonymity.
Even where special category data processing is allowed by derogation, this does not mean there is any derogation from the applicability of the data protection principles. On the contrary, these are always applicable. In fact, it is especially important to abide by all the data protection principles in relation to these more ‘sensitive categories’ of data, to use the term from the old Data Protection Acts. Unquestionably, a person’s identity could quickly become public knowledge if a very precise geographical location were provided by the public health authorities. It would not be long before people would be able, by a process of elimination and observation (hopefully, not surveillance!), to identify people in a particular area who were in isolation because they had tested positive for the virus or had been in recent contact with someone who had.
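The identification-by-elimination risk is exactly why statisticians use small-cell suppression: counts are withheld for any area where the number of cases is so small that residents could work out who is involved. A minimal sketch, with a threshold and area names chosen purely for illustration (real statistical agencies set their own rules):

MIN_CELL = 5  # assumed suppression threshold, for illustration only

def publishable_counts(cases_by_area):
    # Suppress case counts for areas where the number is small enough
    # that individuals could be identified by local knowledge.
    return {area: (n if n >= MIN_CELL else "<5")
            for area, n in cases_by_area.items()}

print(publishable_counts({"South": 40, "East": 23, "Ballyduff": 2}))
# -> {'South': 40, 'East': 23, 'Ballyduff': '<5'}

The coarser the geography published, the larger each cell and the lower the re-identification risk, which is precisely the trade-off the health authorities are making.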
Location data


Although the health authorities are doing their best to ensure a person does not become identifiable, the possibility of this happening increases as the information disclosed becomes more detailed. Take the case of the unnamed school that was closed because students had recently returned from a school trip to Northern Italy. Health authorities consistently refused to name the school, despite the fact that it had been immediately identified on social media. Their policy of non-disclosure was rendered meaningless.

The truth is that this type of information becomes public knowledge very quickly. In the midst of a pandemic people are understandably more inquisitive, and it is quite possible that anyone who was on the school trip to Italy and succumbed to the virus would be identified in short order. What is clear so far is that the health authorities are determined to keep information to a minimum so that precise geographical locations are not revealed. This is why we have been hearing of a case ‘in the South’ or ‘in the East’, while no towns or cities were, at least initially, named. Some politicians would prefer more specifics on the locations of so-called clusters of infection.
Different views can be taken of this policy approach. One view is that naming the location precisely might, in combination with other information available to local residents, make an individual or individuals readily identifiable. This could cause panic in the immediate region and distress to patients and their families. Another view is that if the precise location were given, residents in proximity to that area might be on higher alert, leading to greater caution in their personal social interactions. The policy has been defended on other grounds by Dr. Tony Holohan. It is seen as designed to protect the privacy of individuals, on the basis that people are less likely to come forward if they fear their identity will be made known. That would be another hurdle in the race to quantify and track the extent of the pandemic.

For the public interest / of interest to the public

All views have their merits, but any view carries an underlying interpretation of what is ‘for the public interest’. Undoubtedly, the question is a subjective one, and in instances such as a public health emergency caused by a pandemic, what constitutes the “public interest” is properly evaluated by the health authorities and the government. Within this context, Art. 9(2)(h), Art. 9(2)(i) and s.53 DPA 2018 provide the specific exceptions to the general prohibition on processing of special categories of personal data, which include health data. Derogations from the general prohibition are allowed, but subject to suitable measures having been put in place to safeguard the fundamental rights and freedoms of data subjects. Such safeguards may include limitations on access to the data, strict time limits for erasure, and other measures such as adequate staff training to protect the data protection rights of individuals.

There are many lawful bases for processing personal data. Consent is one of them, but it is by no means the strongest: it can be withdrawn at any time. Indeed, the GDPR provides legal grounds enabling competent public health authorities (and employers) to process personal data in the context of epidemics without the need to obtain the consent of the data subject. This applies, for instance, where the processing of personal data is necessary for reasons of public interest in the area of public health, to protect vital interests (Arts. 6 and 9 GDPR), or to comply with another legal obligation.

A valid distinction needs to be made at the outset between what is “for the public interest” and what is “of interest to the public”. I am fairly certain it was Minister Simon Harris who was recently criticised for making a distinction between the two, but he was correct in his assessment. What he was trying to explain was that because some information is ‘of interest to the public’ does not mean its disclosure is made legitimate or justifiable by a motive of public interest. Would disclosing the information do more harm than good? This has elements of the harm principle of the utilitarian philosophy espoused especially by J.S. Mill and Jeremy Bentham: in essence, the lesser harm for the greater good. The courts and many statutes frequently refer to the public interest, but “there is no single satisfactory definition of what the public interest is” (see Kelleher, Privacy and Data Protection Law in Ireland, 2nd ed., at p. 175). It might be more incisive to simply ask what is in the best interests of the public at large.

In the context of a Freedom of Information case in an Australian Federal Court, Justice Brian Tamberlin wrote the following:
The public interest is not one homogenous undivided concept. It will often be multi-faceted and the decision-maker will have to consider and evaluate the relative weight of these facets before reaching a final conclusion as to where the public interest resides. This ultimate evaluation of the public interest will involve a determination of what are the relevant facets of the public interest that are competing and the comparative importance that ought to be given to them so that “the public interest” can be ascertained and served. In some circumstances, one or more considerations will be of such overriding significance that they will prevail over all others. In other circumstances, the competing considerations will be more finely balanced so that the outcome is not so clearly predictable. For example, in some contexts, interests such as public health, national security, anti-terrorism, defence or international obligations may be of overriding significance when compared with other considerations. (McKinnon v Secretary, Dept. of Treasury [2005] FCAFC)

The term eludes precise definition but at its core is concern with the welfare or well-being of the general public and society. Data protection law and GDPR have often to be balanced against other rights such as freedom of expression. Today we are seeing with Covid-19 government actions how the public interest motive in the area of public health far outweighs personal rights and freedoms. What is, or indeed what is not, in the public interest often depends on the context in which it is being examined.

Mr Justice Barrett in Dublin Waterworld v National Sports Campus Development Authority [2014] IEHC 518 (7 Nov 2014) stated, “disputes are likely to be of interest to the public but that does not make their resolution a matter of public interest”. S.53 DPA 2018 uses the terms “for public interest reasons in the area of public health including…”. The terminology of Art. 9(2)(i) is similar and refers to “processing necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health…”
Processing of special categories of personal data, including health data, is clearly permissible under the GDPR and the DPA 2018. “Processing” under the GDPR includes, amongst other things, “dissemination”, but this does not mean it is permissible to freely share the information with the general public. Dissemination, as a form of processing, must itself follow the data protection principles and respect, amongst others, the principle of purpose limitation.

If the personal data are initially collected (processed) for the public interest in the area of health, is the dissemination (for example, through the coronavirus daily briefings) in line with this original purpose? It is likely that the answer is yes. The best-informed view is that the dissemination simply represents another form of processing and the purpose remains the same. In any event, Art. 6(4) GDPR allows for a ‘compatibility (of purpose) test’ in situations where the processing is for a purpose other than that for which the data were collected and is not based on consent or on Union or Member State law. The concept of “public interest” at general law is wide-ranging and expansive. A classic dictum is that of Lord Hailsham, that “the categories of public interest are not closed” (D v National Society for the Prevention of Cruelty to Children [1978] AC 171 at 230).

There are… several different features and facets of interest which form the public interest. On the other hand, in the daily affairs of the community events occur which attract public attention. Such events of interest to the public may or may not be ones which are for the benefit of the community; it follows that such form of interest per se is not a facet of the public interest (DPP v Smith [1991] 1 VR 63 at 75).

The public interest is not the same as that which may be of interest to the public. We have seen in many previous blogs how data protection rights do not exist in isolation, nor do they trump other rights. At any time the Government can decide to be more forthcoming and more specific with information concerning Covid-19. The deciding factor will be whether it is in the public interest to do so. If that time ever comes, the government will still be mindful of the obligation to protect the anonymity of any individual who may have contracted the infection.
In an upcoming blog we will share common data protection concerns in the context of the coronavirus that have been raised by many of our clients through our website.

Patrick Rowland, GDPRXpert.ie
We are GDPR and data protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.
For more details visit www.gdprxpert.ie

GDPR Hasn’t Gone Away.

The GDPR hasn’t gone away. In truth, it is really just getting started, as regulators, though not all of them, become more assured in their policies and strategies for ensuring compliance.

In previous blog posts we looked at the first annual report from the DPC since the GDPR was introduced in May 2018. Following on from that, we evaluated the effectiveness of the GDPR roughly 15 months after its inception. The GDPR has once again been the subject of debate recently, this time from the perspective of enforcement. Most notably, there has been harsh criticism of the Irish DPC over a perceived reticence to impose fines. Whether this is justified is examined below.

Some Quick Stats

The most up-to-date information on the application of the GDPR throughout the EU/EEA includes the following:

  • More than 6,700 data breaches were notified to Ireland’s Data Protection Commission (DPC) last year, the second highest level of notifications recorded per capita across Europe.
  • Since its implementation in May 2018, the General Data Protection Regulation (GDPR) has led to over 160,000 data breach notifications across Europe, according to research from multinational law firm DLA Piper.
  • Of that total, about 100,000 were reported in 2019. The DLA Piper report showed the Netherlands topped the table with 40,647 data breach notifications, a per capita ratio of 147.2 notifications per 100,000 people.
  • Ireland had a per capita ratio of 132.52 notifications per 100,000 people, ranking second in the table, followed by Denmark.
  • European regulators have imposed €114 million in fines (for data breaches) under the GDPR regime to date, with a further €329 million in sanctions threatened. (See ‘Ireland ranked second in Europe for data breach notifications’.)
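For clarity, the per capita figures above are simply notification counts normalised by population. A back-of-the-envelope sketch of that arithmetic, using hypothetical inputs rather than the report’s exact underlying figures:

def per_100k(notifications, population):
    # Breach notifications per 100,000 residents.
    return notifications / population * 100_000

# Hypothetical inputs, for illustration only.
print(round(per_100k(6_700, 4_900_000), 2))  # -> 136.73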


Fines

Of most interest to data protection professionals are the type and amount of fines that have been issued to date. In this context it is worth remembering that the Irish DPC is the lead regulator for many companies such as Google, Twitter, Facebook, Microsoft, and others, due in part to the ‘one-stop-shop’ mechanism introduced under the GDPR. Based on the breach notification figures shown above, one might expect the Irish DPC to have issued numerous fines by now.

New figures compiled by the Italian data protection body Osservatorio di Federprivacy – which include data from official sources in 30 countries – show authorities in the EU/EEA imposed 190 fines in 2019. Italy was the most active data protection authority, with 30 actions last year, while the UK was the most punitive, with fines totalling €312 million, some 76 per cent of all sanctions issued. Among the companies facing fines are British Airways and Marriott, which are looking at bills totalling £183 million (€214.8 million) and £99 million respectively after being sanctioned by the UK’s Information Commissioner’s Office last year.

Only Ireland and Italy failed to impose any fines. On its face, a failure to impose fines is disconcerting and raises questions about the practical operation of the GDPR. One of the partners at DLA Piper who specialises in cyber security and data protection suggested fines have been low relative to “potential maximum fines” of €20 million ($22.2 million) or 4% of annual global turnover, “indicating that we are still in the early days of enforcement”. “We expect to see momentum build with more multi-million euro fines being imposed over the coming year as regulators ramp up their enforcement activity.”

More on Fines

While Ireland’s DPC has yet to fine anyone, the French regulator has seen fit to fine Google €50 million for failing to comply with GDPR obligations. Indeed, the French top the rankings for the level of fines imposed (€51 million), followed by the Germans (€24.5 million) and the Austrians (€18 million). There is no questioning the ability of the DPC to issue fines, but some are beginning to question its willingness to do so. In particular, the Italian regulator has taken the opportunity to level some criticism at the perceived lack of action by the DPC in Ireland. That regulator has tabulated figures, including data from official sources in 30 countries, showing authorities in the EU/EEA imposed 190 fines in 2019.

Italy itself was the most active data protection authority, with 30 actions last year, even though it was one of the lowest in terms of breach notification numbers. The UK was identified as the most punitive with fines totalling €312million representing 76% of all sanctions meted out. Federprivacy chairman Nicola Bernardi said the failure of the Irish Data Protection Commission to issue fines thus far is a concern given the large number of leading tech companies based here. He expressed concerns that technology companies may be treated with more leniency in Ireland than in other jurisdictions and called for greater consistency to be applied across the EU for dealing with sanctions.

So is the criticism justified?

The Irish DPC has 61 statutory inquiries under way, 21 of which are focused on tech multinational firms. These include Facebook (8), Twitter (3), Apple (3), Google (1) and LinkedIn (1). (See ‘Data Breaches in Ireland among highest in EU’, Adrian Weckler, Irish Independent, Jan. 20, 2020.) Informed sources have said the DPC is in the final stages of its investigation into WhatsApp over possible breaches of EU data privacy rules, with a draft decision expected to be circulated to other authorities to consider within weeks. This is the first of the commission’s many investigations to approach its end point, with delays blamed on complications that arise from pursuing companies that operate cross-border. Verdicts are expected in the Twitter and WhatsApp cases very soon, according to DPC officials. Helen Dixon has distanced herself from any speculation on the amount of any fines that may be imposed, while stating that the recent fine of $5 billion levied on Facebook in the U.S. by the FTC is unlikely to be repeated here.

In defence.


What is clear to informed data protection professionals such as GDPRXpert is that there are extenuating circumstances that explain the non-imposition of fines to date by the DPC. Undoubtedly, a major contributory factor has been the volume and complexity of current investigations; both factors have combined to delay final verdicts, and until a final verdict is rendered there can be no announcement of any fine. So any criticism must take account of the quantity, the nature and the attendant complexity of investigations that are still incomplete. As noted earlier, the cross-border nature of many of the investigations adds to that complexity; these investigations simply take time. Every investigation has to be placed within its own particular context. Going back to the breach notification stats, we saw that Denmark placed third in the table. This needs to be viewed with some detachment from its apparent face value.


Many of the breach notifications relate to sending the information of one data subject to the wrong recipient, often in an otherwise secure manner, so the majority of breaches are not severe. It is all too easy to make general assumptions from bare statistics. Context is crucial to a true understanding. Commenting on the country’s top-three position in the GDPR index, Allan Frank, an ICT security specialist at Datatilsynet, Denmark’s data protection regulator, said: “We don’t see Denmark as more prone to cyber-attack.” Instead, Frank said, the country’s public and private sectors were accustomed to “reporting to public authorities in different matters” – including data breaches – through a single web portal.

Earlier in the blog we saw that France had imposed the highest amount in fines (almost entirely accounted for by the Google fine), yet had a very low ranking for the number of breach notifications per capita. There is no direct relationship between breach notifications and the imposition of fines. It has more to do with the nature of a particular breach. There is no automatic fine for merely communicating a breach; what is more salient is whether there was an outright infringement of the Regulation that caused the data breach.


“The investigation of cross-border issues is highly complex and takes time to complete, highlighted by the fact that there have been very few decisions with fines issued under the GDPR in relation to cross-border investigations across all 28 EU supervisory authorities since the application of the GDPR in May 2018,” said deputy commissioner Graham Doyle. In principle, regulators can impose fines of 2% or, in some cases, 4% of global turnover. In practice, they will have to judge whether such a heavy penalty would stand up in court, said DLA Piper partner Ross McKean. “It’s going to take time – the regulators are going to be wary about going to 4% because they are going to get appealed,” McKean told Reuters. “And you lose credibility as a regulator if you’re blown up on appeal.” It therefore seems logical, and represents good practice on the part of the DPC, to complete the full investigative process before any discussion of fines is broached.

What we are likely to witness in future is fines being assessed more quickly, in light of the severity of the failure to comply with obligations under the GDPR. Data breach notifications are often the beginning of the fine process. The GDPR was designed with this in mind, and it is reflected in the framework of the Regulation. For example, Art. 83 sets out the appropriate maxima for fines based on the nature of the infringement, establishing a sort of hierarchy of infringements. The overriding requirement is that fines be ‘effective, proportionate and dissuasive’.

Art. 83(2)(a)-(k) lays out the factors and conditions to be considered when assessing the need for, or the appropriate amount of, a fine, if one is to be imposed. These are categorically delineated and leave few questions. Any fine can be imposed instead of, or in addition to, the corrective measures referred to in points (a) to (h) and (j) of Art. 58(2). Art. 83(5) lays down the upper limits of fines for infringement of certain provisions of the Regulation. Non-compliance with an order of the DPC can also attract a maximum fine of €20 million or, in the case of an undertaking, 4% of total worldwide turnover in the preceding year, whichever is greater.
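For an undertaking, the Art. 83(5) ceiling therefore works out as the greater of €20 million and 4% of total worldwide annual turnover. A quick sketch of that arithmetic, with a turnover figure invented purely for illustration:

def art83_5_cap_eur(worldwide_annual_turnover_eur):
    # Upper limit of an Art. 83(5) fine for an undertaking: the greater
    # of EUR 20 million and 4% of total worldwide annual turnover.
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

print(f"{art83_5_cap_eur(2_000_000_000):,.0f}")  # -> 80,000,000

The amount actually imposed in any given case would, of course, be set well within that ceiling by reference to the Art. 83(2) factors.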

In relation to the cases currently before the DPC, it is only proper and prudent to leave no stone unturned in any investigation, especially bearing in mind the substantial quantum of fines for which undertakings in particular may be liable. As noted above, the WhatsApp investigation is in its final stages, and it “is the first of the commission’s many investigations to approach its end point with delays blamed on complications that arise from pursuing companies that operate cross-border”. (Charlie Taylor, Irish Times, 20 Jan. 2020, ‘Ireland ranked second in Europe for data breach notifications’)

It seems to GDPRXpert that the DPC is in a kind of ‘no win’ situation. Had the DPC left the ‘big fish’ until later and gone after the ‘smaller fish’ (smaller companies and SMEs), criticism from vested interests would have been relentless. A popular view would have held that the DPC lacked the will to challenge Google, Facebook, Apple and the rest. Yet the DPC was not going on fishing expeditions with the investigations it commenced; there were valid reasons for them, many stemming from data breach notifications.

Nevertheless, there is a view that the DPC should have had a mixture of investigations in the early days of the GDPR. This would have sent out the message that GDPR compliance is expected from all, not just the ‘big boys’. There is validity to that view, with strong anecdotal evidence suggesting that smaller businesses, in particular, have not been giving the GDPR the attention it demands. Many feel the DPC is busy elsewhere. That may be true for now. What may be lost in all of this is that, if a business comes to the attention of the DPC through a data breach, that business will be expected to show exactly what it has done since May 2018! There will be no excuse for a failure to be in compliance so long after the introduction of the GDPR.

Patrick Rowland, GDPRXpert.ie

We are GDPR and Data Protection consultants with bases in Carlow/Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie


Public Accounts Committee’s Request for Information and GDPR

Last year the Public Accounts Committee sent a request for information to the Dept. of Finance in relation to fees charged to that department by barristers.
In a previous blog, data protection consultants GDPRXpert discussed examples of how the GDPR was used as an excuse for not supplying information in situations where supplying it was perfectly legitimate. Some examples showed how ill-informed people were, while others belonged at the farcical and ludicrous end of the spectrum. What we are examining today lies at the more nuanced end. Legitimate positions can be taken by both sides but, to repeat what we have stated previously, the GDPR does not exist in isolation; rather, it is about balancing rights and proportionality. Remember the removal of the visitor books from the heritage sites? If you wish to refresh your memory, see this GDPRXpert blog.

BACKGROUND

The Public Accounts Committee

The Committee of Public Accounts (PAC) is a standing committee of Dáil Éireann which focuses on ensuring public services are run efficiently and achieve value for money. It acts as a public spending watchdog and, by virtue of this role, has become one of the most powerful Oireachtas committees. It has a key role in ensuring accountability and transparency in the way government agencies allocate, spend and manage their finances, and in guaranteeing that the taxpayer receives value for money. The PAC is responsible for examining and reporting on reports of the Comptroller and Auditor General (C&AG) on departmental expenditure and certain other accounts. It also considers the C&AG’s reports on his or her examinations of economy and efficiency and of effectiveness evaluation systems, procedures and practices.

Despite a recent adverse court decision relating to its questioning of former Rehab Ireland CEO Angela Kerins, the committee can rightly claim to do an excellent oversight job on behalf of the Irish taxpayer. Our view is clear: that particular episode was caused by some overzealous committee members and an overzealous chairman, and ‘over the top’ is the most appropriate colloquialism to describe the treatment of Ms Kerins. Giving the judgment of the entire court, the Chief Justice stated that the actions of the PAC as a whole were such that they condoned the “significant departure” by at least three members of PAC from the terms of its invitation to Ms Kerins to appear before it. (See Irish Times, 29 May 2019, “Supreme Court says PAC treated Angela Kerins in ‘unlawful’ manner”.) The most consistent criticism stemmed from the manner in which PAC acted outside its remit and terms of reference.

Data protection consultants GDPRXpert.ie were impressed by the committee when it had Helen Dixon and some of her staff before it at a hearing in September of last year (2019); GDPRXpert.ie are making that link available here. At present, the committee has an excellent chairperson in Sean Fleming, and well-briefed, committed members.

Apple is happy to appeal

The Apple Money

There was much criticism from public representatives, the media and the general public when the Government decided to appeal the decision in the Apple case. Indeed, Fintan O’Toole described it as a disastrous miscalculation. The European Commission had found that Ireland had provided €13 billion to Apple in what, in the opinion of the Commission, represented illegal state aid under EU competition law. The Commission said Apple’s tax arrangements in Ireland gave it ‘a significant advantage over other businesses that are subject to the same national taxation rules’, violating EU state aid laws. Although the government had indicated back in 2016 its intention to appeal the decision, it was still compelled to collect the money owed. Over €14 billion (principal plus interest) was placed in an escrow account by Apple pending the conclusion of the appeal process. At the end of last year, the government confirmed that over €7 million had been spent on legal fees, consultancy fees and other related costs.

Money, money, money.


Bearing in mind the role of the PAC described earlier, it was to be expected that the committee would have questions about the use of public money in the context of this appeal. Legal fees formed the bulk of the costs associated with the appeal to date, and the appeal process is still not exhausted. There is a possibility that, depending on the result from the lower General Court, the case could yet end up before the CJEU and drag on for a few more years. The knowledge that this possibility was real may have sharpened the PAC’s desire for further information on the value-for-money aspect of the legal fees. The Dept. of Finance was responsible for the payment of the legal and other costs associated with the appeal.

The GDPR Perspective

Prior to the introduction of the GDPR there never seemed to be an impediment to the release of the legal fees charged by legal teams involved in, for example, the various tribunals over the years. Legal firms were named and their charges were public knowledge (thanks to the terms of reference and/or the FOI Act). A PAC report from January 2011 details how legal fees can reach exorbitant levels, and the vast amounts paid to individual legal professionals. Again, there is nothing unexpected or unusual in the PAC requesting information on barristers’ charges in relation to the Apple appeal.

What is surprising is the response of the Dept. of Finance to this request for information.
A response from the Dept. briefly outlined its reasons for non-compliance with the request. In essence, the Dept. cited the GDPR as the justification for not acceding to it. The rationale seems to be very simplistic and dogmatic:

  • the information is personal data under the GDPR;
  • we have a lawful basis to process personal data, but in this case our advice is not to share the data;
  • the individual right to privacy trumps any right the PAC may have to access the data; and
  • that’s our story and we’re sticking to it!

Individual’s right to privacy v the public interest


Some possible solutions

Names of tax defaulters are published by the Revenue Commissioners, who have a clear legal basis for doing so under the Taxes Consolidation Act. Despite being underpinned by legislation, this still represents an interference with privacy rights; crucially, it is not disproportionate and it is done in the public interest. It is arguable that it is much more invasive than a barrister’s fees being disclosed to the PAC. Any barrister doing legal work for government departments would expect that their fees could be reviewed by civil servants and others at some point in the future.
There are no confidentiality agreements regarding fees for legal work done for the State. Legal privilege is one thing; legal confidentiality over fees charged is quite another. Transparency and accountability are overriding factors when it comes to assessing value for taxpayer money spent.

The practice of disclosing the names of barristers, along with the fees paid to them by government departments and public bodies, is a longstanding one, and the refusal to disclose similar information here represents an unannounced change of practice. Citing the GDPR as the reason for this change is unjustified: the GDPR does not preclude information on any barrister’s fees being disclosed to the PAC.

…or the public interest, please

The routes available to the PAC

Art. 6(1)(f) GDPR provides an appropriate legal basis for the PAC to process the personal data concerned, i.e. the names of, and the fees charged by, individual barristers. It states that “processing is necessary for the purposes of the legitimate interests pursued by a controller or a third party except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject…” Here is a valid basis for the Dept. of Finance to furnish the details. The PAC is not a “public authority” for the purposes of the GDPR or the DPA 2018, and so the strict limitations on the use of the ‘legitimate interests’ basis do not apply. (See Recital 47 GDPR.)
Under s.60 of the Data Protection Act 2018, restrictions may be placed on the obligations of data controllers and the rights of data subjects for “important objectives of general public interest”. The rights and obligations referred to are those under Arts. 12-22 and Art. 34 GDPR. S.60(3)(c) DPA 2018 continues with restrictions where the personal data are kept “by the C&AG for the performance of his or her official functions”.


Bearing in mind the role of the C&AG (whose mission is to provide independent assurance that public funds and resources are used in accordance with the law, managed to good effect and properly accounted for, and to contribute to improvement in public administration), it is proper that the information the PAC is seeking would be available without question to the C&AG from the Dept. of Finance. It is certain that the C&AG would look favourably on any request from the PAC for the details of the legal charges it is seeking; there would be a clear understanding by the C&AG of the legitimacy of that request. Unlike the Dept. of Finance, the C&AG would not hide behind the GDPR.

If complications and confrontations continue in relation to PAC requests for information containing personal data, there is a longer-term measure that could be utilised: amending the Data Sharing and Governance Act 2019. The most appropriate amendment would be one that includes the PAC within the definition of “public body”. Personal data from other public bodies could then be shared with the PAC, with appropriate restrictions placed on the categories of data to be shared. Data sharing under the amended Act would be limited to what is necessary and proportionate to facilitate the proper functioning of the PAC in “ensuring public services are run efficiently and achieve value for money”.
However, it should never have to come to this, and it would not if departments such as the Dept. of Finance looked at such requests in light of the public interest and of the work the PAC does in the public interest. The PAC places transparency and accountability foremost in its quest to ensure that public money spent achieves value for money.

In a letter to the PAC, Deputy Commissioner Dara Walsh reiterated a view shared by many within the data protection community: the privacy interests of individual barristers do not trump or override the public interest in seeing how State money is being spent. “Barristers could have no expectation that the legal fees expended by the DPC as a public body would not be subject to parliamentary and public scrutiny,” he concluded. Furnishing the details of fees to the PAC may also serve to show whether or not there is any impropriety involved; simply put, that barrister A is not getting all the work.

Somewhat ironically, Graham Doyle, deputy data protection commissioner, has noted that the DPC itself was recently before the PAC and was asked about similar payments to third-party organisations and individual service providers, such as barristers. Not only did it provide the information on the companies, but it also gave a detailed breakdown on individual barristers, and this was after the introduction of the GDPR (https://www.irishexaminer.com/breakingnews/ireland/state-can-fully-disclose-apple-legal-bill-961631.html). The common-sense answer suggested by the PAC, and supported by the DPC, is that people tendering for such work be made aware that their payments will be publicly disclosed.
P.S. Considering that a general election has just been announced, we will repost a previous blog on the GDPR and elections. It is important that candidates and voters are aware of their rights and responsibilities at a time when personal data are being rapidly processed.
Patrick Rowland, GDPRXpert.ie

We are GDPR and Data Protection Consultants, with bases in Carlow/ Kilkenny and Mayo, offering a nationwide service.

For more details visit www.gdprxpert.ie
