Protecting your business is our business!

GDPRXpert is here to help you navigate the maze of new elements introduced by the GDPR and the ePrivacy Regulations. GDPRXpert specialises in data protection advice and consultancy, with a strong focus on the General Data Protection Regulation and the ePrivacy Regulations (S.I. No. 336 of 2011), under which the old ePrivacy Directive, 2002/58/EC, was transposed into Irish law. The Law Enforcement Directive (LED) is a very specific piece of EU legislation that runs parallel to the GDPR but lies outside its direct scope. Personal data processing for law enforcement purposes is properly scrutinised through the lens of the LED, as primarily transposed into Irish law through Part 5 of the Data Protection Act 2018. Such processing lies outside the scope of the GDPR. However, in cases of dispute or doubt, much will be interpreted in light of the GDPR, especially the GDPR principles.

In particular, since the introduction of the General Data Protection Regulation (GDPR), the data protection regulatory landscape has been irrevocably altered. A seismic shift has taken place because of the increase in the obligations and responsibilities now placed on many organisations. At the same time, the rights of the general public in relation to the processing of personal data have been strengthened.

You must be processing personal data to be subject to the GDPR. If this is the case, organisations involved in the processing of that personal data now carry a higher burden of obligations and responsibilities. Since May 2018, the processing of personal data in the context of certain electronic communications (including, amongst other things, unsolicited electronic communication made by phone, e-mail and SMS) has been subject to both the general rules set out in the GDPR and the more specific rules of the ePrivacy Regulations. The new EU ePrivacy Regulation is still pending but is unlikely to come into effect before 2021, as negotiations on the draft text are ongoing.

Our expertise lies in translating all of these complex legal and regulatory requirements into cost-effective, practical operational solutions for your business or organisation. The ultimate goal is compliance with all of the new regulatory requirements and data protection principles. To help you reach that goal, GDPRXpert will ensure you meet all the transparency and accountability criteria.



GDPR Introductory Consultation

GDPR Xpert will get you started on the road to compliance with an initial consultation.

Read More

Data Protection Impact Assessments

GDPR Xpert will carry out a full risk assessment based on the GDPR and all associated guidelines.

Read More

Data Protection Officer Outsourcing

GDPR Xpert will carry out the roles, functions & duties of the DPO in an independent manner.

Read More

Data Protection Audits

GDPR Xpert will conduct a comprehensive audit to verify the true level of compliance with the GDPR.

Read More

Staff Training and Re-Training

GDPR Xpert encourages employee training and/or retraining that is specific to your organisation.

Read More

Privacy Notices/Privacy Statements

GDPR Xpert will draft a Privacy Notice for your business or organisation that will comply with the GDPR.

Read More



Maybe your organisation does not process any personal data?

The definition of ‘personal data’ has been expanded under the new Regulation and now covers: “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”. (Art. 4(1), GDPR)

Maybe your organisation does process personal data?

This new definition of personal data covers many different types of personal data. If your organisation processes such data then you are subject to new responsibilities and obligations under the GDPR.
GDPRXpert is cognisant of these robust new obligations and responsibilities, and especially their positioning in the new data protection architecture. With a background in law, business and data protection, GDPRXpert is ideally qualified to offer professional advice in this complex area. Once it is clear that your organisation is processing personal data, the prudent thing to do is seek that expert advice. GDPRXpert is here to ensure that you and your organisation are processing that data in accordance with data protection principles, are transparent and accountable, and are compliant with all aspects of the new Regulation.

Remember! We are the GDPR and Data Protection Experts. You don’t need to be an expert. You just need to be compliant. Our expertise will get you there safely.

Recent Articles

16/07/2024 | Latest News

The DPC decision in the TikTok case has been welcomed by data protection practitioners. It is one we have been meaning to discuss but did not get around to until now. Also, the DPC Annual Report for 2023 has just been released, and we will examine it in detail in our upcoming July/August 2024 blog. As the report states, “2023 was a busy year in personal data rights protection. The year saw a significant increase in complaints dealt with by the Data Protection Commission (“DPC”) with record fines issued and corrective orders imposed following cross-border and national inquiries. More generally, there were a large number of data protection-related judgments from the Court of Justice of the European Union and continued domestic focus before the Irish courts”. Perhaps not coincidentally, 2023 was also a busy year for GDPRXpert and those operating a data protection advisory service. In particular, there was a high demand for the outsourced data protection officer service we provide. Naturally, many of the high-profile cases taken and concluded by the DPC in 2023 feature prominently in the report. One of these is the TikTok case, concluded in September 2023 with a fine of €345 million for TikTok. TikTok is a video-focused social media platform that allows registered users to create and share videos of varying duration and to communicate with others through messages. TTL states that TikTok is not a social network but rather a global entertainment platform that, at its core, was designed to enable users to create and share video content, enjoy videos from a variety of creators and otherwise express their creativity, such as by interacting with videos to express new perspectives and ideas. The TikTok platform is accessible via a standalone mobile phone application and can also be viewed as a web page from a web browser.
This case is worthy of some further analysis and comment because it is a transparent example of the whole process involved, and it is a long and resource-intensive process. The TikTok case has been welcomed by GDPR and data protection practitioners and many others, including fellow GDPR and data protection law experts across the EU. In previous blogs here we have emphasised the length of time it can take to bring a case to a conclusion. The length of time it can take to finalise investigations has been the subject of much criticism by some groups who are themselves familiar with the process and its complexities. They should know better, and they most likely do, but they persist in rushing to premature judgment in some cases and have done so very notably in one specific case. These same groups have their own underlying agenda, and their criticism of the DPC may not diminish despite clear, robust decisions such as the one in the TikTok case. One particular group we have criticised before seems to have happily deluded itself into believing it is best placed to monitor and defend the data protection rights of all data subjects. It does this at the cost of neglecting areas within its own more direct remit. Its misguided forays into areas already well served by the DPC have seen it succumb to the desire to take frivolous, and arguably vexatious, actions paid for by donors and the Irish taxpayer. In its most recent opportunistic legal venture challenging the DPC in the context of Google’s online bidding, the judge sent a clear message by ordering the group to pay the costs of the DPC. In that case it was seen to be totally out of its intellectual comfort zone, showing signs of a dearth of understanding of data protection law. The TikTok case stands as a testament to the quality of the investigations carried out by the DPC.
One must emphasise that the TikTok investigation was an own-volition inquiry but nevertheless involved the coordinated deployment of huge assets and resources. It may be that the nature of an own-volition inquiry makes additional resources necessary because of the extra scrutiny such inquiries attract by virtue of the GDPR. These types of cases take time; simple as that. This case does demonstrate the multitude of resources that must be strategically expended in order to bring such complex cases to the desired legal conclusion (the case was the subject of Binding Decision 2/2023 on the dispute submitted by the Irish SA regarding TikTok Technology Limited (Art. 65 GDPR), adopted on 2 August 2023). These cases are complicated even for data protection experts.

Preliminary Issues

Before the case proper, a few mainly procedural issues had to be clarified. The first was whether the DPC was competent to act as the Lead Supervisory Authority at all. Under Article 4(23) GDPR, cross-border processing is defined as meaning either: (a) processing of personal data which takes place in the context of the activities of establishments in more than one Member State of a controller or processor in the Union where the controller or processor is established in more than one Member State; or (b) processing of personal data which takes place in the context of the activities of a single establishment of a controller or processor in the Union but which substantially affects or is likely to substantially affect data subjects in more than one Member State. During the period 29 July 2020 to 31 December 2020, TikTok Technology Ltd (TTL) processed personal data in the context of the activities of a single establishment of a controller or processor in the Union, but in a manner which substantially affected or was likely to substantially affect data subjects in more than one Member State. TTL’s single establishment in Ireland is supported by affiliated entities in Germany, France, Poland, Italy, Spain and Sweden.
There was no doubt that the processing was cross-border. Turning to the question of whether the DPC was competent to act as lead supervisory authority in respect of the processing under examination, the DPC noted that Article 56(1) GDPR provides that the supervisory authority of the main establishment of a controller or processor shall be competent to act as lead supervisory authority pursuant to Article 60 GDPR. Having considered all of the above and the nature of the processing at issue, the DPC was satisfied that TTL is a data controller within the meaning of Article 4(7) GDPR in respect of the processing which is the subject of the inquiry. The DPC was further satisfied that TTL has its main establishment in Ireland for the purposes of the GDPR. As such, the DPC was satisfied that the requirements of Article 56 GDPR had been met in relation to the processing at issue, such that the DPC was competent to act as the lead supervisory authority (LSA) in respect of the cross-border processing under examination. The next hurdle to be crossed concerned TTL’s argument that the standards of compliance to which it was being held post-dated the relevant period of the inquiry. The argument was that the “Fundamentals for a Child-Oriented Approach to Data Processing” (published December 2021) were not in effect at the time of the processing giving rise to the inquiry and therefore constituted “an impermissible retrospective application of regulatory standards and a clear breach of fair procedures.” The DPC dismissed this by relying on the plain fact that the GDPR was in force at the time; the Fundamentals represented ancillary guidance to the GDPR, with which TTL had been obliged to comply since May 2018.
The Fundamentals referenced the GDPR principles that were in effect in 2018 and, although the Fundamentals were not in effect contemporaneously, they did not constitute any form of retrospective law-making. They are post-GDPR guidance principles only. The date of their release is immaterial and irrelevant. TTL’s compliance was to be assessed in the light of the GDPR and any guidance notes and material available during the relevant period.

Time for Substantive Issues

So then, what was the DPC actually investigating in this case? Material scope: this inquiry concerned the processing by TTL of personal data of registered child users of the TikTok platform and whether or not TTL had complied with its obligations under the GDPR as data controller. The 2018 Act provides that the term ‘child’ in the GDPR is to be taken as a reference to a person under the age of 18 years. TTL provides the TikTok platform to persons over the age of 13. As a result, the term ‘Child Users’ in this decision should be taken as a reference to registered TikTok users who are aged between 13 and 17 years old. As set out below, this inquiry also examined certain issues regarding TTL’s processing of personal data relating to children under the age of 13. In particular, this inquiry concerned two distinct sets of processing operations by TTL in the context of the TikTok platform, both of which constituted the processing of personal data as defined by Article 4(2) GDPR. The inquiry also examined the extent to which TTL complied with its transparency obligations under the GDPR. As highly experienced data protection law consultants, we can testify to the challenges organisations face in meeting the transparency standards set by the GDPR. The standards are high and will not be met without a strategy being devised and followed. Our GDPR audits often discover that organisations have no strategy at all.
Broadly, the first type of processing examined related to the processing of Child Users’ personal data in the context of the platform settings of the TikTok platform, both mobile-application and website based, and in particular the public-by-default processing of such platform settings in relation to Child Users’ accounts, videos, comments, ‘Duet’ and ‘Stitch’, downloading and ‘Family Pairing’. The second type of processing examined related to the processing by TTL of the personal data of children under the age of 13 in the context of the TikTok platform, both mobile-application and website based, in particular for the purposes of age verification. Finally, with regard to the processing of personal data of persons under the age of 18 in the context of the TikTok platform (including any such processing in connection with websites or applications which provide access to the TikTok platform), this inquiry also examined whether TTL had complied with its obligations to provide information to data subjects in the form and manner required by Articles 12(1), 13(1)(e), 13(2)(a), 13(2)(b) and 13(2)(f) GDPR.

Assessment of TTL’s Compliance with the GDPR and Corrective Powers

The Statement of Issues identified the matters for determination as part of the inquiry. Those issues concerned TTL’s compliance with the GDPR (and consideration of corrective powers), as follows. Firstly, in relation to platform settings:

• A. Whether, having regard to the default public settings applied to Child Users’ accounts, TTL implemented appropriate technical and organisational measures pursuant to Article 24 GDPR to ensure, and to be able to demonstrate, that its processing of Child Users’ personal data was performed in accordance with the GDPR;

• B.
Whether, having regard to the default public settings applied to Child Users’ accounts, TTL complied with its obligations under Articles 5(1)(c) and 25(1) GDPR to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what was necessary in relation to the purposes for which the data were processed, and to implement appropriate technical and organisational measures designed to implement the data minimisation principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the Regulation and protect the rights of data subjects;

• C. Whether, having regard to the public default settings applied to Child Users’ accounts, TTL complied with its obligations under Article 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which were necessary for each specific purpose of the processing were processed; and

• D. Whether, in circumstances where TTL’s platform settings allowed an unverified non-Child User to access and control a Child User’s platform settings, TTL complied with its obligations under Articles 5(1)(f) and 25(1) GDPR to ensure that Child Users’ personal data were processed in a manner that ensured appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical and organisational measures, and to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the Regulation and protect the rights of data subjects.
Secondly, in relation to age verification:

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Art. 24 GDPR to implement appropriate technical and organisational measures to ensure, and be able to demonstrate, that its processing of Child Users’ data was performed in accordance with the GDPR, including by implementing measures to guard against children under 13 years of age accessing the platform;

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Arts. 5(1)(b), 5(1)(c) and 25(1) GDPR to ensure that it collected Child Users’ personal data for specified, explicit and legitimate purposes and did not further process that data in a manner incompatible with those purposes; to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what was necessary in relation to the purposes for which the data were processed; and to implement appropriate technical and organisational measures designed to implement the purpose limitation and data minimisation principles in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, including by implementing measures to guard against children aged under 13 having access to the platform; and

• Whether, having regard to TTL’s requirement that users of TikTok should be 13 and over, TTL complied with its obligations under Art. 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which were necessary for each specific purpose of the processing were processed, including by implementing measures to guard against children aged under 13 accessing the platform.
Thirdly, in relation to transparency:

• Whether Child Users are appropriately made aware, as users of TikTok, of the various public and private account settings in accordance with Arts. 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61, and whether Child Users are able to determine the scope and consequences of registering as a user, and specifically that their profile will be defaulted to public.

These were the complicated issues that the DPC had to assess in light of the legal regime. The first substantive issue arose in regard to platform default settings, in the specific context of child users.

Issue 1: Assessment and consideration of matters concerning TTL’s compliance with Articles 5, 24 and 25 GDPR concerning its platform settings for users under the age of 18.

The root question was whether TTL had complied with its obligations under Articles 5(1)(c), 5(1)(f), 24 and 25 GDPR. Article 5(1)(c) GDPR provides that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Per Recital 39, this requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum. Personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means. In order to ensure that personal data are not kept longer than necessary, time limits should be established by the controller for erasure or for periodic review.
Article 5(1)(f) provides that personal data shall be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures.” Per Recital 39, personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to, or use of, personal data and the equipment used for the processing. Further, Article 24(1) provides: “Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.” Our GDPR and data protection advisory service always emphasises the importance attached to the technical and organisational measures that are implemented. It is not enough to implement measures; they must be demonstrably ‘effective’. To quote Recital 74 GDPR: “The responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established. In particular, the controller should be obliged to implement appropriate and effective measures and be able to demonstrate the compliance of processing activities with this Regulation, including the effectiveness of the measures. Those measures should take into account the nature, scope, context and purposes of the processing and the risk to the rights and freedoms of natural persons.” The GDPR audits we conduct will never overlook this potential pitfall. Unfortunately, it is missed by many.
In regard to Art. 25, the European Data Protection Board (EDPB) has published Guidelines on Data Protection by Design and by Default. These summarise Art. 25 GDPR as follows: “The core of the provision is to ensure appropriate and effective data protection both by design and by default, which means that controllers should be able to demonstrate that they have the appropriate measures and safeguards in the processing to ensure that the data protection principles and the rights and freedoms of data subjects are effective.” (European Data Protection Board, ‘Guidelines 4/2019 on Article 25 Data Protection by Design and by Default’ (20 October 2020)). As with the technical and organisational measures referred to above, it is the effectiveness of the implemented measures that is crucial. Each implemented measure should produce the intended results for the processing foreseen by the controller, and this has two consequences, as laid out in the EDPB Guidelines: “First, it means that Article 25 does not require the implementation of any specific technical and organisational measures, rather that the chosen measures and safeguards should be specific to the implementation of data protection principles into the particular processing in question. In doing so, the measures and safeguards should be designed to be robust and the controller should be able to implement further measures in order to scale to any increase in risk. Whether or not measures are effective will therefore depend on the context of the processing in question and an assessment of certain elements that should be taken into account when determining the means of processing.
…Second, controllers should be able to demonstrate that the principles have been maintained.” On a daily basis, our data protection law consultants are reminded of the necessary congruence between the technical and organisational measures incorporated into processing operations and the data protection principles, especially data minimisation. Article 25(2) GDPR requires data controllers to implement measures to ensure that, by default, the principle of data minimisation is respected, as follows: “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.” The obligation to implement the measures described in Art. 25(2) GDPR is referred to as data protection by default. By default, personal data must not be made accessible without the individual’s intervention to an indefinite number of natural persons. As we often remind clients, no specific measures are prescribed by the GDPR. If you have a toothache you want to take something to kill that pain, but experience will tell you the same medication may not be effective against pain in another location. It is, therefore, controllers who are best placed to know which measures have historically worked; they should assess the risk and implement what will work against an increased risk in the future. TTL responded to the Art.
5(1)(c) issue and submitted that the data minimisation principle did not mandate processing the absolute minimum but rather what was adequate, relevant and limited to what was necessary in relation to the purposes of the processing, and which respected the principle of proportionality. Their submission on Art. 24 was detailed and focused mainly on a few core points. The first was that Arts. 24(1) and 24(2) did not impose prescriptive obligations, as that would be inconsistent with the objective of the provisions themselves, which is to embed privacy compliance practices into the internal practices of an organisation in a manner appropriate to the processing activities undertaken by that particular organisation. TTL cited the Art. 29 Working Party as support for the argument that a one-size-fits-all approach would force controllers into structures that are ‘unfitting and ultimately fail’. TikTok took the view that the only option was ‘custom-built solutions’. By reasonable interpretation, the accountability obligations under Art. 24(1) could only be open-ended and non-prescriptive. TTL also argued that the GDPR does not mandate the exact means of achieving or demonstrating compliance with its requirements. Taking such a prescriptive approach would be inconsistent with the core objective of Art. 24 which, as we have just stated, is to embed privacy compliance into the internal practice of organisations in a manner that works for each organisation while remaining aligned with GDPR principles. Data protection consultants would agree with TTL that Art. 24 takes a much more holistic approach to data protection compliance, and indeed it has to. If it did not, there would have to be a formulaic, universal manner in which an organisation had to display compliance. We know that under Art. 24 GDPR controllers are required to implement technical and organisational measures to ensure that processing is carried out in accordance with the GDPR and to be able to demonstrate such compliance.
In order to meet this requirement, controllers must make an assessment of (1) the nature, scope, context and purposes of the processing and (2) the risks of varying likelihood and severity for the rights and freedoms of natural persons. TTL contended, therefore, that there was sufficient leeway within the confines of the Article to allow for some discretion as to what was an appropriate measure. They further argued that it is clear, therefore, that the appropriateness of the measures adopted must be informed by an assessment of the context and purposes of processing, as well as the risks which may result from the processing (if any). As explained above, TikTok’s stated mission is to inspire creativity and bring joy. The core purpose of the platform during the Relevant Period was to enable users to disseminate their own content and to show users content they were likely to find of interest. TikTok then moved to its position in relation to Art. 25, stating: “Article 25(1) GDPR does not solely focus on user controlled settings as a technical measure but also addresses technical measures more broadly (including ones that are not user controlled) and organisational measures. As such, TikTok as a data controller is afforded autonomy and appropriate latitude in determining the specific designs of its product.” The measures to be adopted under Article 25(1) GDPR should be commensurate with the risks posed by the processing, and those risks should be weighed by their likelihood and severity. TTL then looked to opinions from the EDPB to bolster its argument: “The European Data Protection Board (“EDPB”) Article 25 Data Protection by Design and Default Guidelines (“Article 25 Guidelines”) recognise that Article 25(1) GDPR does not envisage a one-size-fits-all approach to data protection.
The EDPB Article 25 Guidelines further state: “[W]hen performing the risk analysis for compliance with Article 25, the controller has to identify the risks to the rights of data subjects that a violation of the principles presents and determine their likelihood and severity in order to implement measures to effectively mitigate the identified risks.” Article 25(2) states, among other things, that “the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed”. Article 25(2) requires that, by default, only the personal data necessary for each specific purpose are processed. It is the responsibility of the controller to define the purpose of the processing and, by doing so, the controller also determines the scope of the processing required for that particular purpose. Article 25(2), therefore, requires default settings limited to the processing that is necessary to achieve the controller’s purpose. Article 25(2) is not prescriptive as to the type of technical and organisational measures that must be implemented to ensure data protection by default. The EDPB has recognised that a range of different measures, including enabling data subjects to intervene in the processing, could be involved, “depending on the context and risks associated with the processing in question”. The context of the processing is central to the consideration of what measures are appropriate in the given circumstances and to what extent they will implement data protection principles effectively. In particular, Article 25(2) does not require controllers to choose default settings which would subvert the core functionalities of their service. In order to comply with Article 25(1) GDPR, controllers are asked to weigh a multitude of broad and abstract concepts, assess the risks, and then determine “appropriate” measures.
Each of these elements is opaque and open to interpretation, and as a result, no two assessments performed in accordance with Article 25 will look the same. Article 25(1) requires “appropriate” measures, which, when applied to age verification, means that a controller is required to implement measures to determine the age of users with an appropriate, rather than absolute, level of certainty. Article 25(1) GDPR does not prescribe the appropriate technical and organisational measures designed to implement the data protection principles (including the data minimisation principle) that organisations are required to put in place. Controllers are similarly afforded autonomy and appropriate latitude under Article 25(2) GDPR in determining the appropriate measures for ensuring privacy by default. The EDPB, in its Article 25 Guidelines, explains that being appropriate means that the measures and necessary safeguards should be “suited to achieve the intended purpose, i.e. they must implement the data protection principles effectively” and that “the controller must verify the appropriateness of the measures for the particular processing in question”. Further, in considering whether the measures put in place by TikTok complied with Article 25(1) GDPR, account must be taken, in particular, of the “context and purposes of processing”. In this regard, full consideration must be given to the benefits of the relevant features to Users and their importance to the core purpose of TikTok during the Relevant Period as described above, which would have informed younger Users’ expectations, and to the measures and privacy settings designed to safeguard younger Users. All of these submissions had then to be taken into account by the DPC.
The first finding by the DPC: during the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. In this regard, the DPC was of the view that TTL failed to implement appropriate technical and organisational measures to ensure that, by default, only personal data which were necessary for TTL’s purpose of processing were processed. In particular, this processing was performed to a global extent and in circumstances where TTL did not implement measures to ensure that, by default, the social media content of Child Users was not made accessible (without the user’s intervention) to an indefinite number of natural persons. The DPC held that the above processing by TTL was contrary to the principle of data protection by design and default under Articles 25(1) and 25(2) GDPR, and contrary to the data minimisation principle under Article 5(1)(c) GDPR. The second finding by the DPC: during the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. The above processing posed severe possible risks to the rights and freedoms of Child Users. In circumstances where TTL did not properly take into account the risks posed by the above processing, the DPC took the position that TTL did not implement appropriate technical and organisational measures to ensure that the above processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR. The third finding by the DPC: during the Relevant Period, TTL implemented a platform setting, called ‘Family Pairing’, for Child Users whereby a non-Child User could pair their account to that of the Child User. This platform setting allowed the non-Child User to enable direct messages for Child Users above the age of 16.
The above processing posed severe possible risks to the rights and freedoms of Child Users. In circumstances where this processing did not ensure appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures, and where TTL failed to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, the DPC took the view that this processing was not performed in accordance with the GDPR, contrary to Article 5(1)(f) and Article 25(1) GDPR (at p. 42). The DPC also made an assessment and consideration of matters concerning age verification pursuant to Articles 24 and 25 GDPR. The fourth finding by the DPC: during the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. The above processing posed severe possible risks to the rights and freedoms of Child Users, and it also posed severe possible risks to the rights and freedoms of children under the age of 13 who gained access to the platform. In circumstances where TTL did not properly take into account the risks posed by the above processing to children under the age of 13, the DPC was of the view that TTL did not implement appropriate technical and organisational measures to ensure, and to be able to demonstrate, that the above processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR. The DPC also examined transparency requirements under Articles 5, 12 and 13 GDPR. At this stage we will not go through all of TTL’s submissions; we will conclude with the transparency obligations and the remaining DPC findings.
The first transparency obligation for consideration was whether Child Users, as users of the TikTok platform, were appropriately made aware by TTL (in a concise, transparent, intelligible and easily accessible form, using clear and plain language) of the various public and private account settings, in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, whether public or private. The second transparency obligation for consideration was whether Child Users, as users of the TikTok platform, were appropriately made aware by TTL of the public default setting, in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, and specifically that their profile would be defaulted to public. Finding 5: in circumstances where TTL did not provide Child Users with information on the recipients or categories of recipients of personal data, the DPC found that TTL had not complied with its obligations under Article 13(1)(e) GDPR. In circumstances where TTL did not provide Child Users with information on the scope and consequences of the public-by-default processing (that is, operating a social media network which, by default, allows the social media posts of Child Users to be seen by anyone) in a concise, transparent and intelligible manner and in a form that is easily accessible, using clear and plain language, in particular insofar as the very limited information provided did not make it clear at all that this would occur, the DPC found that TTL had not complied with its obligations under Article 12(1) GDPR.
Finding 6: for the reasons established by the EDPB in the Article 65 Decision, TTL infringed the principle of fairness pursuant to Article 5(1)(a) GDPR. TTL were ordered to bring their processing into compliance, received a reprimand and were fined a total of €345 million. We have provided a synopsis of the main issues in this case, but it is impossible to fully do justice to the effort on the part of the DPC to uphold and vindicate the rights of data subjects, often achieved despite the criticism levelled at the office from the usual suspects. However, it should give readers a glimpse into the complexity of some of the cases that land on the desk of the DPC. GDPRXpert offers a comprehensive data protection consultancy service with particular emphasis on the onerous responsibilities placed on organisations under the GDPR. Patrick Rowland for [...]
08/03/2023 Latest News: There is no doubt that the office of the DPC has moved from a GDPR guidance mode to a GDPR enforcement mode. This is hardly surprising considering the GDPR has now been in effect for over four years. This shift in focus is partly related to the reality that many high-profile investigations are largely complete, notwithstanding the likelihood of future appeals that will take up more time. GDPR and data protection law experts are aware of increased contact from the office of the DPC to organisations of varying size. In general, these contacts are the result of complaints from members of the public, who now seem more knowledgeable about their rights under the GDPR and general data protection law. Undoubtedly, there has been a consequential increase in complaints to the DPC, most especially emanating from individuals fearing breaches of their rights. An intense and sustained GDPR awareness-building programme by the DPC has been very successful, with the result that individuals are very knowledgeable concerning their GDPR rights and the responsibilities and obligations of data controllers. Anecdotal evidence suggests that security of personal data processing has been a consistent element of complaints. The very essence of any right to data protection is that a mechanism must be provided to ensure a data subject’s personal data are adequately protected. The GDPR takes a proactive, risk-based approach that requires the data controller to implement measures in order to minimise risks to personal data. Art. 24(1) GDPR provides that the controller must implement appropriate organisational and technical measures to ensure and demonstrate compliance with the GDPR; it also stipulates constant review and updating. Data breaches rank high on the risk scale.
Art. 32(2) GDPR stipulates that, in assessing the appropriate level of security, account shall be taken in particular of the risks presented by processing, especially “from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed”. These risks mandate a level of security appropriate to the harm that might result. Recital 75 GDPR lays out a comprehensive summation of the risks that the GDPR hopes to manage. Such a list cannot be an exhaustive one, as data processing technology is changing and evolving all the time. We know from Art. 24 GDPR that the onus is on the controller to determine the risks of varying likelihood and severity for the rights and freedoms of natural persons. Various risks to rights and freedoms may result from data processing that leads to physical, material or non-material damage. As referred to earlier in the text, Recital 75 sets out a mixture of potential ultimate consequences. These include, in particular, where the processing may give rise to discrimination, identity theft or fraud, financial loss, damage to reputation, loss of confidentiality of personal data protected by professional secrecy, unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage. There is the potential for deprivation of many rights and freedoms and a loss of control over one’s own data. Personal data may be processed in ways that reveal all sorts of sensitive data, including health, religion, political opinion, racial or ethnic origin, genetic data and many more. Determining these risks is aided by reference to the nature, scope, context and purposes of the processing. These criteria, in conjunction with knowledge of the available technology and the cost of implementing up-to-date technology, aid the controller in deciding what is ‘an appropriate level’ of security.
The level of security must also be balanced, in the sense that it is proportionate to the perceived/assessed risks to the rights and freedoms of natural persons. As the risk assessment determines the appropriate level of security to be incorporated into the process, it must be of a high quality and leave nothing to chance. Two very basic requirements must be met: any assessment must be objective, and it must take account of the likelihood and severity of the risk. This has to include an assessment of the risks that arise from the personal data processing itself, and any risks that would arise in the case of an actual data breach. Any objective assessment must at the start incorporate an analysis of the characteristics of the processing activity. Account needs to be taken of, for example: the origin of the data; the nature of the processing activity; the scope of the processing activity; the context and purpose of the processing; the likelihood and severity of risks; and the identification of the best technical and organisational measures to mitigate such risks. Looking at all these aspects will help the data controller establish the level of risk involved in all data processing operations. Best practice would dictate that a prudent controller maintains a record detailing processing operations, associated risks (risk levels will be assessed later) and the measures taken to address the risks identified. The objective risk assessment also forms part of the required information needed to maintain the record of processing activities under Art. 30 GDPR. Take the example below. Processing operation: external payroll for employees’ wages. Associated risk: the payroll service might have lax data protection procedures. Measures taken to mitigate risk: an assessment of the payroll service found high data protection security in place. Conclusion: no apparent risk. Somewhat ironically, the best way of assessing risks is to look at the actual causes of data breach incidents.
Data controllers strive to avoid data breaches, but the fact is that breaches provide the best evidence of the risks inherent in some processing operations. An assessment of these characteristics should contribute to establishing whether the particular data processing operation involves any risk and, if there is a risk, whether it is high or low in nature. The Regulation also recognises these risks when processing personal data and, in Art. 32(1) GDPR, places the responsibility on the controller and the processor to implement appropriate technical and organisational measures to secure personal data. The GDPR deliberately does not define which specific technical and organisational measures are considered suitable in each case, in order to accommodate individual factors. However, it gives the controller a catalogue of criteria to be considered when choosing methods to secure personal data: the state of the art, implementation costs and the nature, scope, context and purposes of the processing. In addition to these criteria, one always has to consider the severity of the risks to the rights and freedoms of the data subject and how likely those risks are to manifest. This basically boils down to the following: the higher the risks involved in the data processing and the more likely these are to manifest, the stronger and more numerous the security measures that have to be taken. Data controllers and data processors are also obliged to ensure that their staff and “other persons at the place of work” are aware of security measures and comply with them. The legal obligation to keep personal data secure applies to every data controller and data processor, regardless of size. While most of the information below applies to any such organisation, some aspects would only apply to a larger organisation using an IT network.
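To make the likelihood-and-severity weighing concrete, here is a minimal sketch of the kind of risk record described above. The field names, three-point scales and thresholds are illustrative assumptions, not anything prescribed by the GDPR or the DPC:

```python
from dataclasses import dataclass

# Illustrative three-point scales; real assessments may use finer gradations.
LIKELIHOOD = {"remote": 1, "possible": 2, "likely": 3}
SEVERITY = {"low": 1, "medium": 2, "high": 3}

@dataclass
class RiskRecord:
    operation: str    # the processing operation being assessed
    risk: str         # the risk identified for that operation
    likelihood: str   # "remote" / "possible" / "likely"
    severity: str     # "low" / "medium" / "high"
    mitigation: str   # measures taken to address the risk

    def score(self) -> int:
        # Weigh the risk by its likelihood and severity, as Arts. 24/32 require.
        return LIKELIHOOD[self.likelihood] * SEVERITY[self.severity]

    def level(self) -> str:
        s = self.score()
        return "high" if s >= 6 else "medium" if s >= 3 else "low"

register = [
    RiskRecord(
        operation="External payroll for employees' wages",
        risk="Payroll service might have lax data protection procedures",
        likelihood="remote",
        severity="high",
        mitigation="Assessment of the payroll service found strong security in place",
    ),
]

for r in register:
    print(f"{r.operation}: {r.level()} risk (score {r.score()})")
```

A register like this doubles as part of the record of processing activities that Art. 30 GDPR requires the controller to maintain.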
Issues that data controllers and data processors should consider when developing their security policies are set out below (much of the information in the next few sections is drawn from the DPC guidance note ‘Guidance for Controllers on Data Security’). Access control: a data controller has a duty to limit access to personal data on a “need to know” basis. Greater access limitations or controls should apply to more sensitive data. A data controller must be aware of the different users who access their systems/records and their requirements. The different types of users could include: staff at various seniority, operational or responsibility levels; third party contractors/data processors; customers; and business partners. The different requirements of each of these types of users have to be considered. It should not be a ‘one size fits all’ approach; rather, access privileges should be tailored to each user’s requirements. The nature of access allowed to an individual user should be set and reviewed on a regular basis. It should go without saying that individual staff members should, among other things, only have access to data which they require in order to perform their duties. Shared credentials (multiple individuals using a single username and password) should not be tolerated. Specific procedures, sometimes referred to as a “movers, leavers and joiners” policy, are required in all organisations with access to personal data to decide when to maintain, increase or restrict previous access where a user’s role changes. Access control must be supported by regular reviews to ensure that all authorised access to personal data is strictly necessary and justifiable for the performance of a function. IT administrator accounts with unrestricted access to personal data warrant special attention. Policies should be in place in regard to the vetting and oversight of the staff members allocated these accounts.
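The “need to know” principle described above can be sketched as a deny-by-default permission check. The roles and data categories below are hypothetical examples, not anything from the DPC guidance:

```python
# Each role maps to the minimum set of data categories its holders need;
# access to anything outside that set is denied by default.
ROLE_PERMISSIONS: dict[str, set[str]] = {
    "hr_officer": {"employee_records", "payroll"},
    "support_agent": {"customer_contact"},
    "contractor": set(),   # third parties start with no access at all
}

def can_access(role: str, data_category: str) -> bool:
    """Allow access only where the role's duties require the category."""
    return data_category in ROLE_PERMISSIONS.get(role, set())

assert can_access("hr_officer", "payroll")          # required for their duties
assert not can_access("support_agent", "payroll")   # outside their role
assert not can_access("unknown_role", "payroll")    # unrecognised roles get nothing
```

The design choice that matters here is that the default answer is “no”: a user gains access only when a policy explicitly grants it.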
A staff member with such responsibilities should have separate user and administrator accounts. Multiple independent levels of authentication may be appropriate where administrators have advanced or extra access to personal data, or where they have access to or control of others’ accounts or security data. All organisations, big and small, must guard against the potential downloading of personal data from the organisation’s own systems; this has to be strictly controlled. Such downloading can be blocked by technical means (disabling drives, isolating network areas or segments, etc.). Many organisations have taken a decision to block access to USB ports, having examined the inherent risks involved in leaving such ports open by default for all users. Access authentication: users should have a unique identifier, such as a password, passphrase, smart card, or other token, to allow access to personal data. These are just examples, not an exhaustive list; for instance, a biometric (e.g. a fingerprint, voice or retina scan) can also be used as a unique identifier. However, as biometrics in themselves raise serious data protection and privacy issues, their use should only be considered where other authentication methods are demonstrably insufficient. Passwords: passwords are a word or string of characters. A strong password should include a minimum of twelve characters (the longer the password, the harder it is for a computer to guess) and may contain one or more of the following: letters (upper and lower case); symbols (e.g. &, *, @, €, $, etc.); numbers (0–9); and punctuation (?, “, !). However, users should not be required to use a mix of many types of character, as a strong password can be created using only one type of character (e.g. letters) once it is sufficiently long and hard to guess (for computers as well as people). Passwords should be reasonably easy for the user to remember but very difficult for anyone else to guess. Examples might include: M1_s?n, “The_^v1at#r”!
(based on ‘My son, “the aviator”!’ with random characters replacing certain vowels or other letters) or Te@m5Rb@dp@55word5 (based on ‘Teams are bad passwords’ with numbers and symbols replacing certain letters). Please do not use these examples as actual passwords! Passwords should not contain values known to be commonly used or expected in passwords, or those which have been compromised. For example, users might be prevented from using passwords including, but not limited to: passwords obtained from previous breaches; dictionary words; repetitive or sequential characters (e.g. ‘aaaaaa’, ‘1234abcd’); and context-specific words, such as the name of the service, the username, or derivatives thereof. Passphrases: passphrases are similar to passwords, but represent a sentence or sequence of words. They should include twenty characters or more and may also include symbols, numbers and punctuation marks, e.g. “I Love the musical, The Sound of Music 2!” or Ilike2swim@thelocalswimingpool. Data controllers should enforce password complexity and length, such as through rules that ensure that weak passwords and reused passwords are rejected. Users should not be required to change their password or passphrase arbitrarily (e.g. too frequently), as this can actually reduce password security (for example, by increasing reliance on simple passwords or reusing passwords). However, users should be required to change their password or passphrase if there is evidence it has been compromised or revealed, or when there is some other change in risk. Data controllers should never store users’ passwords as plain text, but should use strong and irreversible cryptographic hashing and salting to protect them and to allow secure checking for login purposes. Data controllers should ensure that users are made aware that their password/passphrase is unique to them and must not be disclosed to anyone else. Shared credentials (where multiple users use the same login and password) should never be permitted.
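A minimal sketch of the two controls just described: rejecting weak or known-compromised passwords, and storing only a salted, irreversible hash rather than the password itself. It uses Python’s standard-library PBKDF2; the iteration count and the tiny blocklist are illustrative assumptions:

```python
import hashlib
import hmac
import secrets

COMMON_PASSWORDS = {"password", "123456", "qwerty"}  # stand-in for a breach-corpus check

def acceptable(password: str) -> bool:
    """Enforce minimum length and reject known-compromised values."""
    return len(password) >= 12 and password.lower() not in COMMON_PASSWORDS

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only (salt, hash); the plain-text password is never kept."""
    salt = secrets.token_bytes(16)  # a unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash from the login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, stored)

pw = "Ilike2swim@thelocalswimingpool"
assert acceptable(pw)
salt, stored = hash_password(pw)
assert verify_password(pw, salt, stored)
assert not verify_password("wrong-guess", salt, stored)
```

Because the salt is unique per user, two users with the same password end up with different stored hashes, defeating precomputed lookup tables.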
Vendor-supplied defaults for system passwords and other security parameters should never be left in place. Data controllers must ensure that partner organisations with access to their systems or personal data respect these controls. Where possible, data controllers should promote password diversity by reminding users of the risks associated with password reuse across other internet services. Multi-factor authentication: multi-factor authentication (MFA) refers to more than one identity factor being employed for access authentication. A commonly used option in many services is ‘2FA’, which means that two factors for authentication are used. For example, instead of just using a password of their choosing, a user may have a second factor such as a biometric (e.g. a fingerprint scanner), or an “out-of-band” or alternative communication channel may send a passcode to a secondary email address, phone number, or device. It should be noted, however, that some of these secondary channels are more secure than others. Devices such as smart cards or tokens, as well as standalone mobile apps, can be used as part of MFA, to provide authentication either by generating a code to be entered or by containing a chip that authenticates with the system being accessed. They may generate a PIN that is valid for a very short period of time. This is used in conjunction with a username and password to authenticate the user, and can reduce the risk of ‘brute force’ password attacks or attacks where passwords have been stolen. Automatic screen savers: most systems allow for screensavers to activate after a period of inactivity on a computer, requiring a password to re-establish access. This automatic lock activation is useful, as the alternative of manually locking a workstation requires positive action by the user every time he/she leaves the computer unattended. Regardless of which method an organisation employs, computers should be locked when unattended.
This applies not just to computers in public areas, but to all computers. It is pointless having an access control system in place if unattended computers may be accessed by any staff member, or where a shared password is used. Encryption: encryption is explicitly mentioned as one possible technical and organisational measure to secure data in the (non-exhaustive) list in Art. 32(1) GDPR. Again, the GDPR does not mention explicit encryption methods, in order to accommodate fast-paced technological progress. When choosing a method, one must also apply the catalogue of criteria described above. To answer the question of what is currently considered “state of the art”, data protection officers usually rely on the definitions set out in information security standards like ISO/IEC 27001 or other national IT security guidelines. Companies can reduce the probability of a data breach, and thus reduce the risk of fines in the future, if they choose to encrypt personal data. The processing of personal data is naturally associated with a certain degree of risk, especially nowadays, when cyber-attacks are nearly unavoidable for companies above a given size. Therefore, risk management plays an ever-larger role in IT security, and data encryption is suited, among other means, to managing those risks. In general, encryption refers to the procedure that converts clear text into unreadable ciphertext using a key, where the information only becomes readable again by using the correct key. This minimises the risk of an incident during data processing, as encrypted contents are basically unreadable for third parties who do not have the correct key. Encryption is the best way to protect data during transfer and one way to secure stored personal data. It also reduces the risk of abuse within a company, as access is limited only to authorised people with the right key. As with passwords, this measure is pointless unless the key to decrypt the data is kept secure.
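The round-trip just described, namely ciphertext that is unreadable without the key and readable again with it, can be illustrated as follows. This is a teaching sketch only (a SHA-256-based keystream XOR), not production cryptography; real systems should use a vetted, state-of-the-art cipher such as AES-GCM from an established library:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key and a per-message nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)            # fresh nonce for every message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    ks = keystream(key, nonce, len(body))
    return bytes(c ^ k for c, k in zip(body, ks))

key = secrets.token_bytes(32)                  # the key itself must be kept secure
message = b"employee payroll record"
sealed = encrypt(key, message)
assert decrypt(key, sealed) == message         # readable again only with the key
assert sealed[16:] != message                  # stored form is not the clear text
```

Note that losing the key makes the data unrecoverable and leaking it makes the encryption worthless, which is why the text stresses that the decryption key must be kept secure.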
Encryption of personal data has additional benefits for controllers and/or processors. For example, the loss of a state-of-the-art encrypted mobile storage medium which holds personal data is not necessarily considered a data breach which must be reported to the data protection authorities. In addition, if there is a data breach, the authorities must positively consider the use of encryption in their decision on whether, and in what amount, a fine is imposed, as per Art. 83(2)(c) GDPR. Anti-virus software: anti-virus software is required not only to prevent infection from the internet (either email or web-sourced) but also to prevent viruses that may be introduced from portable devices, such as memory sticks (the use of which should be strictly limited). No anti-virus package will prevent all infections, as such packages are only updated in response to known infections. It is essential that such software is updated on a regular basis and that policies support vigilance in regard to potential threats. A policy of not opening email attachments from unexpected sources can be a useful way of preventing infection. Firewalls: a firewall is essential where there is any external connectivity, either to other networks or to the internet. It is important that firewalls are properly configured, as they are a key weapon in combating unauthorised access attempts. The importance of firewalls has increased as organisations and individuals avail of “always-on” internet connections, exposing themselves to a greater possibility of attack. Software patching: patches are the latest updates from the creator of your operating system software or application software. They usually contain fixes for potential security concerns and can be an important tool in preventing hacking or malware attacks. Organisations should ensure that they have regular, consistent and comprehensive patch management procedures in place.
Where possible, before installing the very latest patches, it is good practice to install them in a test environment to ensure that they do not create other issues with your systems. A record should also be kept of the date and patch installed on a system. Remote access: where a staff member or contractor is allowed to access the network from a remote location (e.g. from home or from an off-site visit), such access creates a potential weakness in the system, not least when accessed from a wireless network. For this reason, the need for such access should be properly assessed and security measures reassessed before remote access is granted. If feasible, the access should be limited to specific IP addresses. Security should be the first consideration in granting access to partner organisations. Technical security measures, security assessments, contractual agreements in line with the requirements of the GDPR and the Data Protection Act 2018, and agreed standards of management of shared assets are all important aspects of managing this risk. It is the responsibility of the data controller to ensure that, regardless of the means by which a user remotely accesses their system, the security of the system cannot be compromised. Multi-factor authentication for such access should be considered in this context. Wireless networks: access to a server by means of a wireless connection can expose a network to attack. The physical environment in which such systems are operated may also be a factor in determining whether weaknesses in the system security exist. As with remote access, wireless networks should be assessed on security grounds rather than solely on apparent ease of use. Data controllers must ensure that adequate security is in place on the network through, for example, appropriate encryption measures or the specification of authorised devices. Particular vulnerabilities are associated with the use of third party unsecured WiFi networks (e.g.
those provided in airports, hotels, etc.). A device using such a network may be open to attacks from other machines on the network. A good firewall should be installed on the portable device to prevent such attacks, and the device should only connect to the network when necessary. When using unsecured WiFi to transmit personal or sensitive data, a secure web session should be in place to protect the data. Portable devices: laptops, USB keys, smartphones and other forms of portable device are especially vulnerable to theft and accidental loss. Where a data controller considers it essential to store personal data on a portable device, the device should be encrypted. Whole-disk encryption should be used to guard against the storage of files outside an encrypted segment of the disk. In the case of smartphones, a strong password should be required at start-up and also after several minutes of inactivity. When such a device is lost, steps should be taken immediately to ensure that the remote memory-wipe facility is activated. Staff allocated such devices should be familiar with the relevant procedures. Logs and audit trails: access control systems and security policies are undermined if the system cannot identify abuses. Consequently, a system should be able to identify the user who accessed a file and the time of the access. A log of alterations made, along with the author/editor, should also be created. Logs and audit trails can help in the effective administration of the security system and can deter staff members tempted to abuse the system. Staff should be informed that logging is in place and that user logs are regularly reviewed. Monitoring processes should focus not only on networks, operating systems, intrusion detection systems and firewalls, but should also include remote access services, web applications and databases.
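A minimal sketch of the audit trail just described, recording who accessed which file, when, and what they did, so that access can later be reviewed. The structure and field names are illustrative assumptions:

```python
from datetime import datetime, timezone

audit_trail: list[dict] = []   # in practice this would live in tamper-evident storage

def log_access(user: str, action: str, resource: str) -> None:
    """Record the user, the time of access, and the action taken."""
    audit_trail.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,       # e.g. "read" or "edit"
        "resource": resource,
    })

def accesses_to(resource: str) -> list[dict]:
    """Support regular review: pull every recorded access to one file."""
    return [e for e in audit_trail if e["resource"] == resource]

log_access("jbloggs", "read", "payroll_2023.xlsx")
log_access("jbloggs", "edit", "payroll_2023.xlsx")
assert [e["action"] for e in accesses_to("payroll_2023.xlsx")] == ["read", "edit"]
```

Even a simple structure like this answers the questions the text poses: which user touched the file, at what time, and whether an alteration was made.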
Logging systems can generate a great deal of information, so an automatic means of filtering it, such as a Security Information and Event Management (SIEM) system that alerts security staff to irregular audit trail entries, may assist in its effective use. An intrusion detection system (IDS) acts as an internal alarm system that monitors and reports on malicious activities on a network or system. Such systems also aim to detect attacks that originate from within the system. Any organisation processing large volumes of personal data should have an IDS deployed and activated. Where alerts/events are generated by any such systems, there must be a meaningful process in place to examine them in a timely fashion, in order to identify unusual activity and take immediate corrective action if there is an ongoing breach of security. Back-up systems: a back-up system is an essential means of recovering from the loss or destruction of data. While some back-up system should always be in place, the frequency and nature of back-ups will depend, amongst other factors, on the type of organisation and the nature of the data being processed. The security standards for back-up data are the same as for live data. Incident response plans: even with the best designed systems, mistakes can happen. As part of a data security policy, an organisation should anticipate what it would do if there were a data breach, so that it can be ready to respond. Some questions you might ask yourself: What would your organisation do if it had a data breach incident? Have you a policy in place that specifies what a data breach is? (It is not just lost USB keys/disks/laptops. It may include any loss of control over personal data entrusted to organisations, including inappropriate access to personal data on your systems, or the sending of personal data to the wrong individuals.) How would you know that your organisation had suffered a data breach? Do the organisation’s staff (at all levels) understand the implications of losing personal data?
Has your organisation specified whom staff should tell if they have lost control of personal data? Does your policy make clear who is responsible for dealing with an incident? Does your policy cover the requirements of mandatory breach reporting (where applicable) under the Data Protection Act 2018, the GDPR and/or the ePrivacy Regulations (SI 336/2011) (including the new availability and resilience requirements)?

The Human Factor

No matter what technical or physical controls are placed on a system, the most important security measure is to ensure that staff are aware of their responsibilities. Passwords should not be written down and left in convenient places; passwords should not be shared amongst colleagues; and unexpected email attachments should not be opened unless first screened by anti-virus software. Effective employee training about the risks of data compromise, staff members' role in preventing it, and how to respond in the event of problems can be a very effective line of defence. Many organisations set security policies and procedures but fail to implement them consistently. Running scenario-based training sessions may assist in effective training. Controls focused on individual and organisational accountability, and on ensuring that policies are carried out, are an important part of any system designed to protect personal data. Identify essential controls first and ensure that these are implemented across the organisation without exception. Once this is in place, move on to more advanced controls designed to mitigate the risks specific to the organisation and the type(s) of data processed. Data controllers must have procedures in place to manage staff turnover, including retrieval of data storage devices and prompt removal of access permissions. Many data breaches have a very avoidable cause. It is wise to start by looking at the simple things before focusing on the more complex.
GDPRXpert are GDPR and data protection law consultants offering expert advice on all aspects of data protection law compliance. Remember what we have just stated: the DPC's mode has shifted from guidance and education to enforcement. Patrick Rowland at [...]
06/06/2022

The DPC is under pressure from critics with varied agendas. The backdrop to all of this is criticism from late last year, especially criticism of the handling of some cases investigated by the DPC. Much of this criticism, perhaps unexpectedly, came from other EU data protection authorities, and more from journalists, commentators, interested actors (some perhaps with questionable motives) and data protection practitioners. In the foreground is the DPC's recently announced 2022-2027 regulatory strategy, which is timely in light of all the criticism levelled at the DPC of late. An annual report from the DPC is also informative, and its timing is helpful to the public and for the morale of staff at the DPC. It is available here for your convenience. In some opening remarks to the new regulatory strategy, the DPC acknowledges that some "challenges, against a backdrop of hugely increased public consciousness of data protection, have given rise to ambiguities of interpretation and application of the law that the DPC – along with its peer data protection authorities – must work to clarify". The regulatory strategy is also "being implemented in the very early years of radically reformed data protection legislation – in the form of the GDPR and ancillary Law Enforcement Directive – along with all the attendant interpretative challenges that such immense regulatory change usually produces". The DPC "recognises that it cannot achieve its ambitions alone – new partnerships and new ways of engaging will be necessary as we look towards a future of closer convergence. Nonetheless, the DPC builds from a position of confidence: we are a Regulatory office with ambition, a clear sense of purpose, a history of achievement, and a future of considerable promise". This last sentence will be questioned by many, and it will irritate even more.
It can hardly be stated with any conviction that the DPC builds from a position of confidence and a history of achievement. If anything, it may have underachieved, but that is for another time and forum. Another sentence, "The DPC is of the belief that compliance in general will be greatly improved when stakeholders are clear in their understanding of how the law is enforced", holds the essence of much of the criticism levelled at the DPC. Stakeholders are not clear in their understanding of how the law is enforced, and many of these stakeholders are other EU data protection authorities with their own expertise in knowing and applying the GDPR in particular. A stated goal in the new strategy is to bring clarity to stakeholders. This is easily stated as a goal to aim for but, again, it is sure to irritate many. Data protection law experts GDPRXPERT.ie take the view that the direct and primary strategy to achieve this goal will not be stated as easily. Two aspects will be key: having a clear goal, and knowing the most effective route to that goal. Compromises will be needed along the way, as there are so many stakeholders involved. Ultimately, although the most effective route may well be signposted, there will be necessary diversions along the way in the interest of overall stakeholder consensus. Pragmatism has to guide any strategy where there are legitimate and valid competing interpretations of any regulation. In this context, the co-operation and consistency mechanism under the GDPR is a clear example of necessary deviation from a legitimate route to a destination. GDPRXPERT previously looked at issues in relation to the workings of the office of the DPC (see [...]). We can say that some of the more recent criticism levelled at the office of the DPC is unjustified, some is justified, and more is premature.
There is premature criticism because the CJEU has yet to interpret some aspects of the GDPR, so that at least there is more clarity, if not 100% certainty, in relation to some contentious aspects of the Regulation. It is not surprising that some of the criticism of the DPC consistently emanates from the same sources, and one has to consider the possibility that some of these sources have their own particular agendas. Some of these agendas have very little to do with bolstering the data protection environment for any data subject. They are more to do with using supposed concerns about data protection as a shallow conduit for raising their own profiles. Repetitive sources that spring to mind include Max Schrems and his NOYB organisation, the Irish Council for Civil Liberties, and some MEPs who are sceptical about everything, including their own fellow sceptics.

Recent Criticism

From the inception of the GDPR it became clear that the role of the Irish DPC would be central in the overall enforcement of the Regulation. There was no way it could be other than central, with so many global tech companies headquartered here. Indeed, there is much anecdotal evidence of regulators in other jurisdictions not exactly wishing the office of the DPC the best in its GDPR enforcement endeavours. Unquestionably, some regulators in more populous countries felt slighted by the stronger role the DPC here was destined to play. This sentiment was shared by many MEPs motivated more by nationalistic fervour than by European commitment. A similar sentiment was expressed for years in relation to Ireland's corporation tax regime. It may be that the criticisms that gained media attention at the end of 2021 had their origins in similar nationalistic contexts.
For example, several members of the European Parliament (MEPs) recently wrote to the EU authorities and to the Minister for Justice, Helen McEntee, accusing the Irish DPC of lobbying for lower standards for big tech. This was vehemently denied by the DPC. It seems to expert data protection consultants GDPRXPERT that these criticisms were outlandish. At their core is an allegation that the DPC was acting in bad faith and devoid of objectivity. This was particularly the case in relation to the criticism directed at the DPC concerning some of its interactions with Facebook.

DPC response

The DPC responded by stating: "There has been considerable media coverage in recent days, alleging that the Data Protection Commission (DPC), acting in bad faith on foot of meetings it held with Facebook as part of its regulatory role, "lobbied" the European Data Protection Board (EDPB) with a view to achieving the adoption of guidelines by the EDPB on Article 6(1)(b) GDPR ('necessity for the performance of a contract'), in the best interests of the company. These allegations are utterly untrue. Issues relating to the proper legal interpretation of the necessity for the performance of a contract are presently the subject of an ongoing regulatory procedure. That procedure is currently being conducted under Article 60 of the GDPR. (Art. 60 GDPR sets out the scenarios for co-operation between the lead supervisory authority and other supervisory authorities concerned with the same issue.) More significantly and separately, Article 6(1)(b) is the subject of proceedings before the Court of Justice of the European Union." As referenced earlier, the objective in going to the CJEU is to get clarity on an issue where it may not be possible to get certainty. There is not always going to be a 'one size fits all' decision. From a pragmatic perspective, often the best that can be expected is clarity, as opposed to certainty.
Circumstances change from case to case, and so much within the GDPR admits of valid differing interpretations. Differing interpretations are consistent with a regulation that has to be interpreted and applied in light of other competing rights. It has also been alleged that the DPC approved/negotiated/jointly developed Facebook's position in relation to the legal basis for its processing operations. "This is absolutely incorrect and without basis in fact. To be clear, the DPC does not and never has, endorsed, jointly developed, approved or in any other way assented or consented to a controller's or processor's policies or position in relation to compliance with its data protection obligations" (DPC statement, 7 December 2021).

Form of the criticism

A central tenet of the criticism in relation to the DPC's dealings with Facebook on the issue of contract as a lawful basis for processing is that the DPC sought to subvert the procedures of the EDPB with a view to achieving the adoption of guidelines by the EDPB on Article 6(1)(b) favourable to the interests of a particular controller. As a long-established data protection advisory service, GDPRXPERT would reject that immediately. What can be accepted is that issues relating to the proper legal interpretation of the necessity for the performance of a contract are presently the subject of an ongoing regulatory procedure. The outcome of the procedures referred to above will of course bind controllers and regulators alike, and may determine whether, when, and in what circumstances Article 6(1)(b) may be relied on by controllers as providing a legal basis for certain of their personal data processing operations. Some critics of the DPC seem unaware themselves of the process that precedes the issuance of any guidelines from the European Data Protection Board on the interpretation of any concepts inherent in the GDPR.
Amongst other things, according to the DPC, the criticism also "reveals a lack of any kind of basic understanding of the workings of the EDPB, and how, through an iterative process, divergent views relating to complex issues of principle are typically reconciled through dialogue, and through respectful and mature engagement" (DPC statement, 7 December 2021). It is a common-sense expectation that stakeholders' compliance will improve when they are clearer in their understanding of how the law is enforced. This is especially so when regulations such as the GDPR are based on some very broad principles (see Art. 5 GDPR) rather than specifics, making them more open to interpretation than, for example, road traffic legislation. No wonder, then, that the DPC is involved in so much discussion with supervisory authorities in other EU countries, and with other stakeholders, with the goal of increasing certainty and stability in how data protection law is to be applied. If the DPC is doing this in good faith, can any criticism be justified? Increased certainty and stability are to the benefit of all stakeholders. This has been a consistent prong of attack for critics of the DPC, but what is often ignored or denied is the complex nature of many of the issues involved. As with all EU regulations, the CJEU is the final arbiter in the case of dispute, and the journey to that final point is long and arduous. Along the way, many opinion writers are guilty of unprofessionalism in simply repeating the same sources and quoting incorrect statistics. One of the most vocal critics of the DPC before and since the GDPR has been Max Schrems. Schrems No. 1 and No. 2 dragged on for years, but through no fault of the DPC. A closer look at these cases may enable a clearer understanding of the legal obstacles to be overcome in order to avoid the procedural pitfalls embodied in much of the GDPR.
They should also demonstrate the complicated nature of the legal and regulatory remit of the office of the DPC.

Warranted criticism?

The painstaking nature of the legal processes that must be gone through to make prudent adjudications on data protection law issues leads to unfair criticism being directed at the DPC. Such criticism usually takes the form of hastily made statements to the press citing inaction by the DPC. These statements are often perceived as facts by some journalists who lack an understanding of both the depth of the data protection issues involved and the consequences of a misapplication of the facts. Criticism is then often repeated without any objective analysis. Some of the analysis that is carried out is done by those least qualified to do it. Again, data protection law advisers GDPRXPERT.IE would reject such criticism as ill-informed at best and strategically devised at worst. If one takes the High Court judicial review taken by Facebook Ireland (FBI), the judgment runs to 200 pages and is deserving of more than a cursory perusal by some commentators who later claimed to be expertly knowledgeable. What was clear from their comments was that, in all likelihood, they had hurriedly skimmed through a few pages. Schrems seems to have taken matters somewhat personally and accused the DPC of failing to make a decision. In fact, much of his criticism seems to take the form of personality-based attacks rather than legal or principles-based arguments. He seems never to have forgotten that his original complaint was dismissed by the DPC on grounds of frivolity. This seemed a reasonable view at the time, and it was only in the aftermath of the full revelations by Edward Snowden that the scenario took on a different complexion.
However, what was lost on Schrems, who is himself a lawyer, was that, as pointed out by Birmingham J in O'N v McD IEHC 135, "the words frivolous and vexatious are terms of art, they are legal terms and they are not used in a pejorative sense. They merely mean the plaintiff has no reasonable chance of succeeding, and that, because there is no reasonable chance of success, it is frivolous to bring the case".

Defensive position

A position taken by the DPC was that once an adequacy decision (here, the Safe Harbours Agreement) had been issued, the office had no part in investigating a complaint. This has always been the accepted view in relation to Commission decisions. For example, in Schrems No. 1 the CJEU stressed that while national authorities retained the ability to examine EU decisions, the CJEU alone retained the authority to declare an EU act (such as a Commission decision) invalid. It was clearly not within any legal remit of the DPC to act as a quasi-court of last resort. Safe Harbours itself stood as testament to the adequacy of the protection of transfers of personal data to the US. Mr Justice Hogan in the High Court thought Schrems was objecting more 'to the terms of the Safe Harbour regime itself' than to the DPC's application of it (Schrems v DPC [2014] IEHC 310 (18 June 2014), para. 69). Another position taken by the DPC was that the complaint (the original) was essentially speculative and hypothetical in nature. However, Mr Justice Hogan took the view that there was no need to establish that the applicant even had grounds to suspect such a breach had occurred. It was enough to believe that the mere absence of controls might lead to a breach of the applicant's rights. If the matter had been governed solely by Irish law, significant issues would have arisen under the constitutional right to privacy.
Mr Justice Hogan referred the case to the CJEU partly on the basis that, 'in reality, on that key issue Irish law has been pre-empted by general EU law in the area…'. Facebook appealed this referral to the CJEU, but the Supreme Court found no reason to block it. The Court held it could not entertain an appeal over the fact of a referral itself; an appeal could only be directed at the 'facts' found by the High Court, and the Court held (through Clarke J) that it could only overturn those if it could be established that they were not sustainable in accordance with the relevant Irish jurisprudence. Having reached the CJEU, the decision known as Schrems I was finally delivered in October 2015. In that ruling the CJEU quashed the Commission's Decision, meaning that the US Safe Harbours could no longer be relied on as providing a legal basis for transfers of personal data to the US. It was in fact to enable a decision to be made that the DPC referred the case to the High Court in the first place. The idea was to get a decision once and for all from the CJEU. This course of action has been assessed as rational, prudent and proper by EU Justice Commissioner Didier Reynders. Indeed, the action was widely praised, although some (including some MEPs) did not agree. Commissioner Reynders was categorical in stating that the DPC faces "complex" matters, including in an issue over the targeting of ads by social media companies.

Support for DPC

The Irish regulator has supported the idea of allowing social media companies to target users with adverts without their consent, on the basis of rules governing the performance of a contract. Many other European national data regulators oppose this stance, and some have criticised the DPC's position.
However, Mr Reynders reminded the MEPs that the issue of advert targeting as it pertains to Facebook has already been referred to the EU's Court of Justice in the context of contract law, essentially backing the Irish regulator's decision to weigh the issue carefully. Remember this: at the very start, Hogan J in the High Court had stated that the DPC had "demonstrated scrupulous steadfastness to the letter of the 1995 Directive and the 2000 Decision". Commissioner Reynders also backed the DPC by dismissing criticism that it is running late in its handling of 98 per cent of cross-border privacy cases: "The figure about the proportion of cases dealt by the Irish DPC mentioned in your letter appears to be a misinterpretation of the statistic." Any criticism of the bona fides of the DPC regarding the original Schrems case was, and is, unjustified and cannot be legitimately upheld. Meanwhile, Facebook Inc. switched to "standard contractual clauses" to transfer EU data to the US, to which Schrems responded by updating his complaint with the DPC to include this new transfer mechanism, which launched Schrems No. 2. Although apparently not known to Mr Schrems at the time, FBI had identified three legal bases for ongoing transfers to the US: standard contractual clauses (SCCs), transfers with the consent of the data subject, and transfers under the contractual necessity derogation in the then Directive. In fact, it was the DPC that had invited Schrems to reformulate his complaint in light of the judgment in Schrems 1 and of the fact that Safe Harbours had been found to be invalid. On 1 December 2015 Schrems submitted a reformulated complaint using the validity of the standard contractual clauses as the prong of attack.

End in sight

In May 2016, the DPC issued a draft decision stating that it had formed the view, on a "preliminary basis", that Max Schrems's contention that the SCCs could not be relied on was well founded.
However, in the DPC's view, questions as to the validity of the SCCs could only be determined by the CJEU, not by the DPC or by national courts. The DPC therefore immediately commenced further proceedings in the Irish High Court seeking a reference to the CJEU. Following an unsuccessful appeal by Facebook Ireland Ltd (FBI) against the High Court's decision to refer a range of questions to the CJEU, these proceedings ultimately led to the CJEU's Schrems II ruling in July 2020. It is worth noting that in the meantime the European Commission had adopted a Decision that the Privacy Shield, as a replacement for the Safe Harbour, now ensured an adequate level of protection for personal data transferred from the EU to the US. Furthermore, the GDPR had replaced the former Data Protection Directive, coming into force in May 2018. The Schrems II ruling established that, although the SCCs remained valid, a data exporter in the EU making use of them is nevertheless required to verify, on a case-by-case basis and taking into account their terms, whether the law and practice in the destination country ensure essentially equivalent protection for any transferred data. At particular issue was the ability of public authorities in the destination country to conduct surveillance on the transferred data. The CJEU had specifically concluded that EU citizens had no effective way to challenge American government surveillance of their personal data after it had been sent to the US; such surveillance was legal under US law. If the data exporter is not able, as far as is necessary, to put in place sufficient supplementary measures to guarantee essentially equivalent protection, the data exporter, or, failing that, the relevant data protection authority, is required to suspend or end the transfers. In the ruling, the CJEU also went on to quash the Commission's Decision on the Privacy Shield.
Getting closer

In August 2020, the month following the CJEU's ruling in Schrems II, the DPC wrote to FBI enclosing the preliminary draft decision (PDD) that was subsequently the subject of FBI's judicial review application. This gave FBI 21 days to respond and stated that the DPC was now undertaking an "own-volition" inquiry into FBI's data transfers, after which it would return to Max Schrems's original, reformulated complaint. At that stage, the situation was that if the PDD were translated into a final decision, Facebook would be required to suspend its data transfers to the US. However, Max Schrems appears to have taken exception to his apparent exclusion from proceedings and submitted his own application to the Irish High Court for judicial review of the DPC's approach. A settlement was subsequently reached between the DPC and Max Schrems on this judicial review application, in which the DPC agreed, upon the Court's lifting of the stay on its investigation, to progress the handling of Max Schrems's complaint and its "own-volition" inquiry as expeditiously as possible. FBI took exception to the issuing of the PDD on several grounds relating to unfairness, including procedural unfairness, and instigated judicial review proceedings against the DPC, with a consequential stay on the DPC's "own-volition" inquiry. The case was heard by the Irish High Court in December.

What we now know

We now know that on 14 May 2021 the Irish High Court handed down its judgment in the judicial review case brought by Facebook Ireland Ltd (FBI) against the DPC, finding substantially in favour of the DPC. Although not entirely uncritical of the DPC, the judgment accepts the validity of the approach adopted by the DPC in its investigation of FBI's data transfers. The Court did agree with FBI that the issuing of the PDD and the surrounding procedures were open to judicial review, and therefore went on to consider, in some depth, each of the grounds of challenge advanced by FBI.
In the course of proceedings, FBI dropped two of these grounds. The remaining grounds were all rejected by the Court, the overall conclusion being that FBI had not established any basis for calling into question the validity of the DPC's processes. It is reported that on 20 May, with the consent of the parties, the Irish High Court formally lifted the stay on the DPC's "own-volition" inquiry. FBI still had the opportunity at that time to respond to the PDD but, unless it could satisfy the DPC as to the safeguards in place for its international transfers to the US, it seemed likely that, following the application of the GDPR's cooperation and consistency mechanism, FBI would be ordered to suspend these transfers.

Judgment time

The High Court judgment, when it came, was lengthy and detailed, running to nearly 200 pages. For the most part, it addressed procedural points which, given that the findings went against FBI, are unlikely to be particularly instructive for other businesses. The picture is also made more complex by the involvement of Max Schrems himself as a participant in the hearing and by his own application for judicial review against the DPC. This application was settled between the date of the High Court hearing and the date of the delivery of its judgment, and is referred to in the judgment. There is thus little to be gained from an in-depth analysis of all aspects of the judgment. It might nevertheless be of value to recap just where we are now, and how we have arrived there, in the long-running saga of Max Schrems and his challenges to FBI's international data transfers. Some high-level insights can also be drawn about the conduct of major investigations by data protection authorities, which might be instructive. Finally, there remains an open question as to where this now leaves other businesses that are continuing to transfer personal data to the US on the basis of the European Commission's Standard Contractual Clauses (SCCs).
It was clear from the judgment that the DPC's preliminary view, as set out in its PDD, was that: US law did not provide a level of protection essentially equivalent to that provided by EU law; the SCCs cannot compensate for the inadequate protection provided by US law; and FBI did not appear to have in place any supplemental measures which would compensate for that inadequate protection.

More support for DPC

The High Court judgment was undoubtedly welcome news for the embattled Irish Data Protection Commissioner, Helen Dixon. She had come, and continues to come, under fire from many sides, including the European Parliament's LIBE Committee, for what is perceived to be a reluctance to take sufficiently strong enforcement action against major tech companies that have their European headquarters in Ireland, and for her office's long processing times. The LIBE Committee even expressed disappointment with her decision to initiate the Schrems II case rather than triggering enforcement action against FBI. Furthermore, the Committee has called on the European Commission to launch infringement proceedings against Ireland for a failure to enforce the GDPR effectively. Against this background, the judicial review case makes clear that the DPC was right to have proceeded cautiously. When faced with enforcement action that seeks to significantly restrict their business models, or with multi-million-euro fines, businesses will understandably look for legitimate avenues to challenge the actions of data protection authorities, whether through more conventional appeals against sanctions or by means of judicial review. Any data protection authority needs to have a defensible position that it can put before the courts when challenged. The DPC has survived an examination by the Irish High Court, and there can be no denying that it was a comprehensive and searching examination.
Had the DPC been found to have jumped to conclusions without a thorough investigation, not to have offered FBI a proper opportunity to state its case, to have otherwise followed procedures that were unfair to any of the parties involved, or not to have been sufficiently transparent about those procedures, it would almost certainly have come a cropper. Ensuring the necessary procedural fairness requires time and effort by a data protection authority, whatever the political pressures on it might be. At the time there was a concerted and shallow choreography of criticism directed at the DPC. The High Court did recognise that there has to be some flexibility. A data protection authority can legitimately be expected to continue a well-established practice of following a particular procedure but, provided that it stays within the law, it does not have to do so rigidly. It can adapt its approach to the circumstances of particular cases. It is just that any procedural variation by the data protection authority has to be based on objective reasons and must not create unfairness or be unjust to the party under investigation. Nothing was written in stone. An earlier annual report, detailing inquiry procedures that Facebook sought to rely on, did state (at p. 28) that things were "subject to changes" (see DPC Annual Report 2018).

Rebuke for DPC

The DPC did not entirely escape criticism, though. The High Court judge, whilst finding in favour of the DPC in relation to an allegation of premature judgment, suggested that it might have been wiser for the Commissioner, Helen Dixon, to have been more circumspect in remarks she made in a conference address to the effect that the Schrems II ruling by the CJEU had given her no room for manoeuvre in relation to EU-US data transfers.
Again, whilst finding in favour of the DPC in relation to an allegation of a failure to respect the duty of candour, the judge expressed some misgivings about the DPC's failure to respond more fully to requests for information from FBI and suggested that it had acted in an overly defensive manner. The judge was actually at his most critical in relation to an allegation by the DPC that FBI's issuing of its proceedings amounted to an abuse of process and had been done for an improper purpose, that of buying time. Here the judge said that this was a serious allegation, that there was no basis for it, and that it ought never to have been made. Data protection commissioners have a difficult course to steer. On the one hand, they operate in an increasingly political environment and are expected to be champions of privacy and of data subject rights. On the other hand, when considering sanctions, they carry out quasi-judicial functions and have to act, and be seen to act, fairly and without bias. The High Court judgment confirms that Helen Dixon has managed to keep to the straight and narrow so far in the case in question, but the same might not have been true had she conceded more ground to her critics. What is clear, though, is the extent to which commissioners, when acting in their quasi-judicial capacity, can now be held accountable to the courts, and the extent to which affected businesses may be willing to exercise their rights to give effect to this accountability.
As the UK Commissioner, Elizabeth Denham, was also reminded when seeking to defend the ICO's imposition of a fine on Facebook in the wake of the Cambridge Analytica scandal, commissioners need to be very careful not to risk giving any appearance of rushing to premature judgment, to stick to their published procedures unless there are objective and fair reasons for departing from them, and not otherwise to risk bringing unfairness or injustice into their deliberations, whatever the wider pressures on them might be.

Back to the SCCs

It was the question of supplemental measures that attracted most interest from other businesses. Here it needs to be borne in mind that Facebook Inc. in the US qualifies as an electronic communications service provider and can therefore be ordered to make transferred data about specified non-US persons in its stored communications directly available to US public authorities; it is not just liable to have its communications to and from the EU intercepted in transit by such authorities. Although, in an effort to be helpful, the EDPB had produced recommendations on supplemental measures that could be adopted to enhance the SCCs, there remained a question in relation to EU-US transfers as to how to compensate sufficiently in practice for the inadequate protection provided by US law. We now know that the DPC went on to prepare a full draft decision and submitted it via the co-operation and consistency mechanism. The DPC had simultaneously been working on an inquiry into Facebook Ireland (now Meta Platforms) concerning a series of data breaches between 7 June 2018 and 4 December 2018. The inquiry examined the extent to which Meta Platforms complied with the requirements of GDPR Articles 5(1)(f), 5(2), 24(1) and 32(1) in relation to the processing of personal data relevant to the twelve breach notifications. As a result of its inquiry, the DPC found that Meta Platforms infringed Articles 5(2) and 24(1) GDPR.
The DPC found that Meta Platforms failed to have in place appropriate technical and organisational measures which would enable it to readily demonstrate the security measures that it implemented in practice to protect EU users’ data, in the context of the twelve personal data breaches.

Final destination in sight?

Given that the processing under examination constituted “cross-border” processing, the DPC’s decision was subject to the co-decision-making process outlined in Article 60 GDPR, and all of the other European supervisory authorities were engaged as co-decision-makers. While objections to the DPC’s draft decision were raised by two of the European supervisory authorities, consensus was achieved through further engagement between the DPC and the supervisory authorities concerned. Accordingly, the DPC’s decision represents the collective views of both the DPC and its counterpart supervisory authorities throughout the EU. On 15 March 2022 the DPC imposed a fine of €17 million on FBI (Meta Platforms). To any fair-minded neutral observer, any criticism of the DPC on the basis of inactivity is certainly unsustainable. Remember, in the context of the FBI/Schrems saga the DPC had to prepare its draft decision and submit this to the cooperation and consistency mechanism, which ultimately involved the need for an EDPB opinion. This process seldom results in a quick outcome, despite the time limits in the GDPR. Because of a sort of stalemate on the issue going back to February this year, there were moves by some national supervisory authorities to take a stand on the case. Some adopted a literal interpretation of the ruling. The French privacy regulator CNIL ruled that an unnamed website could not use Google Analytics because doing so involves the transfer of personal information from Europe to the U.S. in violation of the 2020 Schrems II decision.
The French decision came hot on the heels of a decision by Austria’s data protection authority to also ban a website from using the popular Google web analytics tool for the same reason, and presages a raft of decisions by other European data protection authorities on the use of these tools. The Dutch privacy agency warned last December that using Google Analytics may soon be illegal. Elsewhere, the Norwegian data watchdog has advised companies to start looking for alternatives to Google’s tools.

Almost there!

Data protection authorities, including the CNIL, are also expected to rule soon on the use of Facebook’s analytics tool, known as Facebook Connect. These decisions mark a significant clamp-down on data transfers, which form the lifeblood of the digital economy and represent billions of euros’ worth of transatlantic trade. GDPR and data protection advisory services, such as GDPRXpert, have had a large volume of enquiries from businesses regarding the future of transfers to the US. Much of the preceding has been widely reported elsewhere. Once the landmark decision began to bite, regulators across the bloc were left with few alternatives in adhering to the new rules. That began to prompt companies like Google, Microsoft and TikTok to consider the once unthinkable: storing ever more data in Europe. The potential negative effects of such moves may also have spurred the DPC to continue efforts to resolve the issue. After the 2021 High Court ruling against Facebook, the DPC was able to continue efforts to bring a conclusion to the protracted affair. This meant publishing a full draft decision and taking it through the cooperation and consistency mechanism under Art. 60 GDPR in order to set out a final decision. This is exactly what the DPC did. Throughout all of this, proper procedures were followed. Finally, the stage was reached where it was imperative that the Commission reached a decision on transfers.
Some measure of substantive adjustment to the existing Standard Contractual Clauses, or an entirely new mechanism, was needed to ensure uninterrupted data flows to the US. On 22nd March 2022 the European Commission and the Biden administration reached an agreement in principle, the Trans-Atlantic Data Privacy Framework Agreement. While the agreement is still “in principle” and specific details have yet to be determined, if approved, this agreement will reimplement an important legal mechanism necessary to facilitate data transfers between the European Union and the United States. Some have urged caution: “From a purely technical perspective, there’s no path forward for data transfers. That’s why we need a durable EU-U.S. data pact that can stand the test in court,” said Rob van Eijk, Europe managing director for the Future of Privacy Forum think tank.

More still to come

Very soon we will return to the issue to report on the evolving position on transfers to the US. We also note the DPC has attempted to clear the air on the criticisms directed at it and has issued a report on cross-border complaints where it sets out the actual statistics, instead of the alternative ones that, to an objective observer, were clearly distorted, biased and misleading. The actual report is available from the DPC. Here at GDPRXpert we are GDPR and data protection law experts offering businesses our vast expertise in addressing compliance issues. We are located in Carlow/Kilkenny and Mayo, offering a nationwide service. Call 0858754526 or 0599134259 to discuss your particular need. Patrick Rowland, [...]
11/11/2021 | Latest News

There are very few organisations that at some stage in the business relationship will not encounter some form of personal data breach to which the data controller will have to respond. Preparing for, and anticipating, a breach are the proactive parts. Encountering an actual breach is only the start. An active response must be diligent and prudent. This response must include an integral and strategic risk assessment, leading to mitigation of those risks in a timely manner. GDPR and data protection consultants GDPRXpert have advised extensively on data breaches. Art. 4(12) GDPR defines a ‘personal data breach’ as “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed.” Recital 85 GDPR warns that the breach may, if not addressed in an appropriate and timely fashion, result in physical, material or non-material damage. Examples abound but include: financial loss (high risk); identity theft (high risk); damage to reputation (high risk); loss of confidentiality of personal data protected by professional secrecy; fraud; and other economic or social disadvantages. There are three broad types of breaches, first outlined by the Art. 29 Working Party. The Working Party was a European Union data protection advisory group that preceded the European Data Protection Board (EDPB) and the GDPR. The ‘confidentiality breach’, where personal data are disclosed or accessed in an unauthorised or unlawful manner; the ‘integrity breach’, where personal data are altered in an unauthorised or unlawful manner; and the ‘availability breach’, where personal data are lost or destroyed in an unauthorised or accidental manner. Looking at something like “accidental or unlawful destruction” of personal data will lead one to discover that here the data no longer exist.
If the data are in existence at all, they no longer exist in a form accessible to the data controller. This is consistent with an ‘availability breach’. In a “loss” scenario the data controller lacks control of, or no longer has access to, the data. Think of ransomware with data encrypted, or the loss of an encryption key. The personal data are no longer in the possession of the controller at all and so there is an ‘availability breach’. With “alteration” of data the integrity of the data has been compromised, hence an ‘integrity’ breach. “Unauthorised disclosure of, or access to, personal data” is most commonly seen where data are disclosed to recipients not authorised to receive such data, and the result is a clear ‘confidentiality’ breach. Whatever the type or form of breach, action is required in a timely fashion. Under the GDPR two primary obligations are placed upon the controller: (a) notification of any personal data breach to the DPC, unless the data controller can demonstrate the breach is unlikely to result in a risk to data subjects; and (b) communication of that breach to data subjects, where the breach is likely to result in a high risk to data subjects. GDPRXpert, acting as outsourced DPO, has conducted many data breach analyses. This experience leads to the solid conclusion that very few breaches are the same. However, one aspect that does not change is the breach notification procedure, as this is clearly set out in the GDPR and the Data Protection Act 2018. First of all, let’s look at the procedure where the breach is to be notified to the DPC.

Data Breach Notification to the DPC

The controller is compelled to notify the DPC of a personal data breach unless the breach is unlikely to result in a risk to the rights and freedoms of a natural person. GDPR and data protection consultants can attest to the tricky, subjective nature of the assessment of what is ‘unlikely’.
Should the decision be that a risk is not unlikely, then the controller has to notify the DPC. Notification to the DPC has to take place without undue delay and, where feasible, no later than 72 hours after the controller has become aware of the breach. In a situation where the DPC is not notified within the 72-hour time frame, reasons for the delay must be given. Accountability requirements under Art. 5(2) GDPR will kick in, meaning that the controller in the context of a data breach will have to demonstrate compliance with the other principles of data processing, including Art. 5(1)(f), ‘integrity and confidentiality’, i.e., that personal data have been “processed in a manner that ensures appropriate security of the personal data”. On top of this, under Art. 33(5), controllers must document all information relating to the breach so that the DPC can have evidence of their compliance with the notification obligations under Art. 33. A controller should be regarded as having become ‘aware’ of the breach when they have a reasonable degree of certainty that a security incident has occurred and compromised personal data. Don’t forget that, in order to comply with their obligations under Article 5(2) (the principle of accountability), as well as the requirement to record relevant information under Article 33(5), controllers should be able to demonstrate to the DPC when and how they became aware of a personal data breach. Controllers, as part of their internal breach procedures, should have a system in place for recording how and when they become aware of a breach. Allied to this is the necessity of being able to show their methodology in assessing the potential risks posed by the breach. Controllers need to show a coherent methodology to explain their decision-making. N.B. This does not mean the DPC will accept it as being sound or reasonable, but the controller will, in all likelihood, be seen to have at least been acting in good faith.
The default position for controllers is that all data breaches should be notified to the DPC, except for those where the controller has assessed the breach as being unlikely to present any risk to data subjects. The controller MUST show why they reached this conclusion. Documentation should include the details of how the controller assessed the likelihood and severity of the risk to the rights and freedoms of the data subject. In all situations of recognised breaches, even ones that do not require notification to the DPC, the legal onus is always on the controller to record at least the basic details of the breach, the assessment thereof, its effects, and the steps taken in response, as required by Article 33(5) GDPR. (This is often forgotten by controllers and missed by many more. Be careful!) To state the patently obvious: to know whether or not the breach is one that should be notified to the DPC, the controller must first be aware of the data breach itself. Once aware of the breach, the clock is ticking. As we just touched on, before deciding whether there is a need to notify the DPC concerning a breach, the controller must make an adequate assessment of the risks posed by the data breach. This is not an exact science, but more a judgement process. In this process there are some factors and particular aspects that demand scrutiny. The assessment has to be set in the knowledge that there are risks that impact negatively not just on the right to data protection and privacy, but often on many other rights, such as free speech and freedom of movement.
Factors that controllers should take into account when engaging in such an assessment include, but are not limited to: the type and nature of the personal data (including whether it contains sensitive, or ‘special category’, personal data); the circumstances of the personal data breach; whether or not personal data had been protected by appropriate technical protection measures, such as encryption or pseudonymisation; the ease of direct or indirect identification of the affected data subjects; the likelihood of reversal of pseudonymisation or loss of confidentiality; the likelihood of identity fraud, financial loss, or other forms of misuse of the personal data; whether the personal data could be, or are likely to be, used maliciously; the likelihood that the breach could result in, and the severity of, physical, material or non-material damage to data subjects; and whether the breach could result in discrimination, damage to reputation or harm to data subjects’ other fundamental rights. Once the controller has made the risk assessment and concludes there is a need to notify the DPC, the notification must at least: describe the nature of the personal data breach, including, where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned; communicate the name and contact details of the data protection officer (DPO) or other contact point where more information can be obtained; describe the likely consequences of the personal data breach; and describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.
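The minimum contents of a DPC notification listed above can be captured as a simple checklist structure. The sketch below is purely illustrative; the field names are our own shorthand for the Art. 33(3) items, not terms from the GDPR or any official form:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BreachNotification:
    """Illustrative checklist of the minimum contents of a notification
    to the DPC under Art. 33(3) GDPR. All field names are hypothetical."""
    nature_of_breach: str                # Art. 33(3)(a): what happened
    data_subject_categories: list        # categories of data subjects affected
    approx_data_subjects: Optional[int]  # approximate number, where possible
    record_categories: list              # categories of personal data records
    approx_records: Optional[int]        # approximate number of records
    dpo_contact: str                     # Art. 33(3)(b): DPO or other contact point
    likely_consequences: str             # Art. 33(3)(c): likely consequences
    measures_taken_or_proposed: str      # Art. 33(3)(d): response and mitigation
    awareness_details: str = ""          # DPC-recommended: how/when awareness arose
```

Treating the notification as a structured record like this also makes it easy to spot, before submission, that a mandatory element (for example the DPO contact point) is missing.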
To assist the DPC in assessing compliance with the requirement to notify ‘without undue delay’, as well as the principle of accountability, the DPC recommends that controllers include, in their initial notification, information on how and when they became aware of the personal data breach, along with an explanation for any delay, if applicable. Where, and in so far as, it is not possible to give all the foregoing information at the same time, the information may be provided in phases without undue further delay.

Data Breach Notification to Data Subjects

As referenced earlier, there is also an obligation placed on controllers to notify the data subject of a data breach “where that personal data breach is likely to result in a high risk to the rights and freedoms of natural persons” (Art. 34(1) GDPR). Where the risk is immediate and needs to be mitigated, prompt action is required in communicating with the data subject (see Recital 86). The need to implement appropriate measures against continuing or similar personal data breaches may justify more time (Recital 86). Where there is a need for notification to the data subject, Art. 34(2) mandates that the communication describe in clear and plain language the nature of the personal data breach and contain at least (i.e. at a minimum) the information contained in points (b), (c) and (d) of Art. 33(3). Where a controller has not notified the data subject, the supervisory authority, having considered the likelihood of a high risk resulting from the breach, may either require the controller to communicate the breach or decide that any of the conditions (a), (b) or (c) of Art. 34(3) outlined below have been met. The controller has implemented appropriate technical and organisational protection measures, such as rendering the data unintelligible to any person not authorised to access it, e.g.
encryption; the controller has taken subsequent measures so that the high risk is no longer likely to materialise; and it would involve disproportionate effort to communicate directly with every data subject, in which case a public communication suffices. In a case where the controller deems it necessary to communicate the breach to the data subject, the controller will also be communicating it to the DPC. This is on the logical basis that if a breach is ‘likely to result in a high risk to the rights and freedoms of natural persons’ (must notify the data subject), then, by implication, the same breach cannot be ‘unlikely to result in a risk to rights and freedoms’ (so also notify the DPC). If it is likely to pose a high risk, it can hardly be unlikely to pose a risk, which is lower than a high risk. Any ‘risk’, or ‘risk simpliciter’ as some like to call it, must be of a type that is lower than a ‘high risk’. There is clearly a higher threshold for notification to the data subject. Whilst there is no obligation on controllers to communicate a personal data breach to affected data subjects where it is not likely to result in a high risk to them, controllers are nevertheless free to communicate a breach to data subjects where it may still be in their interests or appropriate to do so, in the context of that particular breach. While the notification should be made to the data subject as soon as reasonably feasible, sometimes it may be advisable to delay notification. A common example is where a controller is made aware that a criminal investigation may be pending and early notification may prejudice such an investigation. In this scenario, a delay on the advice of law enforcement authorities would be justifiable. Once it becomes clear that any notification is no longer prejudicial to an investigation, the data subject should be promptly informed.
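The two-tier logic just described, where any likely risk triggers notification to the DPC and a likely high risk additionally triggers notification to data subjects, can be sketched as a simple decision helper. This is a hypothetical illustration of the reasoning only, not legal advice or an official DPC tool:

```python
def notification_duties(risk_likely: bool, high_risk_likely: bool) -> list:
    """Return the notification duties under Arts. 33-34 GDPR implied by a
    controller's risk assessment (illustrative sketch only).

    - Breach likely to pose a risk      -> notify the DPC (Art. 33(1)).
    - Breach likely to pose a high risk -> also notify data subjects (Art. 34(1)).
    A breach likely to pose a high risk cannot be 'unlikely to result in a
    risk', so data-subject notification always implies DPC notification.
    """
    if high_risk_likely and not risk_likely:
        raise ValueError("inconsistent assessment: a high risk implies a risk")
    duties = []
    if risk_likely:
        duties.append("notify DPC without undue delay (within 72 hours where feasible)")
    if high_risk_likely:
        duties.append("communicate breach to data subjects without undue delay")
    if not duties:
        duties.append("record internally under Art. 33(5); no notification required")
    return duties
```

Note that even the “no notification” branch is not a no-op: as discussed above, Art. 33(5) still requires the breach and the reasoning behind the assessment to be documented internally.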
We have seen earlier that, once the breach has been detected and the risks assessed, the controller may be obliged to notify the DPC and the data subject. This depends most of all on the conclusion reached after the risk assessment. We also looked earlier at some factors to be taken into account when conducting the risk assessment. The risk assessment has to be an objective assessment. It must judge the severity and likelihood of the risks. In the context of the ePrivacy Directive, the EU Agency for Network and Information Security (ENISA) produced recommendations for a data breach severity assessment. Within this, the severity of three different factors is to be considered. Assessing the severity of the risk: Factor X, the type of data that was breached, can have a value of 1-4; Factor Y, the ease with which a data subject can be identified, is assigned a value of 1 or lower; Factor Z, the specific circumstances of the breach, can have a value of 0.5 or lower. No assessment modality has yet been adopted for the GDPR, but this method is a useful guide to help quantify risk severity. Despite all this, any risk assessment remains more of an art than a science. (Consequently, a key element of any data security policy is being able, where possible, to prevent a breach and, where it nevertheless occurs, to react to it in a timely manner.)

Record Keeping Obligations

Regardless of whether or not a breach needs to be notified to the supervisory authority, the controller must keep documentation of all breaches, as Article 33(5) explains: “The controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken. That documentation shall enable the supervisory authority to verify compliance with this Article.” Therefore, controllers must bear in mind that the onus is on them to ensure that they continue to document how any personal data breaches that arise are dealt with.
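The ENISA factors mentioned above are combined into a single score, SE = X × Y + Z, which is then mapped to a severity band. The sketch below assumes the commonly cited band thresholds from the ENISA 2013 recommendations (low below 2, medium from 2, high from 3, very high from 4); treat both the formula and the bands as a guide only, since no assessment modality has been formally adopted for the GDPR:

```python
def breach_severity(data_type_x: float, identifiability_y: float,
                    circumstances_z: float) -> tuple:
    """Sketch of the ENISA breach-severity score SE = X * Y + Z.

    X: type of data breached (1-4)
    Y: ease of identifying the data subject (1 or lower)
    Z: specific circumstances of the breach (0.5 or lower)
    Band thresholds are assumed from the ENISA 2013 recommendations.
    """
    se = data_type_x * identifiability_y + circumstances_z
    if se < 2:
        band = "low"
    elif se < 3:
        band = "medium"
    elif se < 4:
        band = "high"
    else:
        band = "very high"
    return se, band
```

For example, highly sensitive data (X = 4) about a fully identifiable subject (Y = 1) breached in aggravating circumstances (Z = 0.5) scores 4.5, i.e. ‘very high’, while low-sensitivity data about a hard-to-identify subject scores in the ‘low’ band.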
This is linked to the accountability principle of the GDPR, contained in Article 5(2). The purpose of recording non-notifiable breaches, as well as notifiable breaches, also relates to the controller’s obligations under Article 24, and the supervisory authority can request to see these records. Controllers are therefore encouraged to establish an internal register of breaches, regardless of whether they are required to notify or not. Whilst it is up to the controller to determine what method and structure to use when documenting a breach, in terms of recordable information there are key elements that should be included in all cases. As is required by Article 33(5), the controller needs to record details concerning the breach, which should include its causes, what took place and the personal data affected. It should also include the effects and consequences of the breach, along with the remedial action taken by the controller. The old Art. 29 WP guidelines recommend that the controller also document its reasoning for the decisions taken in response to a breach. In particular, if a breach is not notified, a justification for that decision should be documented. This should include reasons why the controller considers the breach is unlikely to result in a risk to the rights and freedoms of individuals. Alternatively, if the controller considers that any of the conditions in Article 34(3) are met, then it should be able to provide appropriate evidence that this is the case. (The conditions under Art. 34(3) are those that make notification to data subjects unnecessary, as seen earlier.) Where the controller does notify a breach to the supervisory authority, but the notification is delayed, the controller must be able to provide reasons for that delay; documentation relating to this could help to demonstrate that the delay in reporting is justified and not excessive.
Where the controller communicates a breach to the affected individuals, it should be transparent about the breach and communicate in an effective and timely manner. Accordingly, it would help the controller to demonstrate accountability and compliance by retaining evidence of such communication. To aid compliance with Articles 33 and 34, it would be advantageous to both controllers and processors to have a documented notification procedure in place, setting out the process to follow once a breach has been detected, including how to contain, manage and recover from the incident, as well as assessing risk and notifying the breach. In this regard, to show compliance with the GDPR it might also be useful to demonstrate that employees have been informed about the existence of such procedures and mechanisms and that they know how to react to breaches. It should be noted that failure to properly document a breach can lead to the supervisory authority exercising its powers under Article 58 and/or imposing an administrative fine in accordance with Article 83. Much of the foregoing information is also available at the DPC website. The DPC has also recently updated its data breach notification form. At the same site you will find useful tips on avoiding data breaches. Here at GDPRXpert we are GDPR and data protection consultants with vast expertise helping businesses first fully recognise, and then properly react to, data breaches. We are located in Carlow/Kilkenny and Mayo, offering a nationwide service. Call 0858754526 or 0599134259 to discuss your particular need. Patrick Rowland, [...]
02/09/2021 | Latest News

A Data Protection Impact Assessment (DPIA) is one of the most important tasks that, in certain circumstances, is prescribed under the GDPR. Non-compliance with DPIA requirements can lead to the imposition of fines by the DPC. Any reputable data protection consultancy should have qualified, certified and experienced data protection professionals available to carry out DPIAs on your behalf. At GDPRXpert we routinely undertake DPIAs as part of our services. This service is available nationwide. Data protection consultants have found that even in the cases where a DPIA is not mandatory, it is always an advisable course of action.

What is a data protection impact assessment?

A Data Protection Impact Assessment is a process specifically designed to identify, quantify and mitigate the risks involved in a processing operation. It does this primarily by assessing the necessity and proportionality of the processing and putting a strong emphasis on managing the risks to the rights and freedoms of natural persons resulting from the processing of personal data. Therefore, an essential ingredient in any DPIA mix is a measured assessment of the risks to those rights and freedoms, and a determination of the appropriate measures to address them. At the heart of the DPIA is its role as a conduit of accountability that works to enable controllers to comply with their requirements under the GDPR. By using this accountability tool a controller can demonstrate that all appropriate measures have been taken to ensure compliance with the Regulation. In essence, the DPIA is the building block used to construct and demonstrate compliance. Data protection consultants will provide the foundation for you to build a compliant business structure.

DPIA Content

The Article 29 Working Party elaborates on the details.
The GDPR does not formally define the concept of a DPIA as such, but its minimal content is specified by Article 35(7) as follows: “(a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller; (b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes; (c) an assessment of the risks to the rights and freedoms of data subjects; and (d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data, and to demonstrate compliance with this Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned”. Recital 84 goes on to clarify the role in the following terms: “In order to enhance compliance with this Regulation where processing activities are likely to result in a high risk to the rights and freedoms of natural persons, the controller should be responsible for the carrying-out of a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk”. The same Recital continues: “The outcome of the assessment should be taken into account when determining the appropriate measures to be taken in order to demonstrate that the processing of personal data complies with this Regulation”.

Is a Data Protection Impact Assessment Mandatory?

A DPIA is not mandatory for every personal data processing operation. Indeed, the risk-based approach inherent in the GDPR requires only that a DPIA be carried out when the processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)). Certainty is not required: a likelihood of high risk is enough, and inherently high-risk processing should attract more scrutiny.
Article 35(3) states a DPIA “shall in particular be required in the case of: (a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person; (b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10; or (c) a systematic monitoring of a publicly accessible area on a large scale”. The words ‘in particular’ in Art. 35(3) signify that the list is deliberately non-exhaustive. One practical consequence is that there will be cases that do not fall neatly into any ‘high risk’ category, yet pose a quantifiably high risk. Making the assessment of whether a DPIA is mandatory or not, in itself, involves a risk assessment, or a sort of mini DPIA. What is ‘likely to result in high risks’? How is the ‘high risk’ to be assessed? Recital 84 places emphasis on evaluating the ‘origin, nature, particularity and severity of the risk.’ A general backdrop to the high-risk potential includes aspects such as the nature, the context, the scope and the purposes of the processing. Prudent advice from the Art. 29 WP Guidelines is that where it is not clear whether a DPIA is required, a DPIA should nonetheless be carried out to help data controllers comply with data protection law.

Some Other Criteria for a DPIA

There is then what might be called ‘an assessment before an assessment’. Art. 35(4) envisages the establishment of a list of processing operations that would guide controllers in their scrutiny of operations that may require a DPIA. Art.
29 WP lays out the relevant criteria to be considered in this regard: Evaluation or scoring, including profiling and predicting, especially from “aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements” (Recitals 71 and 91). Examples of this could include a bank that screens its customers against a credit reference database, or a biotechnology company offering genetic tests directly to consumers in order to assess and predict disease/health risks, or a company building behavioural or marketing profiles based on usage or navigation on its website; Automated decision-making with legal or similar significant effect: processing that aims at taking decisions on data subjects producing “legal effects concerning the natural person” or which “similarly significantly affects the natural person” (Article 35(3)(a)). For example, the processing may lead to the exclusion of or discrimination against individuals. Processing with little or no effect on individuals does not match this specific criterion; Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through “a systematic monitoring of a publicly accessible area” (Article 35(3)(c)). This type of monitoring is a criterion because the personal data may be collected in circumstances where data subjects may not be aware of who is collecting their data and how they will be used. Additionally, it may be impossible for individuals to avoid being subject to such processing in frequented public (or publicly accessible) spaces; Sensitive data: this includes special categories of data as defined in Article 9 (for example, information about individuals’ political opinions), as well as personal data relating to criminal convictions or offences. An example would be a general hospital keeping patients’ medical records or a private investigator keeping offenders’ details.
This criterion also includes data which may more generally be considered as increasing the possible risk to the rights and freedoms of individuals, such as electronic communication data, location data, and financial data (that might be used for payment fraud). In this regard, whether the data have already been made publicly available by the data subject or by third parties may be relevant. Where personal data are publicly available, this aspect may be considered as a factor in the assessment if the data were expected to be further used for certain purposes. This criterion may also include information processed by a natural person in the course of purely personal or household activity (such as cloud computing services for personal document management, email services, diaries, e-readers equipped with note-taking features, and various life-logging applications that may contain very personal information), whose disclosure or processing for any purpose other than household activities can be perceived as very intrusive; Data processed on a large scale: the GDPR does not define what constitutes large-scale, though Recital 91 provides some guidance. In any event, the WP29 recommends that the following factors, in particular, be considered when determining whether the processing is carried out on a large scale: (a) the number of data subjects concerned, either as a specific number or as a proportion of the relevant population; (b) the volume of data and/or the range of different data items being processed; (c) the duration, or permanence, of the data processing activity; (d) the geographical extent of the processing activity.
(6) Datasets that have been matched or combined, for example originating from two or more data processing operations performed for different purposes and/or by different data controllers, in a way that would exceed the reasonable expectations of the data subject;

(7) Data concerning vulnerable data subjects (Recital 75): the processing of this type of data can require a DPIA because of the increased power imbalance between the data subject and the data controller, meaning the individual may be unable to consent to, or oppose, the processing of his or her data. For example, employees would often face serious difficulties in opposing processing performed by their employer when it is linked to human resources management. Similarly, children can be considered as not able to knowingly and thoughtfully oppose or consent to the processing of their data. This also concerns more vulnerable segments of the population requiring special protection, such as the mentally ill, asylum seekers, the elderly or patients, or any case where an imbalance between the position of the data subject and that of the controller can be identified;

(8) Innovative use, or the application of new technological or organisational solutions, such as combining the use of fingerprint and face recognition for improved physical access control. The GDPR makes it clear (Article 35(1) and Recitals 89 and 91) that the use of a new technology can trigger the need to carry out a DPIA. This is because the use of such technology can involve novel forms of data collection and usage, possibly with a high risk to individuals’ rights and freedoms. Indeed, the personal and social consequences of the deployment of a new technology may be unknown. A DPIA will help the data controller to understand and to treat such risks. For example, certain “Internet of Things” applications could have a significant impact on individuals’ daily lives and privacy, and therefore require a DPIA;

(9) Data transfer across borders outside the European Union (Recital 116), taking into consideration, amongst others, the envisaged country or countries of destination, the possibility of further transfers, and the likelihood of transfers based on derogations for specific situations set forth by the GDPR;

(10) When the processing in itself “prevents data subjects from exercising a right or using a service or a contract” (Article 22 and Recital 91). This includes processing performed in a public area that people passing by cannot avoid, or processing that aims at allowing, modifying or refusing data subjects’ access to a service or entry into a contract. An example of this is where a bank screens its customers against a credit reference database in order to decide whether to offer them a loan.

The WP29 considers that the more criteria are met by the processing, the more likely it is to present a high risk to the rights and freedoms of data subjects, and therefore to require a DPIA. As a rule of thumb, a processing operation meeting fewer than two criteria may not require a DPIA due to the lower level of risk, while a processing operation meeting at least two of these criteria will require one. In some cases, however, processing meeting only one criterion will require a DPIA. Conversely, if the controller believes that the processing is not “likely high risk” despite meeting at least two criteria, the reasons for not carrying out a DPIA must be thoroughly documented. In addition, a data controller subject to the obligation to carry out a DPIA “shall maintain a record of processing activities under its responsibility” (Art.
30(1)), including, inter alia, the purposes of processing, a description of the categories of data and of the recipients of the data and, “where possible, a general description of the technical and organisational security measures referred to in Article 32(1)”, and must assess whether a high risk is likely, even if it ultimately decides not to carry out a DPIA.

Note: supervisory authorities are required to establish, make public and communicate to the European Data Protection Board (EDPB) a list of the processing operations that require a DPIA (Article 35(4)). The criteria set out above can help supervisory authorities to constitute such a list, potentially with more specific content added in time if appropriate. For example, the processing of any type of biometric data, or of data relating to children, could also be considered relevant for the development of a list pursuant to Article 35(4).

The DPC has issued guidelines on processing operations that require a DPIA. Where a documented screening or preliminary risk assessment indicates the processing operation is likely to result in a high risk to the rights and freedoms of individuals pursuant to Art. 35(1), the DPC has determined a DPIA will also be mandatory for the following types of processing operations:

1) Use of personal data on a large scale for a purpose(s) other than that for which it was initially collected pursuant to GDPR Article 6(4);
2) Profiling vulnerable persons, including children, to target marketing or online services at such persons;
3) Use of profiling or algorithmic means or special category data as an element to determine access to services or that results in legal or similarly significant effects;
4) Systematically monitoring, tracking or observing individuals’ location or behaviour;
5) Profiling individuals on a large scale;
6) Processing biometric data to uniquely identify an individual or individuals, or enable or allow the identification or authentication of an individual or individuals, in combination with any of the other criteria set out in the WP29 DPIA Guidelines;
7) Processing genetic data in combination with any of the other criteria set out in the WP29 DPIA Guidelines;
8) Indirectly sourcing personal data where GDPR transparency requirements are not being met, including when relying on exemptions based on impossibility or disproportionate effort;
9) Combining, linking or cross-referencing separate datasets where such linking significantly contributes to, or is used for, profiling or behavioural analysis of individuals, particularly where the datasets are combined from different sources where processing was/is carried out for different purposes or by different controllers;
10) Large-scale processing of personal data where the Data Protection Act 2018 requires “suitable and specific measures” to be taken in order to safeguard the fundamental rights and freedoms of individuals.

This list does not remove the general requirement to carry out proper and effective risk assessment and risk management of proposed data processing operations, nor does it exempt the controller from the obligation to ensure compliance with any other obligation of the GDPR or other applicable legislation. Furthermore, it is good practice to carry out a DPIA for any major new project involving the use of personal data, even if there is no specific indication of likely high risk. (From the DPC Guidelines, available here.)

Ultimate responsibility rests with the controller, as it is the controller who must decide whether or not a ‘high risk’ exists. Such a decision must take a host of factors into account. When two or more of these factors combine in a processing operation, the risk is sure to increase. For example, a processing operation could involve new technology, the processing of sensitive data and profiling/evaluation. The factors are not prescriptive, but the office of the DPC has identified some that warrant special attention.
These factors include:

- Use of new or novel technologies;
- Data processing on a large scale;
- Profiling/evaluation: evaluating, scoring or predicting individuals’ behaviours, activities or attributes, including location, health, movement, interests and preferences;
- Any systematic monitoring, observation or control of individuals, including that taking place in a public area or where the individual may not be aware of the processing or of the identity of the data controller;
- Processing of sensitive data, including that defined in GDPR Article 9, but also other personally intimate data such as location and financial data, or the processing of electronic communications data;
- Processing of combined datasets that goes beyond the expectations of an individual, such as data combined from two or more sources where processing was carried out for different purposes or by different data controllers;
- Processing of personal data related to vulnerable individuals or audiences that may have particular or special considerations related to their inherent nature, context or environment. This will likely include minors, employees, the mentally ill, asylum seekers, the aged and those suffering incapacitation;
- Automated decision-making with legal or similarly significant effects (see below), including automated decision-making where there is no effective human involvement in the process; and
- Insufficient protection against unauthorised reversal of pseudonymisation.

Under Art. 35(5) it is open to any supervisory authority, in our case the DPC, to set out a list of the kinds of processing operations for which no data protection impact assessment is required. A definitive list pursuant to Art. 35(5) has not been issued by the DPC. A general rule is that any processing that is not ‘likely to result in a high risk to the rights and freedoms of natural persons’ will be exempt from a DPIA. However, deciding what is ‘likely to result in a high risk…’ demands the carrying out of a ‘mini DPIA’.
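That preliminary screening, or ‘mini DPIA’, amounts in practice to counting how many of the ten WP29 criteria a proposed processing operation meets and applying the rule of thumb (two or more met: a DPIA will be required). The sketch below is purely illustrative: the shorthand criterion labels and the `dpia_screening` function are our own paraphrase of the WP29 list, not an official tool, and its output is only a starting point for a documented assessment, never a substitute for one.

```python
# Hypothetical screening helper based on the WP29 rule of thumb.
# The labels below are shorthand paraphrases of the ten criteria
# discussed above; they are not official terminology.
WP29_CRITERIA = {
    "evaluation_or_scoring",            # (1) profiling, predicting
    "automated_decision_legal_effect",  # (2) Art. 35(3)(a)
    "systematic_monitoring",            # (3) Art. 35(3)(c)
    "sensitive_data",                   # (4) Art. 9, criminal data, etc.
    "large_scale",                      # (5) Recital 91 factors
    "matched_or_combined_datasets",     # (6)
    "vulnerable_subjects",              # (7) Recital 75
    "innovative_technology",            # (8)
    "cross_border_transfer",            # (9) Recital 116
    "blocks_right_or_service",          # (10) Art. 22, Recital 91
}

def dpia_screening(criteria_met):
    """Apply the WP29 rule of thumb to a set of criteria labels."""
    unknown = set(criteria_met) - WP29_CRITERIA
    if unknown:
        raise ValueError(f"Unrecognised criteria: {sorted(unknown)}")
    if len(criteria_met) >= 2:
        # Two or more criteria met: a DPIA will be required.
        return "DPIA required"
    if len(criteria_met) == 1:
        # One criterion may still suffice in some cases.
        return "DPIA may still be required; document the assessment"
    # Not 'likely high risk', but the reasoning should still be recorded.
    return "DPIA unlikely to be required; document the reasons"

# Example: a bank screening customers against a credit reference
# database meets both (1) evaluation/scoring and (10) access to a
# service, so the rule of thumb indicates a DPIA.
print(dpia_screening({"evaluation_or_scoring", "blocks_right_or_service"}))
# prints: DPIA required
```

Note that the count only works in one direction: a low count never conclusively rules a DPIA out, which is why the single-criterion and zero-criterion branches above still direct the controller to document the assessment.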
Despite the absence of a comprehensive definitive list, the office of the DPC, in a publication on DPIAs, lays out some examples of processing operations not requiring a DPIA:

- A previous DPIA was carried out and found no risk;
- The processing has been authorised by the DPC;
- The processing is being done pursuant to point (c) or (e) of Art. 6(1) of the GDPR. Point (c) refers to processing necessary for compliance with a legal obligation. Point (e) refers to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. In both cases there must be a clear legal basis under EU or Member State law AND a DPIA must have already been conducted under Art. 35(10).

On balance, it is advisable to have a Data Protection Impact Assessment carried out. In many cases, the minimum content of the assessment as set out under Art. 35(7)(a) to (d) GDPR will be sufficient to ensure compliance and bring peace of mind to an organisation conducting the processing operations. Here at GDPRXpert we are GDPR and data protection consultants with vast expertise in conducting DPIAs. We are located in Carlow/Kilkenny and Mayo, offering a nationwide service. Call 0858754526 or 0599134259 to discuss your particular needs. Patrick Rowland,   [...]