DPC Decision in the TikTok Case

The DPC decision in the TikTok case has been welcomed by data protection practitioners. It is one we have been meaning to discuss but did not get around to until now. The DPC Annual Report for 2023 has also just been released, and we will examine it in detail in our upcoming July/August 2024 blog.
As the report states, “2023 was a busy year in personal data rights protection. The year saw a significant increase in complaints dealt with by the Data Protection Commission (“DPC”) with record fines issued and corrective orders imposed following cross-border and national inquiries. More generally, there were a large number of data protection-related judgments from the Court of Justice of the European Union and continued domestic focus before the Irish courts”. Perhaps not coincidentally, 2023 was also a busy year for GDPRXPERT and those operating a data protection advisory service. In particular, there was a high demand for the outsourced data protection officer service provided by GDPRXPERT.ie.

Naturally, many of the high-profile cases taken and concluded by the DPC in 2023 feature prominently in the report. One of these is the TikTok case, concluded in September 2023 with a fine of €345 million for TikTok. TikTok is a video-focused social media platform that allows registered users to create and share videos of varying duration and to communicate with others through messages. TTL states that TikTok is not a social network but rather a global entertainment platform that, at its core, was designed to enable users to create and share video content, enjoy videos from a variety of creators and otherwise express their creativity, such as by interacting with videos to express new perspectives and ideas.

The TikTok platform is accessible via a standalone mobile phone application and can also be viewed as a web page from a web browser.
This case is worthy of some more analysis and comment because it is a transparent example of the whole process involved, and it is a long and resource-intensive process. The decision has been welcomed by GDPRXpert.ie and fellow GDPR and data protection law experts across the EU. In previous blogs here GDPRXpert.ie have emphasised the length of time it can take to bring a case to a conclusion. The length of time it can take to finalise investigations has been the subject of much criticism by some groups who are themselves familiar with the process and its complexities. They should know better, and most likely do, but they persist in rushing to premature judgment and have done so very notably in one specific case.


These same groups have their own underlying agenda, and their criticism of the DPC may not diminish despite clear, robust decisions such as in the TikTok case. One particular group we have criticised before seems to have happily deluded itself into believing it is best placed to monitor and defend the data protection rights of all data subjects. It does this at the cost of neglecting areas within its own more direct remit. Its misguided forays into areas already well served by the DPC have seen it succumb to the desire to take frivolous, and arguably vexatious, actions paid for by donors and the Irish taxpayer. In its most recent opportunistic legal venture challenging the DPC in the context of Google’s online bidding, the judge sent a clear message by ordering the group to pay the costs of the DPC. It was seen in this case to be totally out of its intellectual comfort zone, showing signs of a dearth of understanding of data protection law.

The TikTok case stands as a testament to the quality of the investigations carried out by the DPC. One must emphasise that the TikTok investigation was an own-volition inquiry but nevertheless involved the coordinated deployment of huge assets and resources. It may be that the nature of an own-volition inquiry makes additional resources necessary because of the extra scrutiny such inquiries attract by virtue of the GDPR. These types of cases take time; simple as that. This case does demonstrate the multitude of resources that must be strategically expended in order to bring such complex cases to the desired legal conclusion (the case was the subject of Binding Decision 2/2023 on the dispute submitted by the Irish SA regarding TikTok Technology Limited (Art. 65 GDPR), adopted on 2 August 2023). These cases are complicated even for data protection experts such as GDPRXPERT.ie.

Preliminary Issues
Before the case proper, a few mainly procedural issues had to be clarified. The first was whether the DPC was competent to act as the Lead Supervisory Authority at all. Under Article 4(23) GDPR, cross-border processing is defined as meaning either:
(a) processing of personal data which takes place in the context of the activities of establishments in more than one Member State of a controller or processor in the Union where the controller or processor is established in more than one Member State;
or
(b) processing of personal data which takes place in the context of the activities of a single establishment of a controller or processor in the Union but which substantially affects or is likely to affect data subjects in more than one Member State.
During the period 29 July 2020 to 31 December 2020, TikTok Technology Ltd (TTL) processed personal data in the context of the activities of a single establishment of a controller or processor in the Union which substantially affected or was likely to substantially affect data subjects in more than one Member State. TTL’s single establishment in Ireland is supported by affiliated entities in Germany, France, Poland, Italy, Spain and Sweden.
There was no doubt that the processing was cross border.


Turning to the question of whether the DPC was competent to act as lead supervisory authority in respect of the processing under examination, the DPC noted that Article 56(1) GDPR provides that a supervisory authority of the main establishment of a controller or processor shall be competent to act as lead supervisory authority pursuant to Article 60 GDPR.
Having considered all of the above and the nature of the processing at issue, the DPC was satisfied that TTL is a data controller within the meaning of Article 4(7) GDPR regarding the processing which is the subject of the inquiry. The DPC was further satisfied that TTL has its main establishment in Ireland for the purposes of the GDPR. As such, the DPC was satisfied that the requirements of Article 56 GDPR had been met in relation to the processing at issue, such that the DPC is competent to act as the lead supervisory authority in respect of the cross-border processing under examination.
So, the DPC is competent to act as Lead Supervisory Authority (LSA).


The next hurdle to be crossed concerned the argument by TTL that the standards of compliance to which it was being held post-dated the relevant period of the inquiry. The argument was that the “Fundamentals for a Child-Oriented Approach to Data Processing” (published December 2021) were not in effect at the time of the processing giving rise to the inquiry and therefore constituted “an impermissible retrospective application of regulatory standards and a clear breach of fair procedures.”
The DPC dismissed this argument by relying on the plain fact that the GDPR was in force at the time: the Fundamentals represented ancillary guidance to the GDPR, with which TTL had been obliged to comply since May 2018. The Fundamentals referenced GDPR principles that were in effect in 2018 and, although not in effect contemporaneously, did not constitute any form of retrospective law-making. They are post-GDPR guidance principles only, and the date of their release is immaterial. TTL’s compliance was to be assessed in the light of the GDPR and any guidance notes and material available during the relevant period.
Time for Substantive Issues.
So then, what was the DPC actually investigating in the case?

Material Scope.


This inquiry concerned the processing by TTL of personal data of registered child users of the TikTok platform and whether or not TTL had complied with its obligations under the GDPR as data controller. The 2018 Act provides that the term ‘child’ in the GDPR is to be taken as a reference to a person under the age of 18 years. TTL provides the TikTok platform to persons over the age of 13. As a result, the term ‘Child Users’ in this decision should be taken as a reference to registered TikTok users who are aged between 13 and 17 years old. As set out below, this inquiry also examined certain issues regarding TTL’s processing of personal data relating to children under the age of 13.
In particular, this inquiry concerned two distinct sets of processing operations by TTL in the context of the TikTok platform, both of which constituted the processing of personal data as defined by Article 4(2) GDPR. The inquiry also examined the extent to which TTL complied with its transparency obligations under the GDPR. As highly experienced data protection law consultants, GDPRXpert.ie can testify to the challenges organisations face in meeting transparency standards set by GDPR. The standards are high and will not be met without a strategy being devised and followed. Our GDPR audits often discover that organisations have no strategy at all.
Broadly, the first type of processing to be examined related to the processing of Child Users’ personal data in the context of the platform settings of the TikTok platform, both mobile application and website based, in particular the public-by-default processing of such platform settings in relation to Child Users’ accounts, videos, comments, ‘Duet’ and ‘Stitch’, downloading and ‘Family Pairing’.
The second type of processing to be examined related to the processing by TTL of the personal data of children under the age of 13 in the context of the TikTok platform, both mobile application and website based, in particular for the purposes of age verification.
Finally, with regard to the processing of personal data of persons under the age of 18 in the context of the TikTok platform (including any such processing in connection with websites or applications which provide access to the TikTok platform), this inquiry also examined whether TTL had complied with its obligations to provide information to data subjects in the form and manner required by Articles 12(1), 13(1)(e), 13(2)(a), 13(2)(b) and 13(2)(f) GDPR.

Assessment of TTL’s Compliance with the GDPR and Corrective Powers.

The Statement of Issues identified the matters for determination as part of the inquiry. Those issues concerned TTL’s compliance with the GDPR (and consideration of corrective powers), as follows:
Firstly, in relation to platform settings:
• A. Whether, having regard to the default public settings applied to Child Users’ accounts, TTL implemented appropriate technical and organisational measures pursuant to Article 24 GDPR to ensure, and to be able to demonstrate, that its processing of Child Users’ personal data was performed in accordance with the GDPR;
• B. Whether, having regard to the default public settings applied to Child Users’ accounts, TTL complied with its obligations under Articles 5(1)(c) and 25(1) GDPR to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what is necessary in relation to the purposes for which they were processed; and to implement appropriate technical and organisational measures designed to implement the data minimisation principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the Regulation and protect the rights of data subjects;
• C. Whether, having regard to the public default settings applied to Child Users’ accounts, TTL complied with its obligations under Article 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing were processed; and
• D. Whether, in circumstances where TTL’s platform settings allowed an unverified non-Child User to access and control a Child User’s platform settings, TTL complied with its obligations under Articles 5(1)(f) and 25(1) GDPR to ensure that Child Users’ personal data was processed in a manner that ensured appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical and organisational measures; and to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

Secondly, in relation to age verification:
• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Art. 24 GDPR to implement appropriate technical and organisational measures to ensure, and be able to demonstrate, that its processing of data of Child Users was performed in accordance with the GDPR, including by implementing measures to ensure against children under 13 years of age accessing the platform;

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and above, TTL complied with its obligations under Arts. 5(1)(b), 5(1)(c) and 25(1) GDPR to ensure that it collected Child Users’ personal data for specified, explicit and legitimate purposes and that it did not further process that data in a manner incompatible with those purposes; to ensure that its processing of Child Users’ personal data was adequate, relevant and limited to what was necessary in relation to the purposes for which they were processed; and to implement appropriate technical and organisational measures designed to implement the purpose limitation and data minimisation principles in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, including by implementing measures to ensure against children aged under 13 having access to the platform;

• Whether, having regard to TTL’s requirement that users of TikTok should be aged 13 and over, TTL complied with its obligations under Art. 25(2) GDPR to implement appropriate technical and organisational measures for ensuring that, by default, only personal data which were necessary for each specific purpose of the processing were processed, including by implementing measures to ensure against children aged under 13 accessing the platform.
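The decision does not set out TTL’s mechanism in code, but an age gate of the kind commonly used at sign-up — a self-declared date of birth — can be sketched as follows. All names here are our own and purely illustrative; this is not TTL’s actual implementation:

```python
from datetime import date

MIN_AGE = 13  # TikTok's stated minimum age for registration

def is_old_enough(dob: date, today: date, min_age: int = MIN_AGE) -> bool:
    """Return True if the user is at least `min_age` years old on `today`."""
    # Age in whole years, subtracting one if the birthday hasn't occurred yet this year.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= min_age

# A self-declared date of birth is trivially falsifiable, which is why the DPC
# examined whether such a measure alone was "appropriate" under Arts. 24 and 25.
print(is_old_enough(date(2010, 6, 1), date(2020, 12, 31)))  # a 10-year-old → False
```

The legal question the findings turn on is precisely that a check like this relies on the user’s own declaration, so its “appropriateness” must be weighed against the risk to under-13s who misstate their age.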
Thirdly, in relation to transparency:
• Whether Child Users were appropriately made aware, as users of TikTok, of the various public and private account settings in accordance with Arts. 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, to be read in conjunction with Recitals 38, 39, 58, 60 and 61, and whether Child Users were able to determine the scope and consequences of registering as a user, and specifically that their profile would be defaulted to public.
These were the complicated issues that the DPC had to assess in light of the legal regime. The first substantive issue arose in regard to platform default settings, in the specific context of Child Users.
Issue 1: Assessment and consideration of matters concerning TTL’s compliance with Articles 5, 24 and 25 GDPR concerning its platform settings for users under the age of 18.
The root question was whether TTL had complied with its obligations under Articles 5(1)(c), 5(1)(f), 24 and 25 GDPR. Article 5(1)(c) GDPR provides that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” Per Recital 39, this requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum. Personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means. In order to ensure that the personal data are not kept longer than necessary, time limits should be established by the controller for erasure or for a periodic review.
Article 5(1)(f) provides that personal data shall be “processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures.” Per Recital 39, personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to, or use of, personal data and the equipment used for the processing.
Further, Article 24(1) provides: Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.
The GDPR and data protection advisory service provided by GDPRXpert.ie always emphasises the importance attached to the technical and organisational measures that are implemented. It is not enough to implement measures; they must be demonstrably ‘effective’. To quote Recital 74 GDPR, “The responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established. In particular, the controller should be obliged to implement appropriate and effective measures and be able to demonstrate the compliance of processing activities with this Regulation, including the effectiveness of the measures. Those measures should take into account the nature, scope, context and purposes of the processing and the risk to the rights and freedoms of natural persons.” GDPR audits conducted by GDPRXpert.ie will never overlook this potential pitfall. Unfortunately, it is missed by many.

In regard to Art. 25, the European Data Protection Board (EDPB) has published Guidelines on Data Protection by Design and by Default. These summarise Art. 25 GDPR as follows: “The core of the provision is to ensure appropriate and effective data protection both by design and by default, which means that controllers should be able to demonstrate that they have the appropriate measures and safeguards in the processing to ensure that the data protection principles and the rights and freedoms of data subjects are effective.” (European Data Protection Board, ‘Guidelines 4/2019 on Article 25 Data Protection by Design and by Default’ (20 October 2020) at [2]).
As with the technical and organisational measures referred to above, it is the effectiveness of the implemented measures that is crucial. Each implemented measure should produce the intended results for the processing foreseen by the controller and this has two consequences as laid out in the EDPB Guidelines:
“First, it means that Article 25 does not require the implementation of any specific technical and organisational measures, rather that the chosen measures and safeguards should be specific to the implementation of data protection principles into the particular processing in question. In doing so, the measures and safeguards should be designed to be robust and the controller should be able to implement further measures in order to scale to any increase in risk. Whether or not measures are effective will therefore depend on the context of the processing in question and an assessment of certain elements that should be taken into account when determining the means of processing. …Second, controllers should be able to demonstrate that the principles have been maintained.”

On a daily basis our data protection law consultants here at GDPRXpert.ie are reminded of the necessary congruence between the technical and organisational measures incorporated into processing operations and the data protection principles, especially data minimisation.
Article 25(2) GDPR requires data controllers to implement measures to ensure that, by default, the principle of data minimisation is respected, as follows: “The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.” The obligation to implement the measures described in Art. 25(2) GDPR is referred to as Data Protection by Default.
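To make the idea concrete, a hypothetical controller could express Data Protection by Default as the most restrictive starting value for every visibility setting, widened only by the data subject’s own intervention. The sketch below is illustrative only; the class and field names are invented and do not describe TTL’s code:

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-account settings with privacy-protective defaults.

    Art. 25(2) requires that, by default, personal data are not made accessible
    to an indefinite number of people without the user's intervention, so every
    visibility flag starts at its most restrictive value.
    """
    public_profile: bool = False      # profile not visible to everyone by default
    videos_public: bool = False       # videos limited to approved followers by default
    comments_enabled: bool = False    # strangers cannot comment by default
    duet_stitch_enabled: bool = False # remixing of the user's content off by default

    def make_public(self) -> None:
        # Widening visibility requires the data subject's own intervention.
        self.public_profile = True

settings = AccountSettings()    # a new child account starts fully private
print(settings.public_profile)  # False
```

The DPC’s findings, set out below, rest on TTL having done the opposite: Child Users’ accounts started public, and the intervention required was to make them private.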
As we often remind clients, no specific measures are prescribed by the GDPR. If you have a toothache you want to take something to kill that pain, but experience will tell you the same medication may not be effective against pain in another location. It is, therefore, controllers who best know which measures have historically worked for their processing; they should assess the risk and implement what will work against an increased risk in the future.


TTL responded to the Article 5(1)(c) issue and submitted that the data minimisation principle did not mandate processing the absolute minimum but rather what was adequate, relevant and limited to what was necessary in relation to the purposes of the processing, and which respected the principle of proportionality.
Their submission on Art. 24 was detailed and focused mainly on a few core points. The first was that Arts. 24(1) and 24(2) did not impose prescriptive obligations, as that would be inconsistent with the objective of the provisions themselves, which is to embed privacy compliance practices into the internal practices of an organisation in a manner appropriate to the processing activities undertaken by that organisation. TTL cited the Article 29 Working Party as support for the argument that a one-size-fits-all approach would force controllers into structures that are ‘unfitting and ultimately fail’. TikTok took the view that the only option was ‘custom-built solutions’. By reasonable interpretation, the accountability obligations under Art. 24(1) could only be open-ended and non-prescriptive.


TTL also argued that the GDPR does not mandate the exact means of achieving or demonstrating compliance with its requirements. Taking such a prescriptive approach would be inconsistent with the core objective of Art.24 which, as we have just stated, is to embed privacy compliance into the internal practice of organisations in a manner that works for each organisation while remaining aligned with GDPR principles. Data protection consultants GDPRXPERT.ie would agree with TTL that with Art.24 there is a much more holistic approach to data protection compliance and indeed, there has to be. If not, there would have to be a formulaic, universal manner by which an organisation had to display compliance.
We know that under Art. 24 GDPR controllers are required to implement technical and organisational measures to ensure that processing is carried out in accordance with the GDPR and to be able to demonstrate such compliance. In order to meet that requirement, controllers must make an assessment of (1) the nature, scope, context and purposes of the processing and (2) the risks of varying likelihood and severity for the rights and freedoms of natural persons.
TTL contended therefore that there was sufficient leeway within the confines of the Article to allow for some discretion as to what was an appropriate measure. They further argued that it is clear, therefore, that the appropriateness of the measures adopted must be informed by an assessment of the context and purposes of processing, as well as the risks which may result from the processing (if any). As explained above, TikTok’s mission is to inspire creativity and bring joy. The core purpose of the Platform during the Relevant Period was to enable Users to disseminate their own content and to show Users content they are likely to find of interest.
TikTok then moved to their position in relation to Art. 25 and stated: “Article 25(1) GDPR does not solely focus on user controlled settings as a technical measure but also addresses technical measures more broadly (including ones that are not user controlled) and organisational measures. As such, TikTok as a data controller is afforded autonomy and appropriate latitude in determining the specific designs of its product.” The measures to be adopted under Article 25(1) GDPR should be commensurate with the risks posed by the processing, and those risks should be weighed by their likelihood and severity.


TTL then looked to opinions from the EDPB to bolster their argument. “The European Data Protection Board (“EDPB”) Article 25 Data Protection by Design and Default Guidelines (“Article 25 Guidelines”) recognise that Article 25(1) GDPR does not envisage a one-size fits all approach to data protection. The EDPB Article 25 Guidelines further state “[w]hen performing the risk analysis for compliance with Articles 25, the controller has to identify the risks to the rights of data subjects that a violation of the principles presents and determine their likelihood and severity in order to implement measures to effectively mitigate the identified risks.”
Article 25(2) states, among other things that “the controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed”. Article 25(2) requires that, by default, only the personal data that is necessary for each specific purpose is processed. It is the responsibility of the controller to define the purpose of the processing, and by doing so, the controller also determines the scope of the processing required for that particular purpose. Article 25(2), therefore, requires implementing default settings to processing that is necessary to achieve the controller’s purpose. Article 25(2) is not prescriptive as to the type of technical and organisational measures that must be implemented to ensure data protection by default. The EDPB has recognised that a range of different measures, including enabling data subjects to intervene in the processing, could be involved “depending on the context and risks associated with the processing in question”.

The context of the processing is central to the consideration as to what measures are appropriate in the given circumstances and to what extent they will implement data protection principles effectively. In particular, Article 25(2) does not require controllers to choose default settings which would subvert the core functionalities of their service. In order to comply with Article 25(1) GDPR, controllers are asked to weigh a multitude of broad and abstract concepts, assess the risks, and then determine “appropriate” measures. Each of these elements is opaque and open to interpretation, and as a result, no two assessments performed in accordance with Article 25 will look the same. Article 25(1) requires “appropriate” measures, which when applied to age verification would mean that a controller is required to implement measures to determine the age of users with an appropriate, rather than absolute, level of certainty.


Article 25(1) GDPR does not prescribe the appropriate technical and organisational measures designed to implement the data protection principles (including the data minimisation principle) that organisations are required to put in place. Controllers are similarly afforded autonomy and appropriate latitude under Article 25(2) GDPR in determining the appropriate measures for ensuring privacy by default. The European Data Protection Board (“EDPB”) in its Article 25 Data Protection by Design and by Default Guidelines (“Article 25 Guidelines”) explains that being appropriate means that “the measures and necessary safeguards should be suited to achieve the intended purpose, i.e. they must implement the data protection principles effectively” and that “the controller must verify the appropriateness of the measures for the particular processing in question”.
Further, in considering whether the measures put in place by TikTok complied with Article 25(1) GDPR, account must be taken, in particular, of the “context and purposes of processing”. In this regard, full consideration must be given to the benefits of the relevant features to Users and their importance to the core purpose of TikTok during the Relevant Period as described above, which would have informed younger Users’ expectations, and the measures and privacy settings designed to safeguard younger Users.
All of these submissions had then to be taken into account by the DPC.
The first finding by the DPC:
At the time of the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. In this regard, the DPC was of the view that TTL failed to implement appropriate technical and organisational measures to ensure that, by default, only personal data which were necessary for TTL’s purpose of processing were processed. In particular, this processing was performed to a global extent and in circumstances where TTL did not implement measures to ensure that, by default, the social media content of Child Users was not made accessible (without the user’s intervention) to an indefinite number of natural persons. It was held by the DPC that the above processing by TTL was contrary to the principle of data protection by design and default under Article 25(1) and 25(2) GDPR, and contrary to the data minimisation principle under Article 5(1)(c) GDPR.

The second finding by the DPC:
During the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. The above processing posed severe possible risks to the rights and freedoms of Child Users. In circumstances where TTL did not properly take into account the risks posed by the above processing, the DPC took the position that TTL did not implement appropriate technical and organisational measures to ensure that the above processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR.
The third finding by the DPC:
During the Relevant Period, TTL implemented a platform setting called ‘Family Pairing’ whereby a non-Child User could pair their account to that of a Child User. This platform setting allowed the non-Child User to enable direct messages for Child Users above the age of 16. The above processing posed severe possible risks to the rights and freedoms of Child Users. In circumstances where this processing did not ensure appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures, and where TTL failed to implement appropriate technical and organisational measures designed to implement the integrity and confidentiality principle in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects, the DPC took the view that this processing was not performed in accordance with the GDPR, contrary to Article 5(1)(f) and Article 25(1) GDPR (at p. 42).
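The flaw the DPC identified was that the paired adult was unverified. The missing safeguard can be illustrated with a hypothetical guard function: before a paired adult may relax a child’s messaging settings, their guardian status would have to be verified. All names below are invented for illustration and do not describe TTL’s actual systems:

```python
class PairingError(Exception):
    """Raised when a Family-Pairing-style change is not permitted."""

def enable_direct_messages(child_age: int, adult_verified: bool) -> bool:
    """Hypothetical guard: only a verified guardian may relax a child's settings,
    and only where the platform's own age rules permit direct messages at all."""
    if not adult_verified:
        # The gap the DPC found: any unverified non-Child User could pair with
        # a child account and enable direct messages, contrary to the integrity
        # and confidentiality principle (Arts. 5(1)(f) and 25(1) GDPR).
        raise PairingError("pairing adult is not a verified guardian")
    if child_age < 16:
        raise PairingError("direct messages unavailable for users under 16")
    return True
```

On this sketch, the unverified-adult path that existed during the Relevant Period would simply be refused rather than acted on.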
The DPC also assessed and considered matters concerning age verification pursuant to Articles 24 and 25 GDPR.
The fourth finding by the DPC:
During the Relevant Period, TTL implemented a default account setting for Child Users which allowed anyone (on or off TikTok) to view social media content posted by Child Users. This processing posed severe possible risks to the rights and freedoms of Child Users, and likewise to the rights and freedoms of children under the age of 13 who gained access to the platform. In circumstances where TTL did not properly take into account the risks posed by this processing to children under the age of 13, the DPC was of the view that TTL did not implement appropriate technical and organisational measures to ensure, and to be able to demonstrate, that the processing was performed in accordance with the GDPR, contrary to Article 24(1) GDPR.
The DPC also examined transparency requirements under Articles 5, 12 and 13 GDPR. We will not go through all of TTL’s submissions here; instead, we will conclude with the transparency obligations and the remaining DPC findings.
The first transparency obligation for consideration was whether Child Users, as users of the TikTok platform, were appropriately made aware by TTL (in a concise, transparent, intelligible and easily accessible form, using clear and plain language) of the various public and private account settings, in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, whether public or private.
The second transparency obligation for consideration was whether Child Users, as users of the TikTok platform, were appropriately made aware by TTL of the public default setting, in accordance with Articles 5(1)(a), 12(1), 13(1)(e), 13(2)(a) and 13(2)(f) GDPR, read in conjunction with Recitals 38, 39, 58, 60 and 61 GDPR, and whether Child Users were able to determine the scope and the consequences of registering as a user, and specifically that their profile would be defaulted to public.
Finding 5:
In circumstances where TTL did not provide Child Users with information on the recipients or categories of recipients of personal data, the DPC found that TTL had not complied with its obligations under Article 13(1)(e) GDPR.
In circumstances where TTL did not provide Child Users with information on the scope and consequences of the public-by-default processing (that is, operating a social media network which, by default, allows the social media posts of Child Users to be seen by anyone) in a concise, transparent and intelligible manner and in a form that is easily accessible, using clear and plain language, in particular insofar as the very limited information provided did not make it clear at all that this would occur, the DPC found that TTL had not complied with its obligations under Article 12(1) GDPR.
Finding 6:
For the reasons established by the EDPB in the Article 65 Decision, TTL infringed the principle of fairness pursuant to Article 5(1)(a) GDPR.
TTL was ordered to bring its processing into compliance, received a reprimand and was fined a total of €345 million.
We have offered a brief synopsis of the main issues in this case, but it is impossible to fully do justice to the effort made by the DPC to uphold and vindicate the rights of data subjects, often in the face of criticism levelled at the office from the usual suspects. It should, however, give readers a glimpse into the complexity of some of the cases that land on the desk of the DPC.

GDPRXPERT.ie offers a comprehensive data protection consultancy service with particular emphasis on the onerous responsibilities placed on organisations under the GDPR.

Patrick Rowland for GDPRXPERT.ie

www.gdprxpert.ie