Data Protection Impact Assessments: yes or no?

A Data Protection Impact Assessment (DPIA) is one of the most important responsibilities that, in certain circumstances, is prescribed under the GDPR. Non-compliance with DPIA requirements can lead to the imposition of fines by the Data Protection Commission (DPC). Any reputable data protection consultancy should have qualified, certified and experienced data protection professionals available to carry out DPIAs on your behalf. We routinely undertake DPIAs as part of our services, and this service is available nationwide. Data protection consultants have found that even where a DPIA is not mandatory, it is always an advisable course of action.

What is a data protection impact assessment?

A Data Protection Impact Assessment is a process specifically designed to identify, quantify and mitigate the risks involved in a processing operation. It does this primarily by assessing the necessity and proportionality of the processing, with a strong emphasis on managing the risks to the rights and freedoms of natural persons resulting from the processing of personal data. An essential ingredient in any DPIA, therefore, is a measured assessment of the risks to those rights and freedoms, and a determination of the appropriate measures to address them.

At the heart of the DPIA is its role as a conduit of accountability, enabling controllers to comply with their obligations under the GDPR. By using this accountability tool a controller can demonstrate that all appropriate measures have been taken to ensure compliance with the Regulation. In essence, the DPIA is the building block used to construct and demonstrate compliance, and data protection consultants can provide the foundation on which to build a compliant business structure.

DPIA Content

The GDPR does not formally define the concept of a DPIA as such, but its minimum content is specified by Article 35(7), on which the Article 29 Working Party elaborates, as follows:

“(a) a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;

(c) an assessment of the risks to the rights and freedoms of data subjects;

and (d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data, and to demonstrate compliance with this Regulation, taking into account the rights and legitimate interests of data subjects and other persons concerned”.

Recital 84 goes on to clarify the role in the following terms: “In order to enhance compliance with this Regulation where processing activities are likely to result in a high risk to the rights and freedoms of natural persons, the controller should be responsible for the carrying-out of a data protection impact assessment to evaluate, in particular, the origin, nature, particularity and severity of that risk”.

The same Recital continues: “The outcome of the assessment should be taken into account when determining the appropriate measures to be taken in order to demonstrate that the processing of personal data complies with this Regulation”.


Is a Data Protection Impact Assessment Mandatory?


A DPIA is not mandatory for every personal data processing operation. Indeed, the risk-based approach inherent in the GDPR requires only that a DPIA be carried out when the processing is “likely to result in a high risk to the rights and freedoms of natural persons” (Article 35(1)). The high risk need not be a certainty, but processing that is inherently high risk should attract greater scrutiny. Article 35(3) states that a DPIA “shall in particular be required in the case of:

“(a) a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;

(b) processing on a large scale of special categories of data referred to in Article 9(1), or of personal data relating to criminal convictions and offences referred to in Article 10;

or (c) a systematic monitoring of a publicly accessible area on a large scale”.

The words ‘in particular’ in Art. 35(3) signify that the list is deliberately non-exhaustive. One practical consequence is that some cases will not fall neatly into any ‘high risk’ category, yet still pose a quantifiably high risk. Deciding whether a DPIA is mandatory is, in itself, a risk assessment, a sort of mini DPIA. What is ‘likely to result in high risks’? How is the ‘high risk’ to be assessed?

Recital 84 places emphasis on evaluating the ‘origin, nature, particularity and severity of the risk’. The general backdrop to the high-risk potential includes the nature, context, scope and purposes of the processing. Prudent advice from the Art. 29 WP Guidelines is that, where it is not clear whether a DPIA is required, one should nonetheless be carried out to help data controllers comply with data protection law.

Other Criteria for a DPIA

There is, then, what might be called ‘an assessment before an assessment’. Art. 35(4) envisages the establishment of a list of processing operations that would guide controllers in their scrutiny of operations that may require a DPIA. The Art. 29 WP lays out the relevant criteria to be considered in this regard:

  1. Evaluation or scoring, including profiling and predicting, especially from “aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements” (Recitals 71 and 91). Examples of this could include a bank that screens its customers against a credit reference database, or a biotechnology company offering genetic tests directly to consumers in order to assess and predict the disease/health risks, or a company building behavioural or marketing profiles based on usage or navigation on its website;
  2. Automated-decision making with legal or similar significant effect: processing that aims at taking decisions on data subjects producing “legal effects concerning the natural person” or which “similarly significantly affects the natural person” (Article 35(3)(a)). For example, the processing may lead to the exclusion or discrimination against individuals. Processing with little or no effect on individuals does not match this specific criterion;
  3. Systematic monitoring: processing used to observe, monitor or control data subjects, including data collected through “a systematic monitoring of a publicly accessible area” (Article 35(3)(c)). This type of monitoring is a criterion because the personal data may be collected in circumstances where data subjects may not be aware of who is collecting their data and how it will be used. Additionally, it may be impossible for individuals to avoid being subject to such processing in public (or publicly accessible) space(s);
  4. Sensitive data: this includes special categories of data as defined in Article 9 (for example information about individuals’ political opinions), as well as personal data relating to criminal convictions or offences. An example would be a general hospital keeping patients’ medical records or a private investigator keeping offenders’ details. This criterion also includes data which may more generally be considered as increasing the possible risk to the rights and freedoms of individuals, such as electronic communication data, location data, financial data (that might be used for payment fraud). In this regard, whether the data has already been made publicly available by the data subject or by third parties may be relevant. Where personal data is publicly available, this aspect may be considered as a factor in the assessment if the data was expected to be further used for certain purposes. This criterion may also include information processed by a natural person in the course of purely personal or household activity (such as cloud computing services for personal document management, email services, diaries, e-readers equipped with note-taking features, and various life-logging applications that may contain very personal information), whose disclosure or processing for any other purpose than household activities can be perceived as very intrusive;
  5. Data processed on a large scale: the GDPR does not define what constitutes large-scale, though Recital 91 provides some guidance. In any event, the WP29 recommends that the following factors, in particular, be considered when determining whether the processing is carried out on a large scale:

(a) the number of data subjects concerned, either as a specific number or as a proportion of the relevant population;

(b) the volume of data and/or the range of different data items being processed;

(c) the duration, or permanence, of the data processing activity;

(d) the geographical extent of the processing activity.

  6. Datasets that have been matched or combined, for example originating from two or more data processing operations performed for different purposes and/or by different data controllers, in a way that would exceed the reasonable expectations of the data subject;

  7. Data concerning vulnerable data subjects (Recital 75): the processing of this type of data can require a DPIA because of the increased power imbalance between the data subject and the data controller, meaning the individual may be unable to consent to, or oppose, the processing of his or her data. For example, employees would often meet serious difficulties in opposing the processing performed by their employer when it is linked to human resources management. Similarly, children can be considered as unable to knowingly and thoughtfully oppose or consent to the processing of their data. This also concerns more vulnerable segments of the population requiring special protection, such as the mentally ill, asylum seekers, the elderly and patients, or any case where an imbalance between the position of the data subject and the controller can be identified;

  8. Innovative use or application of technological or organisational solutions, such as combining fingerprint and facial recognition for improved physical access control. The GDPR makes it clear (Article 35(1) and Recitals 89 and 91) that the use of a new technology can trigger the need to carry out a DPIA. This is because the use of such technology can involve novel forms of data collection and usage, possibly with a high risk to individuals’ rights and freedoms. Indeed, the personal and social consequences of the deployment of a new technology may be unknown. A DPIA will help the data controller to understand and to treat such risks. For example, certain “Internet of Things” applications could have a significant impact on individuals’ daily lives and privacy, and therefore require a DPIA;

  9. Data transfer across borders outside the European Union (Recital 116), taking into consideration, amongst others, the envisaged country or countries of destination, the possibility of further transfers or the likelihood of transfers based on derogations for specific situations set forth by the GDPR;

  10. When the processing in itself “prevents data subjects from exercising a right or using a service or a contract” (Article 22 and Recital 91). This includes processing performed in a public area that people passing by cannot avoid, or processing that aims at allowing, modifying or refusing data subjects’ access to a service or entry into a contract. An example is a bank screening its customers against a credit reference database in order to decide whether to offer them a loan.

The WP29 considers that the more criteria are met by the processing, the more likely it is to present a high risk to the rights and freedoms of data subjects, and therefore to require a DPIA.

As a rule of thumb, a processing operation meeting fewer than two criteria may not require a DPIA, owing to the lower level of risk, while processing operations meeting at least two of these criteria will require one. In some cases, however, processing meeting only one criterion will require a DPIA. Conversely, where the controller considers that processing meeting at least two criteria is nonetheless not “likely to result in a high risk”, the reasons for not carrying out a DPIA must be thoroughly documented. In addition, a data controller subject to the obligation to carry out a DPIA “shall maintain a record of processing activities under its responsibility” (Art. 30(1)), including, inter alia, the purposes of processing, a description of the categories of data and recipients of the data and “where possible, a general description of the technical and organisational security measures referred to in Article 32(1)”, and must assess whether a high risk is likely, even if it ultimately decides not to carry out a DPIA.
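The WP29 rule of thumb lends itself to a simple screening sketch. The Python snippet below is illustrative only: the criterion labels and the `screen_for_dpia` function are hypothetical names we have given to the nine criteria above, not an official tool, and the outcome strings stand in for a controller's documented screening decision.

```python
# Illustrative sketch only: names are hypothetical, not an official checklist.
# Applies the WP29 rule of thumb: two or more criteria generally require a
# DPIA; a single criterion may still do so and should be assessed.

WP29_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_with_legal_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale",
    "matched_or_combined_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_rights_or_service_access",
}

def screen_for_dpia(criteria_met: set) -> str:
    """Return a screening outcome under the WP29 rule of thumb."""
    unknown = criteria_met - WP29_CRITERIA
    if unknown:
        raise ValueError(f"Unrecognised criteria: {sorted(unknown)}")
    if len(criteria_met) >= 2:
        return "DPIA required"
    if len(criteria_met) == 1:
        return "DPIA may be required: assess and document the decision"
    return "DPIA likely not required: document the screening"

# A bank screening customers against a credit reference database meets both
# 'evaluation_or_scoring' and 'prevents_rights_or_service_access'.
print(screen_for_dpia({"evaluation_or_scoring",
                       "prevents_rights_or_service_access"}))
```

Note that even a "not required" outcome should be recorded, since the screening itself is the 'mini DPIA' described above.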

Note: supervisory authorities are required to establish, make public and communicate a list of the processing operations that require a DPIA to the European Data Protection Board (EDPB) (Article 35(4)). The criteria set out above can help supervisory authorities to constitute such a list, potentially with more specific content added in time if appropriate. For example, the processing of any type of biometric data or that of children could also be considered as relevant for the development of a list pursuant to Article 35(4).

The DPC has issued guidelines on processing operations that require a DPIA. Where a documented screening or preliminary risk assessment indicates that the processing operation is likely to result in a high risk to the rights and freedoms of individuals pursuant to Art. 35(1), the DPC has determined that a DPIA will also be mandatory for the following types of processing operations:

1) Use of personal data on a large-scale for a purpose(s) other than that for which it was initially collected pursuant to GDPR Article 6(4);

2) Profiling vulnerable persons including children to target marketing or online services at such persons;

3) Use of profiling or algorithmic means or special category data as an element to determine access to services or that results in legal or similarly significant effects;

4) Systematically monitoring, tracking or observing individuals’ location or behaviour;

5) Profiling individuals on a large-scale;

6) Processing biometric data to uniquely identify an individual or individuals or enable or allow the identification or authentication of an individual or individuals in combination with any of the other criteria set out in WP29 DPIA Guidelines;

7) Processing genetic data in combination with any of the other criteria set out in WP29 DPIA Guidelines;

8) Indirectly sourcing personal data where GDPR transparency requirements are not being met, including when relying on exemptions based on impossibility or disproportionate effort;

9) Combining, linking or cross-referencing separate datasets where such linking significantly contributes to or is used for profiling or behavioural analysis of individuals, particularly where the datasets are combined from different sources where processing was/is carried out for different purposes or by different controllers;

10) Large-scale processing of personal data where the Data Protection Act 2018 requires “suitable and specific measures” to be taken in order to safeguard the fundamental rights and freedoms of individuals.

This list does not remove the general requirement to carry out proper and effective risk assessment and risk management of proposed data processing operations, nor does it exempt the controller from the obligation to ensure compliance with any other obligation of the GDPR or other applicable legislation. Furthermore, it is good practice to carry out a DPIA for any major new project involving the use of personal data, even if there is no specific indication of likely high risk (from the DPC Guidelines).

Ultimate responsibility rests with the controller, as it is the controller who must decide whether or not a ‘high risk’ exists. Such a decision must take a host of factors into account, and when two or more of these factors combine in a processing operation the risk increases. For example, a processing operation could involve new technology, the processing of sensitive data and profiling/evaluation. The factors are not prescriptive, but the office of the DPC has identified some that warrant special attention.

These factors include:

  • Uses of new or novel technologies;
  • Data processing on a large scale;
  • Profiling/Evaluation – evaluating, scoring or predicting individuals’ behaviours, activities or attributes, including location, health, movement, interests and preferences;
  • Any systematic monitoring, observation or control of individuals including that taking place in a public area or where the individual may not be aware of the processing or the identity of the data controller;
  • Processing of sensitive data including that as defined in GDPR Article 9, but also other personally intimate data such as location and financial data or processing of electronic communications data;
  • Processing of combined data sets that goes beyond the expectations of an individual, such as when combined from two or more sources where processing was carried out for different purposes or by different data controllers;
  • Processing of personal data related to vulnerable individuals or audiences that may have particular or special considerations related to their inherent nature, context or environment. This will likely include minors, employees, the mentally ill, asylum seekers, the elderly and those suffering incapacitation;
  • Automated decision making with legal or significant effects. This includes automatic decision making where there is no effective human involvement in the process; and
  • Insufficient protection against unauthorised reversal of pseudonymisation.

Under Art. 35(5) it is open to any Supervisory Authority, in our case the DPC, to set out a list of the kinds of processing operations for which no data protection impact assessment is required. A definitive list pursuant to Art. 35(5) has not been issued by the DPC. The general rule is that any processing that is not ‘likely to result in a high risk to the rights and freedoms of natural persons’ will be exempt from a DPIA. However, deciding what is ‘likely to result in a high risk’ itself demands a ‘mini DPIA’. Despite the absence of a definitive list, the office of the DPC, in a publication on DPIAs, lays out some examples of processing operations not requiring a DPIA:

A previous DPIA was carried out and found no risk;

Processing has been authorised by the DPC;

Processing is being carried out pursuant to point (c) or (e) of Art. 6(1) GDPR. Point (c) refers to processing necessary for compliance with a legal obligation; point (e) to processing necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. In both cases there must be a clear legal basis under EU or Member State law AND a DPIA must already have been conducted under Art. 35(10).
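The exemption examples above can be condensed into a short screening check. The sketch below is illustrative only; the function and parameter names are invented for this example, and the logic merely mirrors the DPC examples quoted above rather than any official test.

```python
# Hypothetical sketch (names invented for illustration) of the DPC's
# example cases where a fresh DPIA is not required.

def dpia_not_required(prior_dpia_found_no_risk: bool,
                      authorised_by_dpc: bool,
                      legal_basis_6_1_c_or_e: bool,
                      clear_eu_or_member_state_basis: bool,
                      dpia_done_under_art_35_10: bool) -> bool:
    """True if the processing matches one of the DPC's example exemptions."""
    if prior_dpia_found_no_risk or authorised_by_dpc:
        return True
    # An Art. 6(1)(c)/(e) legal basis exempts the processing only when BOTH
    # a clear EU or Member State legal basis exists AND a DPIA was already
    # conducted under Art. 35(10).
    return (legal_basis_6_1_c_or_e
            and clear_eu_or_member_state_basis
            and dpia_done_under_art_35_10)
```

The point the code makes is that the Art. 6(1)(c)/(e) route is conjunctive: the legal basis alone does not remove the DPIA obligation.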

On balance, it is advisable to have a Data Protection Impact Assessment carried out. In many cases, the minimum content of the assessment as set out under Art. 35(7)(a) to (d) GDPR will be sufficient to ensure compliance and bring peace of mind to an organisation conducting the processing operations.

We are GDPR and data protection consultants with vast expertise in conducting DPIAs. We are located in Carlow/Kilkenny and Mayo, offering a nationwide service.

Call 0858754526 or 0599134259 to discuss your particular needs.

Patrick Rowland

