The Evolving Role of the Data Protection Officer (DPO) in Modern Compliance

Introduction

In today’s complex regulatory landscape, the role of the Data Protection Officer (DPO) has become more critical – and multifaceted – than ever. The DPO’s remit now extends into overlapping domains of privacy, technology, and security. From ensuring core GDPR compliance to grappling with emerging laws like the EU Artificial Intelligence Act, the Digital Operational Resilience Act (DORA), and the NIS2 Directive, DPOs serve as the cornerstone of organisational accountability. At XpertDPO, we have witnessed this evolution first-hand. Our team of DPO experts collectively oversees millions of personal records across 50+ countries, providing pragmatic guidance to clients ranging from public bodies to startups. Join us as we explore the full scope of a DPO’s role in 2025. We draw on XpertDPO’s specialists’ extensive experience and our company ethos of trust, transparency, education, and sustainable compliance as we examine how DPO responsibilities have expanded under new regulations, how DPOs champion the rights of vulnerable individuals and oversee high-risk processing operations, and why an ethical, human-centred approach is paramount in data protection. Whether your organisation is considering an outsourced DPO service or looking for specialist DPO support to reinforce an in-house team, understanding the DPO’s evolving role is key to building a strong compliance foundation.

The Core Responsibilities of a DPO

Under GDPR, a Data Protection Officer’s duties are clearly defined, serving as a baseline for today’s broader expectations. Article 39 of the GDPR outlines tasks including monitoring and auditing compliance, informing and advising the controller or processor of their obligations, raising awareness and training staff, providing advice on Data Protection Impact Assessments (DPIAs), and acting as the contact point for supervisory authorities. In practice, this means the DPO is an independent advisor and watchdog: they help map and check processing activities, ensure policies are followed, and advise on how to embed privacy by design in projects. Crucially, the GDPR mandates that the DPO must be involved “properly and in a timely manner” in all issues relating to personal data. This positions the DPO as an important voice in corporate governance. Ideally, a DPO participates in senior management discussions and any decision with data protection implications. By being at the table early, DPOs facilitate compliance and champion a privacy-by-design approach in business processes. In essence, a DPO functions as a “compliance orchestrator”, liaising between stakeholders such as management, IT, legal, data subjects, and regulators to ensure data protection is woven into the organisational fabric.

At XpertDPO, we reinforce these core responsibilities with a pragmatic and client-centric approach. As Stuart Anderson (Founder & CEO) emphasises, doing things “the right way, both ethically and morally” is non-negotiable. Our DPOs do not engage in fear-based compliance or box-ticking exercises. Instead, we focus on educating stakeholders and delivering realistic, risk-based advice that builds trust. For example, rather than issuing academic reports of problems, we provide practical guidance and leadership to implement solutions, helping clients build a robust yet sustainable compliance framework. This approach aligns with our belief that compliance is not about stoking fear of fines, but about empowering organisations to handle data responsibly, ultimately turning regulatory requirements into opportunities to strengthen reputation and efficiency. By fulfilling the DPO’s core tasks with integrity and insight, and by being a visible champion of privacy, the DPO lays the groundwork for a culture of compliance that stands up to scrutiny.

Beyond GDPR: Navigating an Expanding Regulatory Landscape

In 2018, the GDPR set the stage for data protection, but the regulatory horizon has continued to broaden. Today’s DPO must interpret and integrate multiple overlapping frameworks, effectively wearing several hats at once. The introduction of the EU Artificial Intelligence Act in 2024 is a prime example. This regulation creates new compliance obligations for organisations deploying AI, especially those developing “high-risk” AI systems. DPOs are now expected to understand AI-specific requirements around data governance, transparency, algorithmic fairness and bias, and to ensure these are addressed alongside GDPR obligations. For instance, if a company uses personal data to train machine learning models, the DPO needs to verify that transparency requirements (e.g. notifying individuals about automated decisions) and risk mitigation measures for AI are in place on top of standard GDPR compliance. Notably, regulators themselves are recognising this convergence: the Irish Data Protection Commission (DPC) recently sought a European Data Protection Board opinion on aspects of AI regulation, underscoring the complexity and the urgent need for DPO expertise in managing overlapping compliance regimes.

Beyond AI, other EU initiatives add to the DPO’s portfolio. The EU Political Advertising Regulation (adopted in 2024) directly involves data protection authorities in monitoring political ads, which means DPOs in any organisation engaged in political advertising must expand their oversight to include stringent transparency and consent rules during election periods. Additionally, DPOs cannot ignore the Digital Services Act (DSA) and Digital Markets Act (DMA): while primarily focused on online platforms and competition, these laws intersect with privacy (for example, content moderation data or online advertising data use) and may require DPO input on compliance strategies. Furthermore, classic data protection companions like the ePrivacy Directive remain relevant, especially for industries handling electronic communications or cookies. In sum, DPOs face a “multifaceted digital regulation landscape” that demands a broader knowledge base.

Two relatively new frameworks are reshaping cybersecurity and resilience expectations, which in turn influence the DPO’s role: DORA and NIS2. The Digital Operational Resilience Act (DORA), effective from early 2025, introduced harmonised ICT risk management requirements for banks, insurers, and other financial entities. While DORA is about cybersecurity and operational continuity, it dovetails with data protection: for example, ensuring that personal data remains secure and accessible during cyber incidents or outages is part of both DORA resilience and GDPR integrity requirements. In organisations subject to DORA, a DPO will likely collaborate with risk and IT teams to align data protection controls with the broader operational resilience framework. This might include advising on vendor due diligence (since third-party ICT providers are in scope) and incident response plans so that a cyber attack doesn’t also become a data breach disaster. Meanwhile, the NIS2 Directive (which EU member states transpose into national laws in 2024–2025) greatly expands the range of sectors obliged to maintain robust cybersecurity measures and report incidents. NIS2 marks a shift in viewing cybersecurity not just as an IT issue but as a governance and legal compliance issue, a shift DPOs are well-positioned to appreciate. As NIS2 significantly widens the range of in-scope sectors (including healthcare, energy, transport, public administration, digital providers, and more), many more organisations will fall under both NIS2 and GDPR. This convergence means DPOs will work more closely with CISOs and security officers, embedding privacy considerations into cyber risk management and breach response, and ensuring that personal data breach notification (under GDPR) aligns with NIS2’s incident reporting duties. In effect, the DPO becomes a key player in enterprise cybersecurity governance, helping bridge the gap between technical security measures and legal compliance.

The net effect of these developments is a dramatic increase in the scope and complexity of the DPO’s role. Internally, we describe this as the DPO moving from “GDPR specialist” to holistic compliance strategist. An internal analysis by our team in June 2025 noted how this evolving landscape “significantly increases the complexity and breadth of responsibilities falling under DPOs”. DPOs must broaden their expertise to cover AI ethics, advertising rules, sectoral laws, and cybersecurity standards. This brings challenges: continuous training needs, potential resource gaps, and greater risk exposure for organisations that fall behind. Overlapping regulations mean that non-compliance risks now come from multiple angles: not just data protection authorities, but also consumer protection bodies, AI oversight bodies, banking regulators, and more. Keeping up can strain even well-resourced compliance teams. As the DPC observed in its annual report, regulatory expectations are rising, and organisations must empower their DPOs with adequate resources to meet these demands. For many, this means investing in training, better tools, and perhaps augmenting the DPO function with external expertise (a point we’ll return to when discussing support models). The takeaway is clear: a modern DPO operates in a broad regulatory context, acting as a linchpin to interpret and reconcile various compliance requirements. With the right support, a DPO can turn this challenge into an opportunity by helping the organisation stay ahead of the curve and avoid the costly scramble of last-minute compliance firefighting.

Driving Accountability: RoPA and Governance Structures

An effective DPO establishes strong governance practices to ensure ongoing accountability. A fundamental tool in the DPO’s kit is the Record of Processing Activities (RoPA). Article 30 GDPR requires organisations to document what personal data they process, for what purposes, where it flows, how long it’s kept, and what safeguards protect it. While this can seem like a paperwork exercise, a well-maintained RoPA is actually the backbone of a privacy governance program. It demonstrates accountability (fulfilling GDPR’s Article 5(2) principle) and provides an up-to-date map of the organisation’s data processing landscape. XpertDPO’s experience has shown that a comprehensive RoPA is more than a compliance checklist – it’s a practical dashboard for the DPO and stakeholders. For example, our RoPA development methodology for clients uses the “5Ws” (Who, What, Why, Where, When) to capture each processing activity in detail. This granular approach ensures that for every process we document the purpose and legal basis (Why), the data involved (What), the responsible parties and affected data subjects (Who), the IT systems and storage locations (Where), and the retention period and associated safeguards (When). The result is a living document that supports transparency, audit readiness, and internal oversight, and links directly to other compliance tasks like DPIAs and policy reviews. A good record forms the foundation for managing data subject rights, conducting DPIAs, responding to information requests, and handling breaches. In other words, when a DPO has a clear picture of the data flows, they can more easily flag risk areas, respond to incidents, and guide decision-making across departments.
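
To make the 5Ws concrete, here is a minimal sketch of how a single RoPA entry could be captured in a structured form; the field names and the recruitment example are illustrative assumptions rather than XpertDPO’s actual template.

    from dataclasses import dataclass, field, asdict
    from typing import List
    import json

    # A single Record of Processing Activities (RoPA) entry structured around the "5Ws".
    # Field names and the example values are illustrative, not a prescribed template.
    @dataclass
    class RoPAEntry:
        activity: str                      # name of the processing activity
        why: str                           # purpose and legal basis (Articles 6/9 GDPR)
        what: List[str]                    # categories of personal data involved
        who: List[str]                     # responsible parties and affected data subjects
        where: List[str]                   # IT systems and storage locations
        when: str                          # retention period and review schedule
        safeguards: List[str] = field(default_factory=list)  # security measures applied

    # Hypothetical example: a recruitment and candidate screening process.
    entry = RoPAEntry(
        activity="Recruitment and candidate screening",
        why="Steps prior to entering an employment contract (Article 6(1)(b) GDPR)",
        what=["name", "contact details", "CV", "interview notes"],
        who=["HR team (internal)", "job applicants (data subjects)"],
        where=["Applicant tracking system (EU-hosted)", "HR shared drive"],
        when="Deleted 12 months after the recruitment campaign closes",
        safeguards=["role-based access control", "encryption at rest"],
    )

    # Structured entries can be exported for audits or cross-referenced with DPIAs.
    print(json.dumps(asdict(entry), indent=2))

Keeping entries in a structured format like this makes it straightforward to export the register for audits or to cross-reference it with DPIAs and policy reviews.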

Beyond maintaining records, DPOs influence the broader governance structure for data protection. This often involves establishing or advising a privacy governance committee, defining data protection policies and standard operating procedures, and integrating data protection into corporate risk management. For example, DPOs might introduce a DPIA procedure, a data breach response plan, and regular compliance reporting to the executive or board level. At XpertDPO, our DPOs frequently help clients set up governance frameworks that fit their context, be it a multinational tech company or a local government authority. A recurring theme is ensuring clarity of roles and inter-departmental cooperation. The DPO often works as a liaison between legal, IT, HR, and business units to implement consistent practices. A notable area is interagency data sharing, especially for public sector bodies that must share data to deliver services. Here, the DPO’s governance role is crucial in drafting data sharing agreements or MOUs, defining each party’s responsibilities, and ensuring compliance with laws while enabling the underlying public interest task. We’ve seen how poor frameworks can impede critical work: a recent analysis of child protection services in Ireland highlighted how the lack of a statutory data-sharing framework between agencies (e.g. child and family services, police, health) created hesitancy and gaps in protecting children. DPOs in such sectors strive to prevent data protection from becoming a barrier by advocating for “safe harbours” in the law or clear guidelines that allow information exchange when it’s truly necessary to protect vital interests like a child’s safety. The Irish DPC has explicitly stated that “data protection should never be used as an excuse, blocker or obstacle to sharing information where doing so is necessary to protect the vital interests of a child”. A DPO upholds this principle by ensuring that staff understand the circumstances under which data can be shared lawfully (for example, under GDPR’s vital interest legal basis or relevant national laws), and by putting in place protocols so that urgent information flows happen with proper safeguards (such as documented risk assessments and post-sharing reviews).

In summary, strong governance and record-keeping are where the DPO’s diligence shines. By maintaining comprehensive RoPAs and clear procedures, the DPO helps the organisation stay audit-ready and prevents compliance from slipping through the cracks as business and technology evolve. Governance is also the arena where the DPO’s value to the leadership becomes evident: through regular reporting and metrics (e.g. number of DPIAs done, training completed, incidents managed), the DPO gives the board or management a clear view of the organisation’s data protection posture. This reporting elevates data protection to a standing item on the governance agenda, fostering a culture of accountability from the top down.

Privacy by Design and Data Protection Impact Assessments (DPIAs)

Given the rapid pace of digital innovation, one of the DPO’s key roles is to operationalise privacy by design, embedding data protection considerations into new projects, systems, or business processes from the outset. The primary tool for this is the Data Protection Impact Assessment (DPIA). Whenever an initiative is likely to result in high risk to individuals (think: deploying surveillance cameras, launching a customer profiling program, rolling out a new health app, or implementing AI to make decisions about people), a DPIA is required under GDPR. The DPO is central to this process: GDPR explicitly requires controllers to seek the DPO’s advice on DPIAs (GDPR Article 35(2)). In practice, we find the DPO often initiates or coordinates the DPIA process, because business teams may not always recognise when a DPIA is needed.

A DPO brings methodology to DPIAs by identifying high-risk indicators such as use of special category data, large-scale profiling, vulnerable individuals’ data, or innovative tech usage. These factors raise red flags that trigger a deeper examination. In client work, we regularly see projects that combine multiple high-risk factors: the collection of sensitive data (health and criminal check information), processing affecting thousands of people within a nationwide mandated system, and the involvement of children’s data through vetting processes. As DPOs, we ensure organisations recognise these triggers and complete a DPIA. We help outline the assessment to cover all bases: the necessity and proportionality of the new processing, identification of privacy risks, consultation of stakeholders, and documentation of mitigation measures like access controls and data minimisation. We advise clients to integrate the DPIA timeline with their project timeline so that privacy considerations can influence design decisions (rather than being an afterthought). By doing so, when new regulations or systems go live, privacy safeguards are baked in (for example, forms redesigned to limit data collected, clear consent language added, extra encryption on the database), and the residual risks are reduced to an acceptable level.
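
As a simple illustration of how these triggers can be operationalised, the sketch below screens a project against a list of common high-risk indicators; the indicator list and the “two or more indicators” rule of thumb are simplifications, and real screening should follow the EDPB guidelines and the supervisory authority’s published DPIA lists.

    # Screening sketch: flag projects that are likely to need a DPIA. The indicator list
    # is a simplified illustration of common high-risk criteria; real screening should
    # follow the EDPB guidelines and the supervisory authority's published DPIA lists.
    HIGH_RISK_INDICATORS = {
        "special_category_data",      # health, biometric or criminal-related data
        "large_scale_processing",
        "systematic_monitoring",
        "profiling_or_scoring",
        "vulnerable_data_subjects",   # e.g. children
        "innovative_technology",      # e.g. AI-based decision support
        "data_matching_or_combining",
        "blocks_rights_or_services",  # processing that can prevent access to a service
    }

    def dpia_screening(project_indicators: set) -> str:
        """Return a rough recommendation based on how many indicators a project hits."""
        hits = sorted(project_indicators & HIGH_RISK_INDICATORS)
        if len(hits) >= 2:
            return f"DPIA recommended ({len(hits)} high-risk indicators: {hits})"
        if hits:
            return f"Consider a DPIA and document the decision (indicator: {hits})"
        return "No DPIA trigger identified; record the screening outcome"

    # Hypothetical vetting system touching sensitive data, children's data and a national rollout.
    print(dpia_screening({"special_category_data", "vulnerable_data_subjects",
                          "large_scale_processing"}))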

Conducting thorough DPIAs not only protects individuals but also the organisation: it’s far better to catch a potential compliance issue or design flaw early than to discover it under regulatory scrutiny or through a breach. DPOs also decide when to escalate a DPIA to regulators. Under GDPR, if a DPIA finds high risk that cannot be mitigated, the supervisory authority (in Ireland, the DPC) must be consulted (Article 36). A savvy DPO will work hard to address risks to avoid this outcome but will not shy away from recommending consultation when warranted. Our team has experience preparing consultation packages for regulators, where we, on behalf of the client, demonstrate due diligence and seek feedback. This process can actually build trust with regulators if handled transparently.

Moreover, DPIAs are not one-off checkboxes; they should be revisited throughout a project lifecycle. XpertDPO encourages clients to treat DPIAs as living documents that are updated when something changes (like a new data element added, a new recipient, or a change in technology). We help create DPIA templates and procedures so that internal teams can carry out preliminary assessments and know when to involve the DPO. As part of specialist DPO support services, we might review clients’ DPIAs and provide a “second pair of eyes” to ensure nothing is missed. This is especially valuable in emerging areas such as AI ethics assessments, where domain-specific questions (e.g. about algorithmic bias or explainability) need to be integrated into the DPIA process. Ultimately, by leading DPIAs and advocating privacy by design, the DPO functions as an enabler: enabling the business to innovate with confidence. When product teams know the DPO will guide them through a DPIA, they are less likely to delay or avoid compliance, and they see it as part of the project’s quality assurance. This proactive stance is far better than patching privacy onto a finished product. It’s a tangible way the DPO helps “get ahead” of problems rather than react after the fact, a value XpertDPO instils in all our engagements.

Championing Data Subject Rights and a Rights-Based Practice

One of the DPO’s most important duties is to ensure that the organisation respects and facilitates the rights of individuals (data subjects). Under GDPR and related laws, people enjoy a suite of rights, to access their data, correct it, erase it, object to processing, and more, and these rights are grounded in fundamental values of privacy and personal autonomy. A DPO therefore acts as a champion of these rights internally, often designing the processes by which the company handles requests and ensuring a respectful, lawful response every time. This is sometimes referred to as a “rights-based practice” in data protection: keeping the impact on the individual’s rights and freedoms at the centre of all decisions.

At XpertDPO, we place particular emphasis on protecting the rights of vulnerable individuals, such as children, the elderly, or those with impaired decision-making capacity, because the risks to them can be greater. In fact, our Governance & Policy Lead, Dolores Martyn, has been nationally recognised for her contributions to safeguarding children’s data rights, earning awards that reflect XpertDPO’s commitment to strong, human-centred data governance. But what does protecting vulnerable groups mean in practice for a DPO? For one, it involves ensuring enhanced safeguards and ethical scrutiny when data about vulnerable people is processed. For example, children have specific protections under GDPR (like requiring parental consent for younger children’s data in online services, per Article 8 GDPR). A DPO will advise their organisation to build compliant age-verification or consent mechanisms and to apply stricter standards of transparency that a child can understand. Moreover, DPOs must weigh the balance between protection and participation rights of children – enabling, say, a teenager’s right to have a say in how their data is used (consistent with the UN Convention on the Rights of the Child), while still ensuring their safety. In healthcare or social care contexts, when dealing with patients with limited capacity or adults under guardianship, a DPO needs to verify that the legal basis for processing is sound (perhaps relying on consent from a legal representative or on vital interest grounds) and that any data sharing with caregivers or agencies is done lawfully and minimally. These scenarios can be complex; an experienced DPO navigates applicable laws such as assisted decision-making legislation or mental health acts in tandem with GDPR.

A rights-based practice also means that when individuals exercise their rights (like filing a Subject Access Request, or asking to delete their data), the organisation responds in good faith and within legal timeframes. The DPO typically establishes internal procedures for this: how to log requests, authenticate the requester, search for data, and provide a complete and clear response. If any exemptions apply (for example, certain law enforcement data may be exempt from access), the DPO advises on applying them narrowly and consistently. In our work, we often provide DSAR support services to clients, helping them streamline handling of access requests, especially when volumes spike or when dealing with sensitive records. We have found that clear guidance for staff and the use of templates (for acknowledgments, response letters, etc.) greatly reduce errors in rights responses. The DPO also trains frontline employees (like customer service or HR staff who might receive requests first) to recognise a data rights request and route it properly. By doing so, rights requests become an opportunity to build trust with individuals rather than a compliance headache.
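
As a small illustration, a request log can track each DSAR against its statutory deadline; the sketch below assumes the standard GDPR timeframe of one month, extendable by two further months for complex requests (Article 12(3)), and approximates months as 30-day blocks.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Sketch of a subject access request (DSAR) log entry. GDPR allows one month to
    # respond, extendable by two further months for complex or numerous requests
    # (Article 12(3)); 30-day blocks are used here as a simplification.
    @dataclass
    class AccessRequest:
        reference: str
        received: date
        identity_verified: bool = False
        extended: bool = False        # any extension must be communicated within the first month
        status: str = "logged"        # logged -> verified -> searching -> responded

        def due_date(self) -> date:
            months = 3 if self.extended else 1
            return self.received + timedelta(days=30 * months)

        def days_remaining(self, today: date) -> int:
            return (self.due_date() - today).days

    # Hypothetical request logged in early June and checked mid-month.
    req = AccessRequest(reference="DSAR-2025-014", received=date(2025, 6, 2))
    print(req.due_date(), req.days_remaining(date(2025, 6, 20)))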

Another crucial area is handling complaints and queries from data subjects. DPOs frequently act as an escalation point for data protection complaints, trying to resolve issues amicably before they go to regulators. For instance, if someone complains that they can’t unsubscribe from a mailing list or that their personal data was unfairly used, the DPO investigates and mediates a resolution (perhaps ensuring the data is deleted and the cause of the lapse is fixed). This responsive, people-focused approach is not only required by law but is also core to Stuart’s values of “client trust and doing right by people”. An ethical DPO does not view individuals’ rights as annoying obligations; rather, they acknowledge that these rights are the manifestation of privacy as a human right within the organisation’s daily operations.

Finally, a rights-based ethos influences how an organisation designs its services and policies. Take, for example, the rise of AI and algorithmic processing: a DPO with a rights-oriented mindset will push for mechanisms that allow individuals to contest automated decisions or to get meaningful information about how an algorithm uses their data (GDPR’s Article 22 and transparency requirements). Similarly, in marketing activities, the DPO might advocate for ethical marketing practices (e.g. not targeting vulnerable segments like children with certain ads, or respecting opt-outs diligently), even beyond what the law strictly requires, as a matter of company values. This approach feeds into sustainable compliance: doing the right thing not only checks the legal box but also upholds the organisation’s reputation and social responsibility. In regulated sectors or public service, this is doubly important; public-sector DPOs often adhere to broader public law principles and human rights frameworks. We’ve guided public agencies in adopting a “human rights impact assessment” lens alongside privacy impact assessments for initiatives, to ensure that things like equality and non-discrimination are considered. All these efforts by the DPO help ensure that data protection is not just about avoiding fines, but about respecting and reinforcing individual rights, which in turn builds public trust.

Supporting High-Risk Processing and Innovative Technologies

Organisations engaged in high-risk processing activities, such as large-scale profiling, biometric data use, or systematic monitoring, rely heavily on their DPO to keep those activities in check. High-risk processing is often where law, ethics, and technology collide. A classic example is deploying facial recognition or biometric identification: inherently high-risk due to sensitivity and potential impact on individuals. A DPO’s role here starts from initial risk assessment (DPIA), as discussed, but extends to continuous oversight. The DPO will set conditions for such processing (e.g. requiring that biometric data be encrypted, access-controlled, and periodically reviewed for necessity). They will also monitor outcomes, for instance, checking if an AI system’s outputs have bias against a protected group, or if a marketing algorithm ends up profiling vulnerable consumers in ways that might be unfair or intrusive.

Another burgeoning area is the use of personal data in training AI or in big data analytics. There is often uncertainty in these domains about how GDPR principles apply. DPOs serve as translators between data scientists and legal requirements: explaining to AI developers what “data minimisation” means in practice or helping find ways to pseudonymise data sets so that innovation can proceed in a privacy-preserving manner. Under the AI Act, certain high-risk AI systems will require a conformity assessment and risk management, which will include looking at training data governance, record-keeping, transparency, and human oversight. It’s natural that a DPO, with expertise in data governance, will be part of that compliance effort. Indeed, our team’s internal discussions have noted that “DPOs must now interpret and integrate compliance with GDPR alongside AI-specific obligations like transparency, accountability, and bias oversight”. In anticipation of the AI Act, we have been conducting AI risk assessments for clients (essentially, mini-DPIAs focused on AI systems), examining not just privacy, but also the broader ethical impacts of AI deployments. This proactive stance means the DPO helps the organisation steer AI innovation responsibly, avoiding pitfalls that could harm individuals or lead to public backlash.

High-risk processing often goes hand in hand with heightened regulatory scrutiny. Whether it’s a financial firm handling large volumes of sensitive financial data, a health-tech company processing genetic information, or a social media platform serving minors, regulators pay close attention. A seasoned DPO recognises this and will ensure the organisation is “regulator-ready.” That involves thorough documentation (policies, DPIAs, records of decisions), internal audits, and sometimes even engaging with the regulator in advance. For example, some of our public-sector clients hold biannual meetings between their DPO and the DPC’s office to discuss upcoming projects, a practice we encourage as it demonstrates transparency and can yield informal guidance. Even in the private sector, if a company is planning something novel (say, rolling out a new IoT device that collects personal data in public spaces), we might advise them to seek advisory consultation with the regulator or at least prepare a briefing in case questions arise. The DPO would typically spearhead this, framing the issues in a compliance context. This approach aligns with XpertDPO’s value of proactive regulatory engagement, ensuring our clients are not caught off guard by new enforcement trends.

Moreover, when things do go wrong, such as a suspected data breach or security incident, high-risk environments benefit immensely from DPO expertise. The DPO coordinates breach investigations, assesses notification obligations, and recommends remedial actions. In high-risk sectors, regulators often inquire not just about the breach itself but how the organisation’s leadership and DPO responded. We have provided breach response support as part of our specialist DPO services, essentially being on-call for clients when an incident hits, to guide them on containment, forensics liaison, communications, and legal reporting. This “muscle memory” from handling multiple breaches across industries allows us to give calm, pragmatic advice under pressure, which is a lifesaver for a small organisation facing its first serious incident. By learning from each incident (internal or industry-wide), DPOs also feed improvements back into the system: updating policies, enhancing access controls, instituting new training if human error was a cause, etc. This continuous improvement mindset is particularly crucial in high-risk processing operations where even a small lapse can have severe consequences.
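
To show how the initial triage can be structured, here is a simplified sketch based on the GDPR rules that the supervisory authority must be notified without undue delay and, where feasible, within 72 hours of becoming aware of a breach (Article 33), and that affected individuals must be informed where the risk to them is high (Article 34); the risk categories and decision logic are illustrative assumptions only.

    from datetime import datetime, timedelta

    # Simplified breach triage sketch. Under GDPR, notify the supervisory authority without
    # undue delay and, where feasible, within 72 hours of becoming aware of a personal data
    # breach (Article 33), unless it is unlikely to result in a risk to individuals; notify
    # affected individuals where the risk is high (Article 34). All breaches are documented.
    def breach_triage(aware_at: datetime, risk_level: str) -> dict:
        risk_level = risk_level.lower()            # expected values: "unlikely", "risk", "high"
        return {
            "notify_authority": risk_level in {"risk", "high"},
            "notify_individuals": risk_level == "high",
            "authority_deadline": aware_at + timedelta(hours=72),
            "document_internally": True,           # Article 33(5): keep an internal breach register
        }

    # Hypothetical incident detected on a Friday evening and assessed as high risk.
    print(breach_triage(datetime(2025, 3, 7, 19, 30), "high"))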

Public Sector DPOs and Interagency Collaboration

Public sector organisations (like government departments, local authorities, health services, education bodies) are mandated under GDPR to appoint DPOs. These DPOs face some unique challenges: they deal with large-scale citizen data, often including sensitive categories (health, criminal, social welfare information), and they must juggle GDPR with sectoral laws and duties to provide public services. One key aspect we see in the public sector is the need for interagency data sharing. Effective public services sometimes require multiple agencies to coordinate: for example, a child protection case might involve education, the child and family agency, police, and healthcare providers. However, each agency has its own legal obligations and constraints on sharing data. The DPO in a public agency thus must be well-versed in not only GDPR, but also any enabling legislation or obstacles to sharing. As noted earlier, gaps in legal frameworks can create friction, something a public-sector DPO often has to flag and work around by establishing interim agreements or protocols. Public-sector DPOs also handle Freedom of Information (FOI) or Access to Information requests that intersect with personal data. While FOI is separate from GDPR, they often overlap (e.g., someone requests records that include personal data of others, and the DPO might need to advise on redactions under data protection principles).

Another point for public bodies is demonstrating compliance transparently. The public rightfully expects that government and services handle data with a high level of care. DPOs in public bodies might publish summaries of Data Protection Impact Assessments for significant projects or maintain publicly accessible records of processing (some countries encourage this). They also tend to provide more extensive privacy notices and engage in public consultations regarding new programs that involve personal data. For example, if a city council introduces CCTV in a new area, the DPO might oversee a public consultation or communications plan explaining the privacy implications and safeguards, to build public trust. This goes hand in hand with rights-based practice; the public sector often serves vulnerable populations (e.g., recipients of social services), so the DPO’s role in protecting those individuals’ data and ensuring fair processing is critical.

XpertDPO has substantial experience acting as outsourced DPO for public-sector clients, including departments, regulatory bodies, and healthcare agencies. We understand that public service ethos must align with data protection. One concrete example: in one of our engagements, a government agency needed to launch an interagency platform for case management involving minors. We provided an outsourced DPO who worked closely with all stakeholders to design a governance model: drafting Data Sharing Agreements between agencies, defining access controls based on roles, and setting up a joint oversight committee for the platform’s data use. The DPO ensured that child safeguarding remained the paramount concern, echoing the DPC’s guidance that child welfare can justify data sharing, but also implemented strict audit logs and consent procedures where appropriate to protect privacy. This balanced approach allowed the agencies to cooperate more freely, knowing that clear rules were in place and that a DPO was monitoring compliance continuously.

Public bodies also often find themselves pioneers in responding to new laws (since governments implement laws like NIS2 or sectoral rules). Their DPOs, therefore, might be among the first to encounter how those intersect with GDPR. For instance, under the EU’s Law Enforcement Directive (for police data) or regulations like the proposed EU Child Sexual Abuse Regulation (which may require certain content scanning), DPOs in those authorities must carefully reconcile privacy with other legal mandates. It’s a delicate position but one where the DPO’s balanced view and ethical stance are invaluable. As Stuart frequently reminds our public-sector clients, transparency and honesty about what you’re doing with data go a long way in maintaining public trust. A DPO will thus encourage public bodies to be open about their data processes, report breaches to authorities and affected people promptly, and remediate issues comprehensively. That level of openness can be difficult in practice, but it reinforces the public’s confidence that someone (the DPO) is independently watching over how their information is handled.

XpertDPO’s Ethos: Trust, Transparency, Education, Sustainable Compliance

Throughout all these facets of the DPO role, certain core values shine through, values that XpertDPO has embraced from day one. Our philosophy is built on ethical practice, client trust, and pragmatism. We believe a DPO must be more than a rule-enforcer; they must be a trusted advisor who helps the organisation do the right thing, not because of fear of penalties, but because it’s integral to the business’s integrity and success.

  • Trust and Transparency: A DPO’s effectiveness hinges on trust, both the trust they build with internal stakeholders (so people come to them with issues and involve them early) and the trust they help cultivate with customers or the public. We ingrain transparency as a habit: being honest about compliance gaps, giving management a realistic picture of risks, and communicating openly with data subjects. Stuart insists on providing realistic, achievable outcomes for clients, even if it means delivering hard truths. In practice, this might mean telling a client that a project is too risky unless significant changes are made or acknowledging to a regulator when we don’t have all the answers but are committed to improvement. By not overpromising (“we do not promise 100% compliance” is a mantra), we set attainable goals and then deliver on them. This honesty nurtures long-term relationships, where clients trust our advice because they know we are not hiding inconvenient facts or sugarcoating compliance obligations.
  • Education and Empowerment: Education is at the heart of XpertDPO’s mission. We see every engagement as an opportunity to raise our clients’ understanding of data protection. A great DPO doesn’t hoard knowledge; they share it widely, training staff from the C-suite to the front line. We offer formal training modules (through our XpertAcademy, for example) and informal mentorship. Our approach with clients is very collaborative: rather than just handing down a policy, we explain the why – why this policy matters and how it protects the business and individuals. This aligns with Stuart’s view that educating clients fosters compliance that is pragmatic and sustainable. An example is how we handle a DPIA with a client’s team: we’ll involve their project managers or IT leads in the process, teaching them how to identify privacy risks, so next time they already have that mindset. Over time, this builds an internal culture where people become privacy champions in their own right, reducing reliance on the DPO for every small decision. Education also means staying educated ourselves: our DPOs continuously update their skills (be it getting AI governance certifications or attending cybersecurity forums) so that we can expertly guide clients through new challenges.
  • Pragmatism and Sustainable Compliance: We define sustainable compliance as compliance that aligns with business objectives and can be maintained long-term. It’s easy to draft a thick policy document; it’s harder to implement practices that actually stick. Our pragmatic ethos means we tailor solutions to each client’s reality: their industry, size, risk appetite, and resources. For small businesses, for instance, we might implement lean frameworks: a concise RoPA and a simple breach checklist might be more practical than enterprise-grade GRC tools. For larger entities, scalability and integration with existing processes are key, so we might embed privacy controls into their IT change management or procurement workflows. In all cases, we avoid ‘compliance theatre’ and focus on measures that truly reduce risk. A telling example from our proposals: “We do not conduct an academic assessment that merely identifies flaws; our objective is to provide practical guidance and leadership to implement a robust, scalable GDPR compliance framework”. We live by this in DPO service delivery. The ultimate test of sustainable compliance is whether the organisation can handle data protection in stride with its growth and changes. Seeing our long-term clients pass regulatory audits or swiftly integrate privacy into a new product launch validates this approach.
  • Independence and Integrity: A DPO must have the courage of their convictions. Independence is legally protected (GDPR says DPOs shouldn’t be instructed on their advice or penalised for their work), and we fiercely guard that independence, especially as outsourced DPOs. We make it clear in engagements that while we operate as part of the client team, our first duty is to data protection law and ethical practice. This sometimes means giving unwelcome news or pushing back on risky initiatives. But by documenting our advice and the rationale, and by communicating effectively, we maintain credibility. Over time, even sceptical executives see the wisdom in the DPO’s perspective because it saves them from compliance disasters and builds a robust reputation. Stuart’s own integrity sets the tone: he often says his goal is to “deliver on promises and hold himself accountable,” and he expects the same of each team member. We also ensure there are no conflicts of interest in our role: for example, if we act as DPO, we do not perform conflicting consulting work that would undermine our impartiality. XpertDPO’s outsourced DPO service guarantees an independent DPO with no conflict of interest for our clients, a strong advantage over assigning someone internally who wears multiple hats.

Outsourced DPO Services vs. Specialist DPO Support

Depending on an organisation’s needs, XpertDPO offers two primary service models that anchor to the DPO role: Outsourced DPO services and Specialist DPO support for in-house teams. Understanding these options can help in choosing the right approach to data protection leadership.

  • Outsourced DPO Services: This is a full-service solution where our team acts as your organisation’s named DPO, fulfilling all the tasks and responsibilities that an internal DPO would, from answering employee questions to representing the company to regulators. Outsourcing is ideal for organisations that need expert DPO oversight but do not have a qualified full-time DPO in-house. This could be a small or medium enterprise that can’t justify a senior hire, or a large entity in a high-risk sector that prefers an external specialist for independence. XpertDPO has developed tiers of outsourced DPO service to fit different contexts. For example, XpertDPO Shield is our strategic full-service DPO offering for regulated or high-risk organisations, whereas XpertDPO Assist provides a fractional DPO service tailored for SMEs that need flexibility. In both cases, clients get direct access to experienced professionals who integrate with their team. As our proposal documents state, “Our service is a practical and cost-effective solution for organisations lacking the requisite expertise to fulfil their DPO duties under the GDPR and other relevant laws across the globe”. When you engage an outsourced DPO from XpertDPO, you are tapping into a team of experts, not just one person. This means if a particular issue arises (say, a complex cybersecurity question under DORA, or a bespoke contract clause for data transfers), our DPO can draw on in-house specialists with that knowledge. It’s like having a whole compliance department at your service. Indeed, one of the benefits our clients value is that our team’s combined expertise is “far wider and deeper than would be possible for a smaller, less experienced team to maintain”. We also bring proven templates and tools (for policies, RoPAs, training, etc.) so that the organisation doesn’t have to reinvent the wheel. And because the outsourced DPO is delivered as a service, clients avoid the overhead costs of a full-time hire (salary, continuous training, benefits) while still getting top-tier support. Fundamentally, our outsourced DPO services allow a company to have an independent, highly qualified DPO at a fraction of the cost of an internal appointment, often providing “greater value for this key role than is possible from a single internal employee”. By outsourcing, organisations can immediately elevate their compliance posture and demonstrate to stakeholders that privacy is being handled by seasoned pros.
  • Specialist DPO Support (for In-House DPOs): Not every organisation wants to fully outsource the DPO role; many have appointed an internal DPO or point person (like a Chief Privacy Officer or a legal/compliance manager wearing the DPO hat). For these clients, XpertDPO offers DPO Support services, which effectively act as a “DPO’s DPO.” This is an on-demand advisory service where our experts back up and reinforce your in-house DPO. Think of it as giving your internal DPO access to a bench of mentors and specialists they can lean on. Even the most experienced solo DPOs benefit from a sounding board, because no one person can know everything or have infinite time. Our support covers things like: reviewing tricky DPIAs, advising on novel issues (e.g. how to handle an AI deployment or a cross-border data transfer challenge), assisting with regulator correspondence or audit preparation, providing breach response help, and doing documentation or policy reviews as needed. Crucially, this support is confidential and maintains the DPO’s independence: we operate behind the scenes to empower the DPO, not to override them. In fact, one selling point is that we help strengthen the DPO’s hand internally. For example, if an in-house DPO is facing pushback from a department about a recommendation, they can call on our independent analysis to validate their stance; sometimes just having an external expert confirm the risk assessment can persuade management. As our service description puts it, “Support isn’t interference. Our role is advisory; we help you protect your independence by offering neutral, well-reasoned input when internal pressure builds.” We also provide practical aids like templates, checklists, and access to our training modules so the in-house DPO can continuously upskill. This model is highly flexible: some clients engage us on a retainer for a set number of hours each month, others on a project basis (e.g. help with a major GDPR re-audit), and some just for ad-hoc second opinions. The overarching goal is to ensure the internal DPO is never left isolated or overwhelmed. They don’t have to guess at solutions in a vacuum or feel alone facing a regulator; we’re essentially a safety net and a knowledge reservoir. Our tagline “We’re the DPO for your DPO” truly encapsulates it. By choosing specialist DPO support, organisations get the best of both worlds: their internal person remains the accountable figure (maintaining continuity and internal ownership), but that person is bolstered by expertise that keeps them at the cutting edge of compliance expectations.

Both outsourced DPO and support services exemplify XpertDPO’s leadership in DPO delivery. We are among the few providers that have a dedicated support model for in-house DPOs, recognising that even companies with a privacy office can need external perspective to tackle complex challenges. And for those outsourcing entirely, our ability to plug into the client’s environment and act as a trusted insider, while maintaining an outsider’s objectivity, sets us apart. It’s worth noting that XpertDPO’s approach has been honed over hundreds of engagements and diverse sectors, meaning we come prepared with insight into best practices across industries. Our team’s collective 135+ years of specialist expertise and experience with 300+ organisations worldwide give us unmatched depth. When clients partner with us, they aren’t just meeting a compliance requirement, they are gaining a competitive edge in data protection. As one client’s Head of Legal put it, “With XpertDPO behind us, our risk is lower, our workload is lighter, and our board is confident. That’s real value.”

Conclusion

The role of the Data Protection Officer has transformed from a compliance cost-centre into a strategic asset for organisations. In a world of escalating data risks, rapid regulatory change, and increased public scrutiny, the DPO stands at the intersection of law, ethics, and technology, guiding companies to not only comply, but to do so in a way that earns trust and builds value. From maintaining meticulous records and guarding individuals’ rights, to steering AI governance and cybersecurity alignment, today’s DPO is both navigator and enabler. They ensure that innovation can proceed with privacy safeguards, and that an organisation’s pursuit of data-driven growth never loses sight of fundamental rights and freedoms.

XpertDPO is proud to be at the forefront of this field, delivering DPO services and support that exemplify expertise and ethical practice. We view ourselves as partners in our clients’ success: when our clients can confidently say their data is handled with integrity, that’s our success too. By embracing Stuart’s values of trust, transparency, education, and sustainable compliance, we’ve helped organisations large and small transform regulatory requirements into opportunities for improvement. Our Outsourced DPO Services provide organisations with immediate access to seasoned leadership in data protection, while our Specialist DPO Support empowers in-house privacy officers to excel and stay ahead of emerging challenges. In both cases, our approach is deeply pragmatic and human-centred: we don’t deal in scare tactics or one-size-fits-all templates, but in tailored solutions and honest counsel grounded in real-world experience.

Ultimately, the DPO’s role is about safeguarding trust, the trust of customers, employees, and the public that their information is respected and safe. In the coming years, as regulations like the AI Act and NIS2 take effect and new technologies continue to emerge, the DPO will be even more pivotal in organisations’ governance frameworks. Those that invest in a strong DPO function, through capable people and the right support, will be well-positioned to thrive in this environment. They will be the ones who can confidently innovate and collaborate, knowing their compliance foundation is sound. As this article has illustrated, the DPO is far more than a compliance checkbox; done right, it’s a role that integrates legal insight, ethical oversight, and strategic vision. That is the role XpertDPO has championed since our inception, and the role we continue to elevate through our services.

If your organisation is looking to strengthen its data protection leadership, consider how an outsourced DPO or specialist support from XpertDPO could provide the expertise, clarity, and confidence you need. By anchoring your compliance efforts to experienced guidance, you not only meet your obligations, you build a culture of privacy and trust that underpins long-term success. In data protection, as in business, knowledgeable guidance and principled action make all the difference. With the right DPO partner, you can navigate whatever lies ahead, secure in the knowledge that your compliance journey is on a sustainable, ethical path.

Understanding Minimal and Limited Risk under the EU AI Act

A Practical Guide for DPOs and In-House Legal Teams

Artificial intelligence has quietly become part of everyday work. From productivity assistants to document summaries and email suggestions, most organisations already use AI without realising it. These technologies bring efficiency, but they also raise important questions for data protection and compliance professionals: how do you manage accountability, explainability, and transparency without overburdening your governance processes?

The EU AI Act offers a clear framework for doing exactly that. It classifies AI systems according to the level of risk they pose to people’s rights or safety, distinguishing between unacceptable-risk, high-risk, and certain limited-risk systems. The term ‘minimal or no risk’ is commonly used to describe AI systems that fall outside the Act’s specific obligations. For most organisations, the focus will be on the last two categories, which cover the vast majority of AI systems in day-to-day use: tools that enhance productivity rather than make high-stakes decisions.

This article explains what minimal and limited risk systems are, what the EU AI Act expects of organisations that use them, and how DPOs and legal teams can embed proportionate AI governance into existing compliance frameworks.

Understanding the EU AI Act’s Risk Approach

The Act’s design is rooted in proportionality. It does not impose heavy regulation on every AI tool. Instead, it scales obligations according to the potential impact on people.

  • Unacceptable risk systems are banned altogether. These include manipulative or exploitative AI such as social scoring or subliminal techniques.
  • High risk systems are strictly regulated and typically found in sectors such as health, employment, education, credit scoring, or law enforcement.
  • Limited risk systems require transparency measures so that users know when they are engaging with AI.
  • Minimal risk systems carry no specific legal obligations beyond general laws such as the GDPR and consumer protection rules.

This tiered approach is important because it allows innovation to continue while protecting fundamental rights. For most DPOs and legal teams, the challenge is to translate those tiers into practical governance actions that fit within existing processes rather than duplicating them.
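
One simple way to make that translation explicit is to pair each tier with a proportionate internal response, as in the sketch below; the action wording is our own summary, not text from the Act.

    # Illustrative mapping from EU AI Act risk tiers to proportionate governance actions.
    # The action descriptions are simplified summaries, not wording from the Act itself.
    GOVERNANCE_ACTIONS = {
        "unacceptable": "Do not deploy; the practice is prohibited.",
        "high": "Full risk assessment, conformity and oversight controls, detailed records.",
        "limited": "Apply transparency measures (AI notices, content labelling) and register the system.",
        "minimal": "Record the system and its classification rationale; review periodically.",
    }

    def governance_action(risk_tier: str) -> str:
        return GOVERNANCE_ACTIONS.get(
            risk_tier.lower(),
            "Unknown tier: escalate to the DPO for classification.",
        )

    print(governance_action("limited"))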

Why Minimal and Limited Risk Matter Strategically

It would be easy to treat these categories as purely technical or compliance-driven. In reality, they sit at the heart of strategic governance.

Accurately classifying AI systems defines how organisations can innovate safely. It helps determine when a full risk assessment is required, when lighter documentation will suffice, and how to prioritise oversight. More importantly, it demonstrates to regulators, partners, and customers that the organisation understands its responsibilities and has a defensible approach to accountability.

This is not simply about avoiding fines. Clear classification also builds trust and confidence among staff and clients. When people know how and why AI is being used, the organisation’s reputation for transparency and ethical practice grows stronger. That is particularly valuable in markets where trust is a differentiator, such as healthcare, finance, and technology.

Minimal Risk AI: Low Impact, High Accountability

Minimal risk AI refers to systems that present little or no potential to affect individuals’ rights or safety. They typically automate small, low-stakes tasks, often in the background, and do not involve profiling or decision-making.

Common examples include:

  • Grammar and spelling assistants.
  • Autocomplete and predictive text.
  • Search or document retrieval tools.
  • Spam filters and simple categorisation algorithms.

The EU AI Act imposes no direct obligations on these systems, but good governance remains essential. Accountability underpins both the Act and the GDPR. DPOs should ensure that minimal risk systems are visible in governance registers and can be explained if questions arise.

Practical steps for managing minimal risk AI:

  • Keep a short internal note in your DPIA or processing record identifying the system, its function, and your rationale for minimal risk classification.
  • Record the supplier, model version, and location of any data processed.
  • Periodically review the system to ensure that its functionality has not evolved into areas such as profiling or analytics that might alter the risk level.

Minimal risk does not mean no oversight. A one-page record of your reasoning is often enough to show accountability, and it is also a valuable signal of organisational maturity.
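
As an illustration, that one-page record could be as simple as the structure sketched below; the field names and the example system are hypothetical.

    from dataclasses import dataclass
    from datetime import date

    # Sketch of a short accountability note for a minimal-risk AI tool. The field names
    # and example values are hypothetical; keep the equivalent record in your RoPA or DPIA.
    @dataclass
    class MinimalRiskAINote:
        system: str
        function: str
        supplier: str
        model_version: str
        data_location: str
        classification_rationale: str
        next_review: date

    note = MinimalRiskAINote(
        system="Email spell-check assistant",
        function="Suggests spelling and grammar corrections in drafts",
        supplier="Example vendor (hypothetical)",
        model_version="2025.1",
        data_location="EU data centre, per supplier documentation",
        classification_rationale="No profiling or decision-making; no effect on individuals' rights",
        next_review=date(2026, 3, 1),
    )
    print(note)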

The difference between minimal and limited risk is not about technical complexity but about human impact. Once AI begins interacting with people or generating information that could shape perceptions or decisions, transparency becomes the key dividing line.

Limited Risk AI: Where Transparency Becomes the Safeguard

Limited risk systems are those that interact with users directly or generate synthetic content but are not used in sensitive or high-risk contexts. Their primary risk lies in misunderstanding, in that people may not realise they are engaging with AI or may over-rely on outputs.

Examples include:

  • Chatbots and virtual assistants.
  • Tools that generate text, audio, or images.
  • Meeting transcription or summarisation services.
  • Productivity assistants that draft, summarise, or recommend actions.

For limited risk AI, the EU AI Act focuses on transparency. Article 50 of the Act sets out transparency obligations for certain AI systems, such as chatbots, emotion recognition systems, and systems generating synthetic content. Users must:

  • Be clearly informed that they are interacting with AI.
  • Be informed when content has been generated or manipulated by AI.
  • Be able to identify AI-generated or synthetic content through clear labels or notices.
  • Understand the capabilities and limitations of the system.

The goal is not to stop organisations using these tools, but to make sure people know when AI is at work and can interpret its outputs appropriately.

Practical steps include:

  • Maintain a register of all limited risk AI systems with notes on their transparency measures.
  • Confirm that user interfaces display clear AI notices or indicators.
  • Keep vendor documentation that demonstrates compliance with the EU AI Act’s transparency articles.
  • Incorporate transparency records into your DPIA or a dedicated AI governance appendix.

Transparency is the safeguard for limited risk AI. When users understand when AI is involved, how it works, and what it cannot do, most of the compliance risk disappears.
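
For example, a register entry for a limited risk system might record its transparency measures alongside the classification and any linked DPIA; the structure and example below are illustrative assumptions.

    # Illustrative register entry for a limited-risk AI system and its transparency measures.
    limited_risk_register = [
        {
            "system": "Customer support chatbot",
            "risk_tier": "limited",
            "transparency_measures": [
                "Chat window states 'You are chatting with an AI assistant'",
                "Escalation path to a human agent is always available",
                "AI-generated summaries are labelled before being sent to customers",
            ],
            "vendor_documentation": "Supplier transparency statement on file (reviewed May 2025)",
            "linked_dpia": "DPIA-2025-03 (customer service platform)",
        },
    ]

    # Flag any entries whose transparency measures have not yet been recorded.
    missing = [e["system"] for e in limited_risk_register if not e["transparency_measures"]]
    print("Systems missing transparency measures:", missing or "none")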

A Practical Example: Microsoft 365 Copilot

Microsoft 365 Copilot illustrates limited risk AI in action. When used for general productivity tasks it would typically fall within the limited-risk category, but the classification may change depending on context (for example, use in HR decision-making could raise the risk level). It operates inside familiar tools such as Word, Outlook, Excel, and Teams, using the organisation’s existing data. Copilot does not create a new dataset, but it changes how that data is accessed and used.

DPOs can approach Copilot systematically:

  1. Map the data flow. Identify what sources Copilot draws from. Most will already be governed under GDPR.
  2. Determine the risk tier. Copilot’s summarisation and drafting features fall within the limited risk category.
  3. Ensure transparency. Provide staff training and internal guidance making it clear that Copilot uses AI and that outputs require human review.
  4. Verify supplier compliance. Keep copies of Microsoft’s documentation on Copilot’s AI model, transparency commitments, and security measures.
  5. Reassess periodically. If Copilot is later used in HR or other decision-making contexts, review its classification, treat it as high risk where applicable, and expand governance accordingly.

Copilot is a good example of how limited risk AI sits inside existing compliance frameworks. The AI layer does not replace GDPR obligations; it adds a transparency layer on top.

Managing Vendors and Third-Party AI

AI governance does not end with in-house systems. Third-party vendors and cloud providers are increasingly embedding AI functionality into standard software packages. DPOs need to know what these systems are doing and how they fit into the organisation’s risk profile.

Practical supplier governance steps include:

  • Updating vendor due diligence questionnaires to include AI-specific questions.
  • Requiring suppliers to disclose whether their systems use AI and, if so, how they classify it under the EU AI Act.
  • Ensuring contracts contain obligations for transparency and notification of material changes in functionality.
  • Reviewing third-party privacy notices to check alignment with your organisation’s transparency commitments.

This supplier awareness is critical because many limited risk systems will enter the organisation indirectly through updates or integrated features. A question as simple as “Does this system now use AI?” should become part of routine vendor management.
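As a lightweight sketch of how that routine question could be built into vendor management, the example below flags questionnaire responses that warrant AI-specific follow-up. The field names are illustrative assumptions, not part of any standard due diligence schema.

```python
def needs_ai_follow_up(questionnaire: dict) -> bool:
    """Return True if a supplier's due diligence answers warrant AI-specific review.

    The keys below are illustrative questionnaire fields, not a standard schema.
    """
    uses_ai = questionnaire.get("uses_ai", False)
    ai_added_in_update = questionnaire.get("ai_added_in_update", False)
    classification_provided = questionnaire.get("eu_ai_act_classification") is not None

    # Follow up if the product uses AI (or gained AI features in an update)
    # and the supplier has not yet explained how they classify it.
    return (uses_ai or ai_added_in_update) and not classification_provided


# Example usage with an illustrative supplier response
response = {"uses_ai": True, "ai_added_in_update": False, "eu_ai_act_classification": None}
if needs_ai_follow_up(response):
    print("Ask supplier for EU AI Act classification and transparency documentation")
```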

Combining AI Risk Assessment with DPIAs

AI risk assessments and GDPR DPIAs often apply to the same technology. Running them separately wastes time and risks inconsistency. A combined assessment provides a single, coherent record of compliance.

A practical two-in-one approach looks like this:

  1. Begin with your existing DPIA template.
  2. Add an AI section that determines the system’s risk tier under the EU AI Act.
  3. Cross-reference overlapping controls, such as fairness, accuracy, and human oversight.
  4. Record your rationale for classification and any transparency measures applied.

This combined model makes your documentation more efficient and defensible. It also shows regulators that the organisation is integrating AI governance within established privacy processes rather than treating it as a siloed exercise.

You do not need separate compliance tracks for AI and data protection. A single integrated DPIA with an AI addendum provides a clear, practical, and efficient approach to governance.
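To illustrate the shape of such a combined record, the sketch below shows a DPIA entry with an AI addendum attached. The structure and field names are assumptions for illustration, not a mandated template.

```python
# Illustrative structure for a DPIA with an AI addendum (not a mandated template)
dpia_with_ai_addendum = {
    "dpia": {
        "processing_activity": "Meeting transcription for internal minutes",
        "lawful_basis": "Legitimate interests (Article 6(1)(f))",
        "risks_and_mitigations": [
            "Access limited to meeting participants",
            "Retention limited to 12 months",
        ],
    },
    "ai_addendum": {
        "eu_ai_act_risk_tier": "limited",
        "classification_rationale": "Interacts with users and generates content, but is not used in a high-risk context",
        "transparency_measures": [
            "Staff informed that transcripts are AI-generated",
            "Outputs labelled as requiring human review",
        ],
        # Controls that overlap with the DPIA, cross-referenced rather than duplicated
        "cross_referenced_controls": ["accuracy", "fairness", "human oversight"],
        "next_review": "2026-08-02",
    },
}
```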

Building a Culture of Transparency and Awareness

AI compliance is not just a technical task. It depends on awareness across the organisation. Many risks arise not from deliberate misuse but from lack of understanding about where AI is operating.

DPOs can help by:

  • Training staff to recognise when systems might use AI and how to disclose it.
  • Including AI awareness in induction and refresher compliance training.
  • Providing a clear reporting route for staff who introduce new AI tools or discover them within existing systems.
  • Encouraging open discussion about AI ethics and bias without creating a culture of fear.

A culture of awareness ensures that AI deployments are surfaced early, documented properly, and reviewed for transparency obligations before they create regulatory problems.

The Case for Public AI Transparency Policies

Every organisation using AI should have a concise AI Transparency Policy available to the public. Publishing one is not required by the EU AI Act, but it is good practice for accountability and public trust: it communicates the organisation’s position, shows leadership, and makes responsible practice visible.

A strong policy should:

  • Outline what types of AI systems are used and for what purpose.
  • Describe how each category is governed and classified under the EU AI Act.
  • Explain how transparency and fairness are maintained.
  • Provide a contact route for questions or concerns.

For user-facing services, an AI indicator icon or a short disclosure note linking directly to the policy can make transparency tangible. Like cookie banners and privacy notices, it should ideally be short, accessible, and visible.

Transparency builds confidence. A clear policy and visible AI indicator show that the organisation is proud of its responsible practices rather than hiding them in small print.

Questions to Ask in Governance and Board Meetings

Board and compliance meetings are where accountability becomes visible. Directors and senior managers do not need to be AI experts, but they should know how to ask the right questions. These conversations build oversight and reinforce the organisation’s duty to monitor risk.

Useful questions include:

  • Do we have a current and published AI Transparency Policy?
  • Is there an AI systems register, and who maintains it?
  • What models or third-party tools are currently in use across our environment?
  • Do we ask suppliers to confirm whether their products include AI components or use third-party models?
  • Have our DPIA templates been updated to include AI classification and transparency checks?
  • Who reviews risk classifications and re-evaluates systems as they evolve?
  • How do we communicate AI use internally to staff and externally to clients or regulators?

Governance is not about knowing every detail of how AI works. It is about asking questions that reveal whether proper control and understanding are in place.

Regularly reviewing these questions in board meetings keeps AI governance aligned with other corporate risks. It also creates an audit trail showing active oversight, which is a powerful indicator of accountability.

Roles and Accountability in AI Oversight

AI governance often sits across several functions. DPOs manage data protection, CISOs handle security, legal teams address contractual risk, and IT manages deployment. For many organisations, the best approach is to establish a cross-functional AI governance group.

This group should:

  • Meet periodically to review the AI systems register and any new implementations.
  • Ensure consistent interpretation of risk classification.
  • Align AI oversight with broader risk frameworks such as ISO 27001, NIST AI RMF, or internal ethics committees.
  • Report key findings to senior management and the board.

A shared model of accountability prevents gaps and ensures that AI risks are addressed from both ethical and operational perspectives.

Looking Ahead: The Future of AI Governance

The EU AI Act is the first comprehensive AI regulation, but it will not be the last. Global frameworks are converging. The NIST AI Risk Management Framework, OECD principles, and upcoming UK AI Assurance Guidance all reinforce similar ideas: risk-based classification, transparency, human oversight, and accountability.

Organisations that build governance structures now, even for minimal and limited risk AI, will be well positioned as new standards evolve. The European Commission’s AI Office, which oversees implementation of the Act, is likely to emphasise documentation and transparency as core indicators of compliance maturity.

Future audits may ask to see your AI systems register, transparency policy, and evidence of staff awareness. Starting small, with minimal and limited risk systems, ensures that governance habits are already in place when oversight becomes more formal.

Bringing It All Together

The EU AI Act provides an opportunity, not a burden. For most organisations, compliance will not mean complex technical changes, but thoughtful governance and clear communication. The EU AI Act entered into force on 1 August 2024, with most obligations, including transparency rules for limited-risk AI, becoming applicable from 2 August 2026.

By classifying systems accurately, integrating AI risk assessment into DPIAs, maintaining a public transparency policy, managing supplier disclosures, and embedding awareness at all levels, DPOs and legal teams can meet the requirements confidently.

Minimal and limited risk AI may seem low on the regulatory ladder, but they represent the foundation of responsible AI use. Transparent documentation, consistent oversight, and honest communication will not only meet compliance expectations but also strengthen trust with clients, employees, and regulators alike.

Compliance done the right way is not about doing everything; it is about doing the right things properly, documenting them clearly, and being open about how technology is used. That is how ethical organisations turn regulation into a mark of integrity.

Data Protection Requirements in Clinical Trials

Safeguarding Data Protection and Privacy in Research: Data Protection Impact Assessments and the Clinical Trials Landscape

Clinical trials form the cornerstone of biomedical progress. They provide the evidence base for new therapies, diagnostics, and medical devices, all while involving some of the most sensitive categories of personal data. In an era of increasingly decentralised studies, complex data flows, and cross-border collaboration, the governance of personal data in clinical research has become as vital as the scientific protocols themselves. This reality places data protection, and in particular the requirement to conduct Data Protection Impact Assessments (DPIAs), at the heart of ethically and legally robust trials.

Across the European Union and European Economic Area, the General Data Protection Regulation (GDPR) sets a clear expectation: where processing is likely to result in a high risk to individuals’ rights and freedoms, a DPIA is not merely advisable — it is mandatory. The processing of special category data, such as health-related information, triggers heightened scrutiny. In clinical trials, this scrutiny is more than procedural. It touches on participant autonomy, data sovereignty, and the fundamental trust between the research community and society.

This article explores the DPIA obligation in the context of clinical trials, drawing from authoritative guidance developed by Ireland’s National Clinical Trials Oversight Group (NCTOG) and supported by the Irish Data Protection Commission (DPC). It situates these responsibilities within the broader framework of EU data protection law, while also reflecting the operational realities faced by sponsors, investigators, ethics committees, and Data Protection Officers (DPOs).

A Regulatory Imperative, Not a Formality

At its core, a DPIA is a structured process that enables organisations to identify, assess, and mitigate risks associated with personal data processing. It embodies the GDPR’s principle of accountability and operationalises the concept of privacy by design. While DPIAs may take different formats depending on the nature and scale of processing, their objective remains consistent: to anticipate data protection risks before they materialise, and to document the rationale behind the chosen safeguards.

Clinical trials typically involve the systematic collection and analysis of data concerning participants’ health, genetic information, lifestyle, and sometimes even biometric or behavioural data. The processing often occurs over extended periods, involves multiple entities across jurisdictions, and uses advanced technologies such as electronic data capture systems, cloud-based trial management platforms, and artificial intelligence tools for statistical analysis or remote monitoring. Each of these dimensions amplifies the potential risk to data subjects.

Under Article 35(3) of the GDPR, a DPIA is required in situations involving the large-scale processing of special category data or systematic monitoring of individuals. These criteria are routinely met in the design and conduct of clinical trials. It is therefore essential for sponsors and sites to treat the DPIA not as a tick-box requirement, but as an embedded part of the trial planning process.

Defining Roles: Controllers, Processors and Joint Arrangements

A fundamental step in assigning DPIA responsibility is determining the role of each participating organisation. The GDPR distinguishes between data controllers, who determine the purposes and means of processing, and data processors, who act on a controller’s documented instructions.

In the clinical trial domain, the sponsor is frequently the entity that defines the protocol, determines the data that will be collected, and decides how it will be analysed. In such cases, the sponsor is clearly acting as a data controller. If the trial site, typically a hospital or academic institution, simply follows the sponsor’s protocol and manages data on the sponsor’s behalf, it functions as a processor.

However, not all arrangements are so straightforward. Increasingly, trial sites participate in protocol design, select subsets of data to retain locally, or use the data for secondary research. Where decision-making around data processing is shared, the sponsor and site may be deemed joint controllers under GDPR (Art. 26). This designation carries specific obligations, including the need for a transparent joint controller agreement and a clear delineation of responsibilities toward data subjects.

In both the controller–processor and joint controller scenarios, the responsibility for conducting a DPIA lies with those determining the purposes and means of processing. Where roles are shared, the parties must reach a practical and lawful arrangement for completing the DPIA. The NCTOG guidance confirms that local hospital DPOs and ethics committees are not responsible for the DPIA, although they may have supporting roles or be consulted during the process.

 

| Responsibility / Factor | Sponsor as Controller | Trial Site as Processor | Joint Controllers (Sponsor & Site) | Independent Controllers |
|---|---|---|---|---|
| Determines purposes and means of processing | ✔️ | — | ✔️ (jointly) | ✔️ (separately) |
| Initiates and conducts DPIA | ✔️ | — | ✔️ (collaborative or delegated) | ✔️ (each independently) |
| Primary accountability under GDPR | ✔️ | — | ✔️ (shared) | ✔️ (individual) |
| Requires joint controller or processor agreement | ✔️ (Processor Agreement) | ✔️ | ✔️ (Joint Controller Agreement) | — |
| Consults with DPO before trial begins | ✔️ | — | ✔️ (both) | ✔️ (each separately) |
| Handles data subject rights | ✔️ | ❌ (unless instructed) | ✔️ (must coordinate) | ✔️ (each controller) |
| Provides data protection notice | ✔️ | — | ✔️ (joint or coordinated) | ✔️ (individually) |
| Defines legal basis and mitigates risk | ✔️ | — | ✔️ (shared or divided) | ✔️ (each independently) |

 

The Ethics Committee Is Not the DPO

One of the more persistent misconceptions in the clinical trial landscape is the belief that ethics committee approval substitutes for a DPIA. This confusion stems from the fact that both processes occur early in the study lifecycle and are designed to safeguard participants’ interests. However, they are fundamentally distinct.

An ethics committee evaluates the clinical rationale, safety considerations, and integrity of the informed consent process. It assesses whether the proposed research design is proportionate, scientifically valid, and ethically sound. Data protection may be mentioned, but it is not the central focus. In contrast, a DPIA scrutinises the data processing elements of the project. It examines the lawfulness of processing, the compatibility of purposes, data minimisation strategies, storage limitations, security measures, and the extent to which data subjects can exercise their rights.

The GDPR is explicit on this point. DPIA obligations exist independently of other sector-specific approvals. Ethics committees are not tasked with reviewing DPIAs, and a trial may require additional safeguards beyond those imposed by ethics boards. The distinction must be respected to ensure that data protection responsibilities are not overlooked or fragmented.

DPIAs in Practice: Timing, Consultation, and Iteration

A well-conducted DPIA begins well before the first participant is enrolled. It should form part of the initial feasibility and risk assessment stages of the trial, when data flows are being designed and operational partners are being selected. Delaying the DPIA until after key decisions are made diminishes its value and can expose the sponsor to unnecessary regulatory risk.

The GDPR encourages the consultation of a DPO where one has been appointed. In clinical research, this consultation is not only legally prudent but practically beneficial. DPOs can bring critical insights regarding data retention schedules, international transfers, lawful bases for processing under both Articles 6 and 9, and mechanisms for handling data subject rights. Where multiple jurisdictions are involved, local DPOs or legal experts may be consulted to address national derogations or ethics frameworks.

The DPIA should not be treated as a static document. Clinical trials often evolve through protocol amendments, new study arms, or technology upgrades. Each of these changes may affect the data processing landscape. Sponsors should revisit and, where necessary, revise their DPIAs in response to these developments. This iterative approach aligns with the accountability principle and positions the DPIA as a living instrument rather than a bureaucratic artefact.

Distinguishing Medical Consent from GDPR Consent

In the context of clinical trials, the concept of “consent” carries distinct legal and ethical meanings depending on the framework in which it is applied. One of the most frequent sources of confusion, both among research professionals and participants, is the assumption that informed medical consent automatically satisfies the requirements for valid consent under data protection law. However, this is not the case.

Medical or clinical consent relates to a person’s agreement to participate in a clinical trial or medical intervention. It is governed by ethical and clinical standards, typically overseen by ethics committees and national legislation. This form of consent ensures that participants understand the purpose, procedures, potential risks, and benefits of the study, and that their decision to participate is voluntary, informed, and free from coercion.

By contrast, GDPR consent is one of several legal bases available for processing personal data under Article 6 of the General Data Protection Regulation. When special category data such as health information is involved, as it nearly always is in clinical trials, Article 9 also applies, requiring a separate condition to legitimise processing. GDPR consent is defined by a strict set of criteria: it must be freely given, specific, informed, unambiguous, and capable of being withdrawn at any time, without detriment.

These differences have practical consequences. While informed consent is ethically indispensable for trial participation, it may not always be the appropriate or reliable legal basis for processing personal data under GDPR. This is especially true in scenarios where the data processing is essential to comply with legal obligations, to perform a task in the public interest, or to fulfil the sponsor’s legitimate interests, provided that such interests are not overridden by the rights and freedoms of the participant.

Moreover, GDPR consent must be separable from clinical consent. Participants must be able to decline or withdraw their data processing consent without necessarily withdrawing from the trial itself, which is not always feasible in practice. As a result, many sponsors and ethics boards prefer to rely on alternative lawful bases such as public interest in the area of public health or scientific research purposes under Article 9(2)(j), supported by appropriate safeguards such as pseudonymisation, data minimisation, and robust governance controls.

Ultimately, it is crucial to treat medical and data protection consents as distinct instruments serving different legal and ethical purposes. DPIAs offer a valuable opportunity to document this distinction, justify the choice of lawful basis for data processing, and ensure that participant-facing materials clearly explain the difference. This approach not only enhances compliance but also reinforces transparency and respect for the individuals at the heart of the research.

Documentation, Transparency and Responding to Challenges

The value of a DPIA lies not only in its risk analysis but also in its documentation. Regulatory authorities may request evidence that the DPIA was completed and that appropriate mitigation measures were implemented. In high-risk cases where the DPIA indicates that the processing would still result in significant residual risks, the controller must consult the relevant supervisory authority before proceeding. While such consultations are rare in clinical trials, sponsors must be prepared to demonstrate that they considered the option if applicable.

Transparency is equally important. While the DPIA itself is not typically published, its outcomes may be summarised in participant information leaflets or data protection notices. These summaries should strike a balance between accessibility and accuracy, enabling participants to understand how their data will be used, protected, and governed.

Responding to data subject requests, whether for access, rectification, or objection, is another area where the DPIA can prove useful. It should outline the procedures for managing such requests, especially where joint controller arrangements are in place. Clarity on responsibilities can help avoid delays and ensure consistent communication with participants.

Supervisory Oversight: Ireland’s DPC and Broader EU Implications

The NCTOG guidance, reviewed and approved by Ireland’s Data Protection Commission, offers a structured and practical interpretation of DPIA responsibilities in clinical trials. While it reflects the Irish regulatory environment, its core principles are aligned with guidance from the European Data Protection Board (EDPB) and are applicable across the EU.

Sponsors operating multinational trials should be alert to national variations in ethics oversight, data protection enforcement, and health legislation. Some Member States impose additional conditions on processing health data, particularly in the context of public health or scientific research. These conditions may affect the DPIA content or consultation processes. Engaging with local DPOs and legal counsel is therefore essential in cross-border settings.

From a regulatory risk perspective, supervisory authorities increasingly expect organisations to demonstrate not only formal compliance but substantive accountability. A DPIA that is generic, outdated, or disconnected from operational practice will not withstand scrutiny. Conversely, a well-reasoned and evidence-based DPIA can serve as a shield in the event of complaints or inspections.

Looking Ahead: Embedding DPIAs in Research Culture

The ultimate goal of data protection law is not to obstruct research but to enable it in a way that respects the dignity and autonomy of individuals. In this sense, DPIAs are not a burden but a tool of empowerment. They prompt researchers to consider the ethical and legal dimensions of data use at every stage of the trial. They foster interdisciplinary collaboration between scientific, legal, and technical teams. They provide transparency and reassurance to participants who entrust their data to the research enterprise.

For sponsors and investigators, this means moving beyond minimal compliance and toward a culture of proactive privacy management. For DPOs, it means engaging with research teams early and often, providing pragmatic advice that supports both innovation and data protection. For oversight bodies and ethics committees, it means clarifying their respective roles and encouraging alignment across governance processes.

As the clinical trials landscape becomes more digital, decentralised, and data-driven, the importance of DPIAs will only grow. By investing in robust, context-sensitive DPIAs, the research community can strengthen its social license, mitigate legal risks, and uphold the foundational values of trust, transparency, and respect.

Who We Help: Data Protection & Cybersecurity Services Across Key Sectors

At XpertDPO, we partner with organisations across a diverse range of industries to help them achieve resilient compliance, protect personal data, and build operational resilience in line with GDPR, EU AI Act, NIS2, DORA, and evolving cybersecurity frameworks. Our sector-specific knowledge ensures practical, risk-based solutions, whether you’re a small charity, a fintech scale-up, or a public body under regulatory scrutiny.

From financial services and healthcare to education, technology, and public sector organisations, we tailor our solutions to address industry-specific risks, data protection requirements, and cybersecurity threats. Our experience spans highly regulated sectors, ensuring businesses remain resilient, compliant, and well-prepared for evolving data protection laws.

Whether you need outsourced DPO support, regulatory audit assistance, or data security guidance, XpertDPO delivers pragmatic, effective solutions through qualified, seasoned data protection officers to help you navigate compliance with confidence.

Healthcare

  • Key Focus: Patient data security, GDPR compliance, handling sensitive health records.
  • Challenges: Implementing AI innovations while maintaining GDPR compliance for patient records, managing Subject Access Requests (SARs), mitigating data breaches.
  • How We Help: XpertDPO provides AI and GDPR consultancy, SAR support, and outsourced DPO services for healthcare providers.

Why Data Protection Matters in Healthcare:

The healthcare sector processes highly sensitive patient data, making compliance with jurisdictional regulations such as GDPR, HIPAA (for US-linked entities), and NIS2 crucial for data security, patient confidentiality, and regulatory oversight.

Key Data Protection challenges in Healthcare:

  • Strict Regulatory Requirements: Compliance with GDPR, national health data laws, and cybersecurity directives.
  • Cybersecurity Threats: Increased risk of ransomware attacks and data breaches affecting patient records.
  • Data Sharing & Consent Management: Handling electronic health records (EHRs) and cross-border data transfers.
  • Incident Response & Reporting: Managing breach notification obligations within tight regulatory timeframes.

How We Help Healthcare Organisations with Data Protection:

XpertDPO supports healthcare organisations with GDPR and AI compliance frameworks, DPIAs, cybersecurity risk management, and breach response strategies. Our expertise ensures secure patient data handling, regulatory adherence, and enhanced resilience against cyber threats.

Public Sector

  • Key Focus: Compliance with GDPR and NIS2 for government and public institutions.
  • Challenges: Protecting citizen data, managing regulatory reporting requirements, handling Subject Access Requests (SARs).
  • How We Help: Outsourced DPO services, GDPR audits, and compliance support for government bodies.

Why Data Protection Matters:

Government agencies process citizen data, making compliance with GDPR, NIS2, and national cybersecurity laws essential to prevent data breaches and ensure public trust.

Key Challenges:

  • Strict Data Security Requirements: Meeting GDPR and national security regulations.
  • Cyber Threats & Ransomware Attacks: Government agencies face increasing cyber risks.
  • Handling Public Data Requests: Managing DSARs and Freedom of Information (FOI) requests securely.
  • Cross-Agency Data Sharing Risks: Ensuring lawful, secure data exchanges between departments.

How We Help:

XpertDPO provides public sector data protection audits, regulatory compliance guidance, DSAR and FOI request management, and cybersecurity risk assessments. Our solutions help government bodies enhance data security and public trust.

Financial Services

  • Key Focus: Compliance with GDPR, DORA, and cybersecurity frameworks.
  • Challenges: Protecting financial data, maintaining compliance with evolving regulations, preventing fraud and breaches.
  • How We Help: Advisory services for GDPR, DORA compliance, and supervisory authority engagement.

Why Data Protection Matters in Financial Services:

The financial sector handles highly sensitive customer data, making it a prime target for cyberattacks, fraud, and regulatory scrutiny. Compliance with GDPR, DORA, NIS2, and PCI-DSS is essential to ensure data security, operational resilience, and regulatory adherence.

Key Data Protection Challenges in Financial Services:

  • Regulatory Compliance: Meeting strict GDPR, DORA, and anti-money laundering (AML) obligations.
  • Cybersecurity Risks: Financial institutions are top targets for data breaches, phishing attacks, and ransomware.
  • Third-Party Risk Management: Ensuring vendor and cloud service provider compliance with financial regulations.
  • Incident Response & Reporting: Managing real-time breach response and regulatory notifications.

How We Help Financial Services Organisations with Data Protection:

XpertDPO provides specialist advisory services to help financial institutions navigate DORA, GDPR, and NIS2 compliance, manage third-party risks, and develop resilient cybersecurity frameworks. We offer GDPR audits, incident response planning, DPO support, and vendor risk assessments, ensuring financial organisations meet regulatory expectations while safeguarding sensitive data.

Med Tech

  • Key Focus: Securing medical technology and digital health data under GDPR and NIS2.
  • Challenges: Ensuring data privacy in connected health devices, managing patient data security risks.
  • How We Help: Data protection gap analysis, compliance audits, and risk assessments for MedTech firms.

Why Data Protection Matters in Med Tech:

The MedTech sector is revolutionising healthcare with connected medical devices, digital health solutions, and AI-driven diagnostics. However, these innovations come with strict regulatory requirements, including GDPR, NIS2, MDR (Medical Device Regulation), IVDR (In Vitro Diagnostic Regulation), and HIPAA (for US-linked entities). Ensuring patient data security, regulatory compliance, and ethical AI use is critical for protecting individuals and maintaining trust in medical technology.

Key Data Protection Challenges in Med Tech:

  • Compliance with GDPR, MDR, & NIS2: Managing complex data protection, cybersecurity, and regulatory approval requirements.
  • Securing Patient & Health Data: Protecting electronic health records (EHRs), wearables, and IoT medical devices from cyber threats.
  • Cross-Border Data Transfers & Cloud Security: Ensuring lawful global data processing and third-party compliance.
  • AI & Algorithmic Transparency: Addressing risks in AI-powered diagnostics, automated decision-making, and patient profiling.
  • Incident Response & Regulatory Reporting: Meeting data breach notification obligations within strict timeframes.

How We Help Med Tech Organisations with Data Protection:

XpertDPO provides specialist compliance support for MedTech companies, ensuring GDPR, MDR, and cybersecurity compliance. We assist with DPIAs, AI risk assessments, third-party vendor audits, cybersecurity frameworks, and incident response planning. Our expertise helps MedTech firms secure patient data, meet regulatory requirements, and build trust in digital health solutions.

To ensure compliance and data security in MedTech, contact XpertDPO today.

AI Regulation

  • Key Focus: Ethical and legal compliance for AI-driven data processing.
  • Challenges: Navigating GDPR in AI-based decision-making, transparency requirements, ensuring data security in machine learning models.
  • How We Help: Advisory on AI governance, GDPR compliance for AI systems, and regulatory engagement.

Why Data Protection Matters in AI Regulation

As artificial intelligence (AI) becomes increasingly integrated into business operations, compliance with emerging AI regulations is essential to ensure transparency, fairness, and data protection. The EU AI Act, GDPR, and sector-specific regulations impose strict obligations on organisations developing or deploying AI-driven systems, particularly those handling personal data, automated decision-making, and high-risk applications.

Key Challenges in AI Regulation

  • Compliance with the EU AI Act & GDPR: Ensuring AI systems meet risk classification, transparency, and data protection requirements.
  • Bias, Fairness & Automated Decision-Making: Implementing safeguards to prevent discrimination and ensure lawful AI use.
  • Data Security & Privacy Risks: Protecting training datasets, AI outputs, and personal data from misuse or breaches.
  • Explainability & Accountability: Demonstrating how AI models make decisions, particularly in high-risk applications.
  • Cross-Border AI Deployment: Navigating global regulatory landscapes for AI compliance.

How We Help Organisations comply with AI Regulation

XpertDPO provides AI governance and regulatory compliance services, ensuring businesses align with the EU AI Act, GDPR, and ethical AI principles. We assist with AI risk assessments, bias audits, data protection impact assessments (DPIAs), and regulatory reporting. Our experts help organisations develop responsible AI frameworks, enhance transparency, and mitigate legal risks associated with AI deployment.

To prepare for AI regulation, contact XpertDPO today.

Insurance

  • Key Focus: Data security in policy management and claims processing.
  • Challenges: Managing large volumes of personal data, preventing unauthorised access, ensuring compliance with GDPR.
  • How We Help: GDPR consultancy, data processing audits, and compliance monitoring for insurers.

Why Data Protection Matters in Insurance

The insurance sector processes vast amounts of highly sensitive personal data, including financial, health, and biometric information. Compliance with GDPR, NIS2, DORA, Solvency II, and industry-specific data security regulations is critical to ensuring customer trust, regulatory adherence, and resilience against cyber threats.

Key Data Protection Challenges in the Insurance Sector

  • Handling & Securing Sensitive Customer Data – Processing policyholder, claimant, and medical data while ensuring lawful, secure storage and transfers.
  • Regulatory Compliance & Cross-Border Data Transfers – Meeting GDPR requirements for global operations, including Schrems II and Standard Contractual Clauses (SCCs).
  • Cybersecurity & Fraud Prevention – Protecting against data breaches, ransomware, and fraudulent claims manipulation.
  • Incident Response & Regulatory Reporting – Managing breach notification requirements under GDPR and NIS2.
  • Automated Decision-Making & AI Risks – Ensuring fair, transparent use of AI and automated underwriting systems.

How We Help Insurance Organisations with Data Protection and Artificial Intelligence Compliance

XpertDPO supports insurance providers, brokers, and underwriters with GDPR compliance, data security audits, DORA resilience strategies, and regulatory reporting frameworks. Our outsourced DPO services, DSAR management, incident response planning, and AI governance expertise help insurers meet legal obligations, strengthen cybersecurity, and protect policyholder data.

Need expert data protection support for your insurance firm? Contact XpertDPO today.

Why Sector-Specific Expertise Matters

Compliance is never one-size-fits-all. Each sector faces unique challenges—from safeguarding and social work protocols in care settings to regulatory sandboxes in fintech. At XpertDPO, we blend legal expertise, technical audits, and operational know-how to offer tailored solutions that reflect the real risks and obligations in your field.

Our team includes lawyers, data protection officers, security engineers, and educators—all focused on building trust and reducing risk through pragmatic, compliant practices.

Let’s Talk

Are you looking for outsourced DPO services, DSAR support, AI governance, or regulatory response guidance? Get in touch for a tailored conversation about your sector’s needs.

Email us at info@xpertdpo.com
Visit xpertdpo.com

Outsourced DPO FAQs

What is the GDPR?

The General Data Protection Regulation (GDPR) applies from 25 May 2018. It has general application to the processing of personal data in the EU, setting out more extensive obligations on data controllers and processors, and providing strengthened protections for data subjects. Although the GDPR is directly applicable as a law in all Member States, it allows for certain issues to be given further effect in national law. In Ireland, the national law, which, amongst other things, gives further effect to the GDPR, is the Data Protection Act 2018.

What is personal data?

The GDPR defines ‘personal data’ as any information relating to an identifiable person who can be directly or indirectly identified, in particular by reference to an identifier. This definition provides for a wide range of personal identifiers to constitute personal data, including name, identification number, location data or online identifier, reflecting changes in technology and the way organisations collect information about people.

Who must comply with the GDPR?

Any organisation that processes the personal data of people in the EU must comply with the GDPR. “Processing” is a broad term that covers just about anything you can do with data: collection, storage, transmission, analysis, etc. “Personal data” is any information that relates to a person, such as names, email addresses, IP addresses, eye colour, political affiliation, and so on. Even if an organisation is not connected to the EU itself, if it processes the personal data of people in the EU (via tracking on its website, for instance), it must comply. The GDPR is also not limited to for-profit companies.

What is a data controller and who is the data controller?

A data controller is a person or organisation that (alone or jointly with others) determines the purposes for which and the manner in which any personal data are, or are to be, processed. A data controller can be the sole data controller or a joint data controller with another person or organisation. Where services are provided directly by a private hospital, voluntary hospital, agency or private contractor, that private hospital, voluntary hospital, agency or private contractor may be the data controller.

What is a data processor?

A data processor is a person or organisation that processes personal data on behalf of the controller. This does not include an employee of the controller who processes data in the course of their employment. A data processor can be held liable if they are responsible for a data protection breach.

What is data processing?

Processing, in relation to personal data, is any operation or set of operations performed on personal data, including collecting, recording, organising, structuring, erasing, destroying, altering, combining, disclosing or sharing the data.

What are the main GDPR principles?

  • Lawful, Fair and Transparent: Personal data must be processed lawfully, fairly and in a transparent manner in relation to individuals.
  • Purpose Limitation: Personal data must be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall not be considered to be incompatible with the initial purposes.
  • Data Minimisation: Personal data collected must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.
  • Accuracy: Personal data must be kept accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay.
  • Storage Limitation: Personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as they will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes, subject to implementation of the appropriate technical and organisational measures required by the GDPR in order to safeguard the rights and freedoms of individuals.
  • Confidentiality and Integrity: Personal data must be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures.
  • Accountability: The data controller shall be responsible for, and be able to demonstrate compliance with, the data protection principles.

Can I send marketing emails using Legitimate Interests as the Lawful Basis for processing if I cannot prove consent?

“Most of our database is made up of historical quotations or previous customers but under GDPR, just because they have gotten a quote from us or bought from us doesn’t actually give us the right to use their data for marketing purposes. Is this correct?”.

Answer: When you originally sold, quoted or marketed products or services did you offer an opt-out at point of sale?

If the answer is yes, you may be able to rely on ‘soft opt-in’.

If you did not offer an ‘opt-out’ then you will need consent. If you cannot reference an affirmative opt-in or consent then you do not have the data subject’s permission, therefore you cannot send marketing emails.

Fig 1: Legitimate Interests Assessment

Remember, it is PECR (the Privacy and Electronic Communications Regulations) that regulates e-marketing, NOT the GDPR. Legitimate Interests is NOT a lawful basis for electronic marketing under PECR.

Opt-in has to be specific, informed and freely given and if you are relying on the ‘soft opt-in’ you can only use it for marketing/promotion of your OWN products/services. So an opt-in is the cleanest way to start a new list.

See here some useful links in relation to PECR: Statutory Instrument 336 of 2011 (Ireland) and ICO (UK) Guide to PECR
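Purely as an illustration, the sketch below encodes the decision path described above in code. The inputs are assumptions about what your records show; this is not legal advice or an official PECR test.

```python
def can_email_for_marketing(existing_customer: bool,
                            opt_out_offered_at_sale: bool,
                            marketing_own_similar_products: bool,
                            affirmative_consent_recorded: bool) -> str:
    """Rough decision sketch for e-marketing under PECR (illustrative, not legal advice)."""
    if affirmative_consent_recorded:
        return "OK to email: valid consent recorded"
    if existing_customer and opt_out_offered_at_sale and marketing_own_similar_products:
        return "Soft opt-in may apply: include an unsubscribe option in every message"
    return "Do not email: no consent and soft opt-in conditions not met"

# Example: past customer, but no opt-out was offered at the point of sale
print(can_email_for_marketing(True, False, True, False))
```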

I have heard that Processing Contracts must be updated for GDPR. What must be included?

The GDPR introduces direct obligations and potential liabilities on the Controller AND Processor. The GDPR requires a legally binding contract between the Data Controller and the Data Processor(s).

There are Compulsory details that must be included:

  • The subject matter and duration of the processing;
  • The nature and purpose of the processing;
  • The type of personal data and the categories of data subject; and
  • The obligations and rights of the controller

Compulsory terms:

  • The processor must only act on the written instructions of the controller (unless required by law to act without such instructions);
  • The processor must ensure that people processing the data are subject to a duty of confidence;
  • The processor must take appropriate measures to ensure the security of processing;
  • The processor must only engage a sub-processor with the prior consent of the data controller and a written contract;
  • The processor must assist the data controller in providing subject access and allowing data subjects to exercise their rights under the GDPR;
  • The processor must assist the data controller in meeting its GDPR obligations in relation to the security of processing, the notification of personal data breaches and data protection impact assessments;
  • The processor must delete or return all personal data to the controller as requested at the end of the contract; and
  • The processor must submit to audits and inspections, provide the controller with whatever information it needs to ensure that they are both meeting their Article 28 obligations, and tell the controller immediately if it is asked to do something infringing the GDPR or other data protection law of the EU or a member state.

What do we need to document under Article 30 of the GDPR?

  • The name and contact details of your organisation (and where applicable, of other controllers, your representative and your data protection officer)
  • The purposes of your processing
  • A description of the categories of individuals and categories of personal data
  • The categories of recipients of personal data
  • Details of your transfers to third countries including documenting the transfer mechanism safeguards in place
  • Retention schedules
  • A description of your technical and organisational security measures

Should we document anything else?

As part of your record of processing activities, it can be useful to document (or link to documentation of) other aspects of your compliance with the GDPR and the Data Protection Act 2018. Such documentation may include:

Information required for privacy notices, such as:

  • The lawful basis for the processing
  • The legitimate interests for the processing
  • Individuals’ rights
  • The existence of automated decision-making, including profiling
  • The source of the personal data
  • Records of consent
  • Controller-processor contracts
  • The location of personal data
  • Data Protection Impact Assessment reports;
  • Records of personal data breaches

Information required for processing special category data or criminal conviction and offence data under the Data Protection Act, covering:

  • The condition for processing in the Data Protection Act
  • The lawful basis for the processing in the GDPR
  • Your retention and erasure policy document

XpertDPO publishes submission on EDPB Recommendations on Controller Binding Corporate Rules (BCRs)

XpertDPO has provided feedback to the European Data Protection Board (EDPB) on Recommendations 1/2022 on the Application for Approval and on the elements and principles to be found in Controller Binding Corporate Rules (Art. 47 GDPR).

The EDPB recommendations build upon the agreements reached by data protection authorities during approval procedures on concrete BCR applications since the GDPR came into force. They also bring the existing guidance in line with the requirements in the Court of Justice of the European Union’s Schrems II ruling, providing clarity for controllers relying on Binding Corporate Rules for international data transfers.

XpertDPO welcomes the EDPB’s recommendations and its efforts to ensure a level playing field for all BCR applicants. While the recommendations provide a standard form for the application for approval of BCR for controllers (BCR-C) and clarify the necessary content of a BCR-C, there is still scope for more standardisation to simplify the process for companies considering BCRs as an appropriate safeguard for transfers of personal data to third countries.

XpertDPO’s detailed submission to the public consultation has been published on the EDPB website and can be read here. If you wish to find out more about the approval process for Binding Corporate Rules you can contact XpertDPO at info@xpertdpo.com or on +353 1 678 8997.

GDPR A to Z

The A to Z of GDPR: Glossary of Key Data Protection Terms and Concepts

Welcome to XpertDPO’s definitive GDPR A to Z glossary – your expert guide to the most important terms and principles under the General Data Protection Regulation (GDPR). Whether you’re a Data Protection Officer (DPO), compliance lead, business owner or privacy enthusiast, this glossary breaks down complex data protection terms into clear, practical explanations. Use this page to understand your obligations, educate your team, or support your GDPR training efforts.

Each entry is designed to help you comply with GDPR, respond to regulatory requirements, and embed a culture of data protection in your organisation. From Accountability to Zero Trust, explore the A–Z now.

A is for Accountability

Organisations (e.g., controllers and processors of data) have to be accountable – they have to take responsibility for their compliance with the GDPR (including appropriate organisational and technical measures) and for the data they are processing.

B is for Breaches

A data breach is the intentional or unintentional release of secure or private/confidential information to an untrusted environment. A breach can potentially have a range of significant adverse effects on individuals, which can result in physical, material or non-material damage. To adequately respond to, and deal with data breaches, your organisation must draft and maintain a detailed data breach policy document. The aim of this document is to outline how your organisation will respond to any such data breach events. Organisations need to be aware of their responsibilities for data breaches, in particular the timeframes and notification responsibilities to their supervisory authority and to data subjects.

C is for Controllers vs. Processors

Obligations under the GDPR differ depending on whether you are a data ‘controller’ or a data ‘processor’ (note: you can be both!). If your organisation makes the decisions on what data is collected, when it is collected and what it is used for, then there is a high likelihood that you are a controller. Controllers are exposed to the highest level of compliance responsibility – you must comply with, and demonstrate compliance with, all the data protection principles as well as other GDPR requirements. You are also responsible for the compliance of your data processors. A processor does not make decisions around data, rather they process data on behalf of the controller. Processors do not have the same obligations as controllers under the GDPR. However, if you are a processor, you do have a number of direct obligations of your own under the GDPR.

D is for Data Protection Officer

Under Article 37 of the GDPR, organisations must appoint a Data Protection Officer (‘’DPO’’) if they are a public authority or body, or if their core activities involve large-scale, regular and systematic monitoring of data subjects or large-scale processing of special categories of data. The primary role of the DPO is to ensure that their organisation processes the personal data of its staff, customers, providers or any other third-party individuals in compliance with the applicable data protection rules.

A DPO should have an adequate level of skill and knowledge, should facilitate compliance, and should act as an intermediary between the relevant supervisory authority, data subjects, and the organisation. The DPO must be independent: they cannot hold a position in the organisation in which they have the authority to decide the purposes and means of processing personal data. Organisations must also be careful when using the title “Data Protection Officer” unless the position fulfils all of the criteria set out in the GDPR for appointing a DPO.

E is for European Representative

If your organisation processes the data of people in the EU but does not have a branch or establishment in the EEA, you are required to appoint a European Representative. This can be an individual or a company established in the EEA who can represent you and act as a contact point for data subjects and supervisory authorities. XpertDPO acts as a European Representative for a number of our international clients. We can assist you with your European data protection needs. For more information, please contact us.

F is for Fines

Data protection authorities can impose fines for non-compliance with the GDPR. The nature of the infringement determines the fine, as well as which article of the GDPR was infringed. Fines fall into two tiers:

  • Up to €10,000,000 or, in the case of an undertaking, 2% of total worldwide annual turnover in the preceding financial year (whichever is higher).
  • Up to €20,000,000 or, in the case of an undertaking, 4% of total worldwide annual turnover in the preceding financial year (whichever is higher).

Data protection authorities also have a range of corrective powers and sanctions they can enforce, including warnings, reprimands, and bans. Outside of this, individuals also have the right to seek compensation for material and non-material damage (material being actual, quantifiable damage, e.g. loss of money; non-material being any non-financial damage, e.g. pain and suffering).
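As a simple worked example of the higher tier, the sketch below computes the applicable cap for a hypothetical undertaking; the turnover figure is invented for illustration.

```python
def max_fine_higher_tier(worldwide_annual_turnover_eur: float) -> float:
    """Higher-tier GDPR cap: €20,000,000 or 4% of worldwide annual turnover, whichever is higher."""
    return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

# Hypothetical undertaking with €700m turnover: 4% is €28m, which exceeds €20m
print(f"€{max_fine_higher_tier(700_000_000):,.0f}")  # €28,000,000
```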

G is for GDPR

The General Data Protection Regulation (‘’GDPR’’) is the primary law that regulates the way that organisations protect the data of EU citizens. It came into force on May 25, 2018. The GDPR ensures that there is a more uniform and consistent approach to data protection across the EU and EEA. It gives individuals control over their data and aims to ensure that fundamental rights and freedoms in relation to personal data are respected.

H is for Having Documentation

Having adequate and accurate documentation under the GDPR is all important – your documentation helps you demonstrate your compliance. Whether it be a set of policies, your Article 30 Records of Processing Activities, data sharing agreements, or copies of audit reports, your documentation should be there to guide you and should evidence the steps you’ve taken to get where you are in your GDPR compliance journey.

I is for Impact Assessments

Data protection impact assessments (‘’DPIAs’’) are required if you are beginning a project that is likely to involve high-risk processing activities. A DPIA will improve your awareness of the data protection risks associated with a project. A DPIA should also be carried out for processing operations that are already underway, or where your operations change, and DPIAs should be updated as your organisation changes and implements new technology. DPIAs are not always required; however, it is best practice to carry one out if you are unsure, as it helps you to comply with data protection law. DPIAs are important tools for accountability, as they help controllers not only to comply with the requirements of the GDPR, but also to demonstrate that appropriate measures have been taken to ensure compliance with the Regulation.

J is for Justification (for processing personal data)

Organisations need a justification, or grounds, for processing data. Without a justification, it is likely you are processing data unlawfully. To ensure you have justified why you are processing data, you need to have determined your legal bases and purposes for processing. Organisations should also take the GDPR principles into consideration when assessing their grounds for processing to ensure they are compliant with the GDPR.

K is for Keeping Records of Processing Activities

Records of Processing Activities (‘’RoPA’’) is a form of data inventory that is required under Article 30 of the GDPR. A RoPA is basically a data mapping exercise. Your RoPA should be updated on a regular basis to include why the data is being held, why and how it was originally gathered, how long it is to be retained for, what security measures are in place, the data’s accessibility, and if the data is shared with third parties how, why, and when. Having an up-to-date and accurate RoPA is a key part of GDPR compliance. Organisations are required to provide their RoPA to their supervisory authority on request, and harsh penalties are given to organisations who have not completed this essential GDPR documentation.
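As an informal illustration of what a single RoPA entry might capture, the sketch below mirrors the points above. Article 30 does not prescribe a format, so the field names and example activity are assumptions for illustration only.

```python
# Illustrative RoPA entry (Article 30 does not mandate a specific format or field names)
ropa_entry = {
    "processing_activity": "Payroll administration",
    "purpose": "Paying staff and meeting tax obligations",
    "lawful_basis": "Legal obligation (Article 6(1)(c))",
    "categories_of_data_subjects": ["employees"],
    "categories_of_personal_data": ["name", "bank details", "salary", "tax identifier"],
    "recipients": ["payroll provider", "tax authority"],
    "third_country_transfers": None,          # document transfer mechanism safeguards here if transfers occur
    "retention_period": "7 years after employment ends",
    "security_measures": ["access controls", "encryption at rest", "annual access review"],
    "shared_with_third_parties": {"payroll provider": "processor agreement in place"},
    "last_reviewed": "2025-01-15",
}
```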

L is for Lawful Bases

Organisations must determine the lawful basis for processing prior to processing any data.

Under Article 6 of the GDPR there are only six lawful bases for processing personal data:

  1. Consent: The data subject has given clear consent for you to process their personal data for one or more specific purposes.
  2. Contract: The processing is necessary for a contract you have with the individual, or because they have asked you to take specific steps before entering into a contract.
  3. Legal Obligation: The processing is necessary for you to comply with the law (not including contractual obligations).
  4. Vital Interests: Processing is necessary in order to protect the vital interests of the data subject or of another natural person. This is one of the most narrowly applied bases, generally relied upon only in life-or-death situations.
  5. Legitimate Interest: The processing is necessary for your legitimate interests or the legitimate interests of a third party, unless there is a good reason to protect the individual’s personal data which overrides those legitimate interests.
  6. Public Interest: Processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller.

M is for Mitigating Risk

Complying with the requirements of the GDPR helps to mitigate risk when processing personal data. Carrying out regular audits of your organisation's policies, documentation and records can highlight gaps or risk areas that need more work. Regular staff training, so that employees are aware of their responsibilities for protecting data, is essential and also helps to mitigate risk. Investing in technology can assist organisations with their GDPR compliance – and this doesn't need to be big-budget technology either. Carrying out Data Protection Impact Assessments will also highlight risks in projects. It is essential that companies are aware of the risks associated with the data they are processing and the effects these could have on data subjects. Maintaining a risk register is a good way to document those risks and your attitude to them, as sketched below.
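As an illustration only, a risk register entry can be as simple as a handful of fields; the structure and the 1–5 scoring scale below are assumptions, not a mandated format.

```python
# A small illustrative risk-register entry; field names and scoring are examples.
risk_register_entry = {
    "id": "R-014",
    "description": "Employee laptops containing personal data are not encrypted",
    "affected_data": ["employee records", "customer contact details"],
    "likelihood": 3,          # 1 (rare) to 5 (almost certain)
    "impact": 4,              # 1 (negligible) to 5 (severe)
    "risk_score": 3 * 4,      # likelihood x impact
    "treatment": "Roll out full-disk encryption",
    "owner": "IT Manager",
    "review_date": "2025-09-30",
}
print(risk_register_entry["risk_score"])  # 12 -> prioritise treatment
```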

N is for Notifying

Notification links in with transparency under the GDPR, and organisations have to notify data subjects in a number of circumstances. It is important to keep communication channels with the individuals whose data you are processing open and accessible. For example, organisations are obliged to notify their supervisory authority of a data breach where the breach is likely to result in a risk to the affected individuals, and in some cases to notify those individuals as well.

Where a breach is likely to result in a high risk to the rights and freedoms of individuals, the GDPR states that you must inform those concerned directly and without undue delay. In principle, the breach should be communicated to the affected data subjects directly, unless doing so would involve a disproportionate effort. Another example of notification under the GDPR is a well-structured, clear, and easily accessible privacy notice that informs individuals of your purposes for processing and serves as a public statement of how your organisation applies data protection principles to its processing activities.
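The notification logic described above can be summarised in a short decision sketch. This is an illustration of the thresholds as set out here, not legal advice, and the risk level is an assumed input from your own breach assessment.

```python
# Illustrative sketch of the breach-notification thresholds described above.
def notification_duties(risk_to_individuals: str) -> list[str]:
    """risk_to_individuals: one of 'unlikely', 'risk', 'high risk'."""
    duties = []
    if risk_to_individuals in ("risk", "high risk"):
        duties.append("Notify the supervisory authority without undue delay "
                      "(where feasible, within 72 hours of becoming aware).")
    if risk_to_individuals == "high risk":
        duties.append("Inform the affected data subjects directly and without undue delay, "
                      "unless doing so would involve a disproportionate effort.")
    if not duties:
        duties.append("No notification required, but document the breach internally.")
    return duties

for level in ("unlikely", "risk", "high risk"):
    print(level, "->", notification_duties(level))
```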

O is for Obligations

Organisations have a number of obligations under the GDPR. The GDPR requires any organisation processing personal data to have identified a valid legal basis and purpose for processing for each processing activity. Organisations need to have:

  • Determined their position as a controller or processor of data
  • Implemented appropriate technical and organisational measures to support GDPR compliance
  • Adequate documentation of what personal data is processed, such as the Article 30 Records of Processing Activities
  • Determined what data they are processing, how, for what purpose, and for how long
  • Appointed a data protection officer (where required)
  • Processes in place to respond to data subject requests (such as the right to be forgotten)
  • Carried out risk assessments and Data Protection Impact Assessments (where required)
  • Defined procedures for data breaches and breach notification
  • Contracts in place (including data sharing and data processing agreements)
  • A risk-based approach to working, with data protection by design and by default

P is for Purpose and Principles

In order to process personal data lawfully under the GDPR, you need a purpose for processing the data. Alongside your purpose, you must determine the lawful basis for processing before you process any data, and it is good practice to document the decision-making process. Without defining your purposes for processing, you are likely processing data unlawfully.

The GDPR requires organisations to be aware of and comply with seven fundamental principles:

  1. Lawfulness, fairness and transparency: Personal data shall be processed lawfully, fairly and in a transparent manner
  2. Purpose limitation: Personal data shall be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes
  3. Data minimisation: Personal data shall be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed
  4. Accuracy: Personal data shall be accurate and, where necessary, kept up to date
  5. Storage limitation: Personal data shall be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed
  6. Integrity and confidentiality: Personal data shall be processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical and organisational measures
  7. Accountability: The controller shall be responsible for, and be able to demonstrate compliance with, the data protection principles.

Q is for Quantify

The GDPR protects the personal data of the individual, and the Regulation recognises the value of taking a risk-based approach. Risk quantification helps organisations prioritise their investments in GDPR compliance, and the same quantified, risk-based approach applies to other data protection and privacy regulations around the globe.

Organisations must invest in controls but may have limited resources with which to do so. Once the obvious, must-have controls and processes are in place, trade-offs and decisions need to be made to reduce risk further. The data used to inform these decisions is often highly subjective and based on worst-case scenarios rather than critical thinking informed by quantified data.

We can address this by developing risk scenarios and, for each scenario, quantifying the risk from both the organisation's point of view and the data subject's point of view. Once we have this baseline view of the risk scenarios, we can model the introduction of additional controls and understand how much they reduce the risk in each scenario.

This allows us to perform a cost-benefit analysis of proposed projects and decide which ones reduce the risk to the organisation the most while also better protecting the privacy of the individual. This can be done in the context of the DPIA process or when looking at broader, enterprise-wide information security programmes.
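By way of illustration, here is a minimal sketch of quantifying one risk scenario and testing a proposed control against it. The scenario, probabilities, loss figures and control cost are hypothetical assumptions, and the expected-loss model is deliberately simplistic.

```python
# Illustrative risk quantification: expected loss before and after a control.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    annual_likelihood: float   # estimated probability of occurrence per year
    impact_eur: float          # estimated loss to the organisation if it occurs
    subject_harm_score: float  # 0-10 qualitative harm to data subjects

    @property
    def expected_loss(self) -> float:
        return self.annual_likelihood * self.impact_eur

@dataclass
class Control:
    name: str
    annual_cost_eur: float
    likelihood_reduction: float  # e.g. 0.9 = reduces likelihood by 90%

def evaluate(scenario: Scenario, control: Control) -> dict:
    """Compare expected loss with and without a proposed control."""
    residual_likelihood = scenario.annual_likelihood * (1 - control.likelihood_reduction)
    residual_loss = residual_likelihood * scenario.impact_eur
    benefit = scenario.expected_loss - residual_loss
    # In a fuller model you would also estimate how the control changes the
    # harm to data subjects, not just the loss to the organisation.
    return {
        "scenario": scenario.name,
        "control": control.name,
        "baseline_expected_loss": scenario.expected_loss,
        "residual_expected_loss": residual_loss,
        "benefit_per_eur_spent": benefit / control.annual_cost_eur,
    }

# Hypothetical example: a personal data breach via an unencrypted laptop
theft = Scenario("Data breach via unencrypted laptop", annual_likelihood=0.15,
                 impact_eur=200_000, subject_harm_score=7)
encryption = Control("Full-disk encryption rollout", annual_cost_eur=12_000,
                     likelihood_reduction=0.9)
print(evaluate(theft, encryption))
```

In practice you would repeat this across a portfolio of scenarios and candidate controls and rank the options by benefit per euro spent.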

R is for Rights

Data protection is a fundamental human right. All individuals are entitled to have their data protected, to have it used in a lawful manner, to have access to their data, and to have the option to rectify it if it is incorrect.

Under the GDPR, data subjects have eight rights:

  1. Right of access
  2. Right to be informed
  3. Right to rectification
  4. Right to erasure ("right to be forgotten")
  5. Right to restrict processing
  6. Right to data portability
  7. Right to object
  8. Rights in relation to automated decision making and profiling

Organisations have a limited timeframe in which to respond to requests from data subjects exercising their rights under the GDPR: one month from receipt, extendable by up to two further months for complex or numerous requests.

Under the GDPR, the data subject also has recourse to a number of options in the case of a complaint about data protection:

  • Right to lodge a complaint with a supervisory authority
  • Right to an effective judicial remedy against a supervisory authority
  • Right to an effective judicial remedy against a controller or processor

Having completed your data mapping exercises, put policies in place, and trained your staff in how to respond to requests, you will be better placed to avoid fines and reputational damage and to ensure that individuals' requests are responded to accurately and quickly.

S is for Special Category Data

Special category data is personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, genetic data, biometric data processed for the purpose of uniquely identifying a natural person, data concerning health, or data concerning a natural person's sex life or sexual orientation. If your organisation processes special category data, you must identify both a lawful basis for processing under Article 6 and an additional condition for processing under Article 9.

T is for Transparency

Organisations should ensure that they are being transparent in the ways in which they are processing or using data. Transparency is an important principle under the GDPR. Organisations should communicate with data subjects in a clear and accessible way about the ways in which their data is being processed or used (such as in a privacy notice). Transparency means that individuals can trust that your organisation is treating their data ethically and fairly.

U is for Undue Delay

Undue delay is referred to in the GDPR, but what does it actually mean? There is no legal definition of undue delay, and unlike other GDPR requirements, such as the one-month response time for Data Subject Access Requests, the GDPR does not attach a specific timeframe to it. However, the European Data Protection Board ("EDPB") has interpreted undue delay as meaning 'as soon as possible'. Organisations should have compliance fundamentals (such as documented procedures) in place to ensure they can respond to requests as quickly as they can.

V is for Vetting

Vetting third parties and data processors should be on every organisation's radar for GDPR compliance. Gaining an understanding of the organisations and service providers you work with, and how they handle data, ensures higher levels of compliance and reduces risk. Organisations should look at where their processors are storing their data and what security measures are in place. Data processing agreements and contracts should set out the terms in writing, for example what happens to your organisation's data at the end of the contract. Doing your due diligence and assessing any third parties you use reduces risk and can mean a better service is provided.

W is for Why does it matter?

The GDPR gives individuals more control over their personal data and protects their fundamental rights and freedoms. Data protection is a fundamental right set out in Article 8 of the EU Charter of Fundamental Rights. Technological advancement and globalisation have dramatically increased the amount of data being shared, and individuals increasingly create and share information themselves, much of it publicly. The GDPR serves to give individuals more rights over their data.

X is for XpertDPO

XpertDPO provides data security, governance, risk and compliance, GDPR and ISO consultancy to public and private sector organisations.

We help change our clients' relationship with the data they process. Data protection, security and governance are at the core of our business. We look after the whole lifecycle of your data processing via our outsourced data protection officer service and our GDPR compliance services. We also provide ISO 27001 and ISO 27701 certification consultancy to our client base, offering a value-based, pragmatic approach to achieving certification, and we specialise in Nominated European Representative services for non-EU and non-UK based organisations.

At XpertDPO, our approach is that the data security function must align with, and be driven by, your business objectives. This is at the core of our ethos. XpertDPO can help you to transform the regulatory constraints of the GDPR and other relevant regulations into opportunities, ensuring that your compliance journey has a positive impact on your existing economic and organisational models. Put simply, we take care of your compliance headaches, allowing you to concentrate on your core business goals.

Y is for Yielding Benefits

GDPR compliance can yield a number of benefits for organisations including:

  • It saves money in the long term
  • It allows better systems and processes to be developed
  • Organisations are less likely to receive fines and sanctions
  • Compliance boosts confidence in your business
  • Your reputation is upheld, as you are not using data for unspecified purposes
  • Less valuable employee time is wasted on data that ends up being of little use – greater efficiency
  • More effective profiling and understanding of customers and clients, as the data you hold is accurate
  • Reporting and figures are more accurate and data quality is higher
  • An easy way to keep communication channels open with your customers and clients

Z is for Zero Trust

The idea of a 'Zero Trust' security model is to never trust and always verify. Implicit trust can leave businesses vulnerable to security risks: imagine finding out that one of your business's suppliers was storing your operational secrets in a publicly accessible office with no security. It is important to assess and verify the compliance practices of any third parties. An example of a zero-trust principle in action is an organisation setting access controls through policy so that only those who need to access the data have access to it, as in the sketch below. Utilising zero-trust methodologies can reduce risks such as data breaches.
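A minimal sketch of that deny-by-default access check, assuming a simple mapping of record categories to permitted roles; the roles, categories and policy values are hypothetical.

```python
# Deny-by-default ("zero trust") access check: every request is verified
# against policy rather than trusted because of seniority or network location.
ACCESS_POLICY = {
    # record category -> roles explicitly allowed to access it
    "hr_records": {"hr_manager"},
    "customer_records": {"support_agent", "dpo"},
    "payroll": {"finance"},
}

def can_access(user_role: str, record_category: str) -> bool:
    """Access is granted only if the role is explicitly listed for the category."""
    return user_role in ACCESS_POLICY.get(record_category, set())

print(can_access("support_agent", "customer_records"))  # True
print(can_access("support_agent", "payroll"))            # False
```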

Looking to go beyond the basics? XpertAcademy offers practical GDPR training and certification for teams of all sizes. Or, if you need hands-on help with compliance, explore our Outsourced DPO support or DPO Support services. At XpertDPO, we don’t just define data protection – we deliver it.

Bookmark this page and check back as we expand this glossary with new terms and updates in line with evolving privacy regulations like the AI Act, NIS2 and DORA. For real-time compliance advice, contact our team of data protection specialists today.

Who is responsible for demonstrating GDPR compliance?

When we are working with prospective or new clients, we are often asked this question. There isn't a short answer, but we will highlight some steps you can take to begin demonstrating that you are complying with the GDPR. This is not an exhaustive list by any means; it is intended as a set of proactive steps.

The General Data Protection Regulation (GDPR) relates to the processing of 'personal data'. Unfortunately – and this is where much of the confusion arises – the GDPR does not provide a definitive list of items that are considered personal data. Article 4(1) of the GDPR states:

‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

So, to break that down, the GDPR relates to data that, either on its own (directly) or in conjunction with other data (indirectly), can be used to identify a living human being.

The GDPR also describes the concepts of data controllers and data processors. A data controller could be an organisation (e.g. a bank or retailer) or an individual (e.g. a general practitioner) that collects and processes information about customers, patients, and so on. Under the GDPR, the data controller is responsible for ensuring that data is processed in compliance with the principles of lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability. A data controller generally makes the decisions about the what, why, who, when and how of personal data processing.

So, if you are running a business and employing staff, you are a data controller for that processing. Keep in mind that the GDPR does NOT apply only to large organisations: individuals, SMEs, community groups and not-for-profit organisations that process personal data are all responsible for complying with the GDPR, and the principles apply regardless of organisation size.

One of the most important obligations that organisations have is preparing a Record of Processing Activities (RoPA). Article 30 of the GDPR details the requirements and responsibilities of data controllers in relation to these records.

In our experience, many organisations have an inadequate RoPA, and some are not even aware that this is a requirement.

So, if you are a Data Controller, you will need to maintain documentation that details the following:

  1. the name and contact details of the [Data] controller and, where applicable, the joint controller, the controller’s representative and the data protection officer;
  2. the purposes of the processing;
  3. a description of the categories of data subjects and of the categories of personal data;
  4. the categories of recipients to whom the personal data have been or will be disclosed including recipients in third countries or international organisations;
  5. where applicable, transfers of personal data to a third country or an international organisation, including the identification of that third country or international organisation and, in the case of transfers referred to in the second subparagraph of Article 49(1), the documentation of suitable safeguards;
  6. where possible, the envisaged time limits for erasure of the different categories of data;
  7. where possible, a general description of the technical and organisational security measures referred to in Article 32(1).

Many organisations believe that they are not responsible for maintaining these records as they are only small entities. This is, in many cases, incorrect.

There are some exemptions to maintaining these records. Article 30(5) states:

The obligations … shall not apply to an enterprise or an organisation employing fewer than 250 persons unless the processing it carries out is likely to result in a risk to the rights and freedoms of data subjects, the processing is not occasional, or the processing includes special categories of data as referred to in Article 9(1) or personal data relating to criminal convictions and offences referred to in Article 10.

Your organisation may not process data that results in a risk, let alone a high risk, to data subjects. Likewise, you may not process special category data or data relating to criminal convictions or offences, but you will almost certainly be processing more frequently than occasionally. There is guidance from the European Data Protection Board (EDPB) to this effect; it states:

'To take account of the specific situation of micro, small and medium-sized enterprises, this Regulation includes a derogation for organisations with fewer than 250 employees with regard to record-keeping' …

Therefore, although endowed with less than 250 employees, data controllers or processors who find themselves in the position of either carrying out processing likely to result in a risk (not just a high risk) to the rights of the data subjects, or processing personal data on a non-occasional basis, or processing special categories of data under Article 9(1) or data relating to criminal convictions under Article 10 are obliged to maintain the record of processing activities.

However, such organisations need only maintain records of processing activities for the types of processing mentioned by Article 30(5).

For example, a small organisation is likely to regularly process data regarding its employees. As a result, such processing cannot be considered “occasional” and must therefore be included in the record of processing activities.

Finally, in addition to the RoPA, we recommend that your organisation drafts and maintains a library of supporting documentation for your GDPR compliance programme. Again, the GDPR does not provide a definitive list, but we have seen Data Processing Agreements, a Data Protection Policy, an ICT Policy, a Password Policy and a Retention Policy, amongst others, specifically requested by the Data Protection Commissioner.

We do not advocate copying documents from the internet, or even relying on generic templates. Such documents will have little or no context in relation to your organisation and how it processes personal data.

Your supporting documentation must give an accurate description of who you are and how your organisation processes data, including how long you retain data and who you might share that data with.

Outsourced Data Protection Officer