DPC and EDPB Annual Reports for 2024

This article accompanies Hour 1: Global Privacy Law Updates in our full-day CPD programme on XpertAcademy. Completion of the full one-hour session, including the related learning materials, contributes to the one-hour CPD certificate issued for that session. You can access the course here: CPD Event A: Full-Day Regulatory Privacy Training.

What Organisations Should Take Away

The 2024 annual reports published by the Data Protection Commission (DPC) and the European Data Protection Board (EDPB) are useful for far more than tracking enforcement statistics. Read together, they provide a practical picture of where supervisory attention is being directed, which failures continue to recur, and what organisations should now be reviewing in their own privacy governance.

That is particularly important in Ireland. The DPC sits at the intersection of domestic complaints, breach reporting, supervision and cross-border regulation. The EDPB, in turn, continues to shape a more consistent European approach to enforcement, guidance and technological change. When those two layers are read together, the result is not simply a summary of what happened in 2024. It is a fairly clear indication of what organisations should expect to matter in 2025 and beyond.

For DPOs, compliance leads, legal teams and senior management, the real value of these reports lies in the signals they send. Those signals are not confined to fines. They touch the recurring weaknesses that show up in access rights, breach handling, operational discipline, role clarity, AI governance, board visibility and the overall ability of an organisation to evidence accountability.

In this article, we look at the most significant themes emerging from the DPC and EDPB annual reports for 2024 and, for each one, we set out what we commonly see in practice when organisations try to operationalise these obligations.

Enforcement remains active, but the more useful lesson is what sits behind it

The DPC’s 2024 Annual Report records 11 finalised inquiry decisions and over €652 million in administrative fines. It also shows a regulator dealing with substantial operational volume: over 32,000 contacts, 11,091 new cases processed, and 10,510 cases resolved in 2024 alone. By year end, 89 statutory inquiries remained on hand, while four large-scale inquiries were concluded during the year.

Those figures matter, but for most organisations the more useful lesson is not the amount of fines in isolation. It is what the DPC’s workload tells us. Data protection compliance remains a live governance issue. The DPC’s work in 2024 ranged across cross-border inquiries, domestic breach cases, CCTV, children’s data, AI model development, biometrics, sensitive health data, supervision activity, and legislative engagement. That is a broad and demanding field of regulatory attention.

The DPC’s foreword is also worth noting for tone as much as substance. It emphasises fairness, consistency and transparency, but it also states clearly that while organisations have flexibility in how they structure compliance, they remain accountable for the choices they make to both individuals and regulators. In other words, flexibility in approach is not a substitute for evidence of control.

The EDPB annual report reinforces that broader direction of travel. Its 2024 – 2027 strategy is built around four pillars: advancing harmonisation and promoting compliance, reinforcing a common enforcement culture, addressing technological challenges, and enhancing the EDPB’s global role. That is a useful frame for organisations because it shows that data protection authorities are not only reacting to specific infringements. They are also building a more structured, more consistent, more cross-border regulatory environment.

In practice, one of the recurring difficulties is that organisations still tend to treat regulatory developments as something that happens externally, rather than as indicators of how their own internal arrangements are likely to be assessed. A fine against a large platform is often viewed as distant from the experience of an ordinary Irish organisation. The more useful reading is usually the opposite. The issue is not whether an organisation will face the same factual pattern as a large-scale inquiry. The issue is whether the same underlying governance weaknesses are present in a smaller, less visible form.

We also often see organisations underestimate how much the DPC’s broad activity matters outside formal enforcement. Complaints handling, supervision, public guidance, case studies and legislative input all tell organisations something about regulatory expectations. If the privacy function only looks at headline fines, it will miss the more practical indications of what the regulator continues to see going wrong.

For your own organisation, a sensible first step is to move away from reading annual reports as backward-looking summaries. They should be read as practical indicators of:

  • which issues continue to generate friction
  • which operational weaknesses are still common
  • which areas are likely to receive more structured attention in the near future

For boards and senior management, the key point is that privacy remains a current governance issue and not simply a legal housekeeping topic.

Regulatory activity remains substantial. The DPC’s 2024 Annual Report records 11 finalised inquiry decisions and over €652 million in fines, alongside 11,091 new cases processed and 10,510 cases resolved. The broader implication is that privacy compliance continues to be an active governance and accountability issue rather than a background legal function.

Access rights remain one of the clearest indicators of programme weakness

Access rights remain a central theme in the DPC’s report. Subject access requests account for the largest share of national complaints, representing 34% of all complaints received, followed by fair processing at 17% and the right to erasure at 14%. By the end of 2024, the DPC had received 914 new complaints solely related to the right of access and concluded 904.

The DPC also explains why so many of these complaints arise. In many cases, organisations either fail to respond within the required timeframe or apply redactions and exemptions without giving sufficiently clear explanations. The report is explicit that it is not enough merely to list an exemption or cite legislation. The reason the exemption is being applied should be clearly explained, and those decisions should be documented.

The DPC case studies make this even more concrete. A hospital only provided the requested data after DPC intervention, despite the urgency of the matter. A financial services provider withdrew its reliance on restrictions and released previously withheld personal data only after the DPC challenged the legal basis for withholding it. Another organisation initially over-redacted records and later re-released them in partially redacted format after engagement with the DPC on the balancing exercise required by Article 15(4) GDPR.

The EDPB report places this in a wider European context. One of its 2024 highlights was the launch of a coordinated enforcement action on the right of access in February 2024. That is a useful signal. Access rights are not just a recurring domestic irritation; they remain a live topic of coordinated supervisory interest across Europe.

In practice, access rights often expose broader weaknesses in privacy governance. The legal right itself is not usually the main difficulty. The recurring problem is the organisation’s operational ability to comply consistently. Common issues include:

  • uncertainty over who owns the request internally
  • incomplete searches across mailboxes, shared drives or business systems
  • over-reliance on individual employees who “know where the data is”
  • poorly documented decisions on exemptions and redactions
  • difficulty explaining why certain information has been withheld
  • delays caused by weak coordination between legal, HR, operations and IT

A common issue is that organisations assume DSAR handling is mainly about retrieval. In practice, it is equally about judgement, explanation and evidencing of reasoning. Where that reasoning is weak or overly informal, the response often becomes difficult to defend.

Another recurring issue is that access rights are treated as episodic rather than systemic. If there is no repeatable internal workflow, organisations end up re-solving the same problem each time a request comes in. That is one reason DSAR performance often becomes such a useful indicator of the maturity of a privacy programme more generally.

Access rights are often one of the first places where weak accountability becomes visible, so what can you do? Review whether your access request process is genuinely operationalised:

  • Is ownership clear?
  • Can you demonstrate adequate searches?
  • Are exemptions and redactions documented properly?
  • Can the reasoning be explained clearly to the individual and, if necessary, to the DPC?

Access rights remain a key area of exposure. Subject access requests account for 34% of complaints received by the DPC. This makes DSAR handling a practical indicator of whether privacy governance is functioning effectively, particularly in relation to timelines, searches, exemptions, redactions and decision-making quality.

Breach trends still point to ordinary operational weakness

The DPC received 7,781 valid breach notifications in 2024, an 11% increase on the previous year. Of those notifications, 81% were concluded by year end. The most important practical point, however, lies in the cause of those breaches. Fifty per cent of notified cases arose because correspondence was sent to the wrong recipient.

The DPC’s breach chapter develops that point in more detail. The highest category of breaches continues to involve unauthorised disclosures affecting single individuals or small groups, with poor operational practices and human error remaining prominent. The detailed breakdown shows:

  • 32% postal material to incorrect recipient
  • 14% email to incorrect recipient
  • 10% accidental or unauthorised alteration
  • 8% accidental loss or destruction
  • 5% hacking

This is one of the most useful themes in the annual report because it corrects a common misconception. Many organisations continue to associate privacy incidents primarily with cyber compromise. The DPC’s figures show that ordinary administrative failures remain an equally important source of regulatory exposure.

The DPC’s case studies reinforce this. In the broadcasting-sector phishing case, an employee was deceived into disclosing credentials, leading to unauthorised access to personal and special category data, with the DPC pointing to improved filters, training and revised procedures as part of the response. In another case, a third-level institution published non-anonymised survey data, leading to a review of internal reporting processes and stronger controls. In a further case study, the DPC addressed the forwarding of work emails and special category data to a personal email account, again highlighting the importance of both technical and organisational controls.

In practice, many organisations have incident notification procedures but less mature arrangements for reducing repeat incidents over time. The response process exists, but the learning loop is weaker. Commonly observed patterns include:

  • good incident logging, but weak analysis of recurring themes
  • legal and privacy teams involved late rather than early
  • corrective actions agreed in principle, but not tracked to closure
  • overly narrow focus on whether a breach is reportable, rather than why it happened
  • underinvestment in very ordinary controls, especially around correspondence, manual handling, exports and publication

A recurring issue is that operational teams may not see privacy incidents as part of governance. They may be treated as isolated mistakes rather than symptoms of a process or control issue. That makes repeat incidents much more likely.

Another common difficulty is that breach reporting to senior management can be descriptive rather than managerial. An incident is noted, but the question of whether similar incidents are reducing over time is not always asked clearly enough.

Breach management is not just about notification; it is also about demonstrable reduction in repeated failure. A useful exercise is to review your incident landscape in practical terms:

  • What are your most common breach types?
  • Are the same errors recurring?
  • Which ones are administrative and therefore more controllable?
  • Can you show that lessons learned have led to specific changes?

Breach trends remain heavily operational. The DPC received 7,781 valid breach notifications in 2024, with 50% arising from correspondence being sent to the wrong recipient. This supports continued focus on administrative controls, checking procedures, staff discipline and repeat-incident reduction rather than relying solely on breach notification workflows.

The case studies show that basic accountability failures remain common

The value of the DPC’s case studies is that they move the discussion away from abstract trends and show what organisations are still getting wrong in concrete terms. They reveal repeated issues in timing, documentation, explanation, role clarity and ordinary operational judgement.

Access request case studies show delayed responses, weak searches, poor handling of exemptions, and over-redaction. The controller/processor case study shows the continuing importance of properly identifying roles and understanding who must actually respond to a rights request. The personal-email case study illustrates how ordinary staff behaviour can create significant exposure, especially where special category data is involved. The rectification case study shows how customer service issues can quickly become privacy complaints where inaccurate personal data causes practical harm.

These are not exotic scenarios. That is precisely why they are useful. They remind organisations that a significant part of data protection compliance still comes down to the quality of routine governance and service delivery.

In practice, one of the recurring mistakes is to assume that serious data protection risk only arises in large-scale or technologically complex contexts. Very often, the issue is much more ordinary:

  • a request is not picked up on time
  • the search is incomplete
  • the explanation is weak
  • the agreement is unclear
  • the wrong document is shared
  • the control exists on paper but not in day-to-day behaviour

We also see organisations separate service problems from privacy problems too sharply. In reality, the boundary is often thin. Inaccurate data, weak complaint-handling, poor customer correspondence or unclear internal ownership can all become data protection issues very quickly when they affect rights or outcomes.

You can use the DPC’s case studies as a practical audit tool:

  • Which of these failures could happen here?
  • Which already have?
  • Are our controls strong enough to prevent them?
  • Are our staff sufficiently clear on what to do when something goes wrong?

EU coordination is becoming more important, not less

The DPC’s annual report reflects an Irish regulator operating in an increasingly coordinated European environment. The EDPB’s report makes that development more explicit.

The EDPB’s 2024 – 2027 strategy focuses on harmonisation, common enforcement culture, technological challenges and cross-regulatory cooperation. It also highlights the expanding responsibilities of data protection authorities in the context of the AI Act, DMA, DSA, Data Act and other digital frameworks. The Board notes that it issued 28 consistency opinions in 2024, including eight under Article 64(2), designed to address matters of general application or major cross-border relevance.

The report also underlines the role of coordinated enforcement. The 2024 highlights include a coordinated enforcement report on the role of DPOs, the launch of a coordinated enforcement action on the right of access, and the adoption of Opinion 28/2024 on AI models in December 2024.

This is important because it means organisations should increasingly assume that core GDPR issues are being understood within a shared European framework. That affects not only multinational businesses. It also affects domestic organisations whose practices touch issues that are receiving coordinated regulatory attention, such as access rights, AI, profiling, children’s data or cross-border services.

In practice, organisations often track DPC developments more closely than EDPB developments. That is understandable, but it can leave a gap in strategic awareness. We often see privacy governance shaped around domestic complaint themes, immediate contractual issues, specific incidents, and/or sector expectations. What can get missed is the extent to which EU-level consistency work shapes the direction of travel. That means organisations sometimes respond to a theme too late, after it has already become part of a broader supervisory agenda.

Another practical issue is that some organisations still assume that “European” developments only matter where large-scale cross-border processing is involved. Increasingly, that is too narrow. If a theme is receiving EDPB-level attention, it often signals a broader expectation of consistency that will eventually affect ordinary organisational practice as well.

Do not read the DPC report in isolation. The more useful questions are:

  • what themes are being reinforced at EU level?
  • where is consistency increasing?
  • what does that suggest about where scrutiny is likely to deepen next?

The regulatory environment is becoming more coordinated across the EU. The EDPB’s 2024 – 2027 strategy focuses on harmonisation, enforcement culture, technological challenges and cross-regulatory cooperation. Organisations should assume that core privacy risks are increasingly being assessed in a more consistent European framework.

AI has moved firmly into mainstream privacy governance

One of the clearest themes in both the DPC and EDPB material is the centrality of AI to current regulatory thinking.

The DPC’s foreword states that regulation of AI model training attracted a great deal of public interest in 2024 and notes that new inquiries were commenced into AI models, biometrics and the security of sensitive health data. Its 2024 timeline records DPC engagement with Meta’s LLM plans, High Court proceedings concerning X’s Grok processing, the launch of an inquiry into Google’s AI model, and the DPC’s request to the EDPB for an Article 64(2) opinion on the use of personal data for development and deployment of AI models.

The EDPB annual report adds the broader European layer. Its foreword explains that the Board adopted an opinion on the use of personal data to train AI models in order to support responsible AI innovation while ensuring protection of personal data and compliance with the GDPR. The same section notes that AI developers can use legitimate interests as a legal basis for model training under certain conditions and that the EDPB set out a structured three-step test to help developers determine lawful use.

This is an important regulatory message. AI is not treated as outside GDPR. Nor is it treated as unlawful by default. It is treated as something that must be governed within ordinary accountability structures, using disciplined analysis rather than assumption.

The EDPB also makes clear that the AI Act and other digital legislation are expanding the responsibilities of DPAs and intensifying cross-regulatory cooperation. For organisations, that means AI governance is likely to become more, not less, integrated with privacy, product, risk and regulatory oversight.

In practice, AI adoption often moves faster than governance. Organisations begin using AI tools, copilots, model-based services or AI-enabled vendors before internal accountability arrangements have caught up. Recurring issues include:

  • uncertainty over lawful basis
  • weak transparency analysis
  • unclear supplier roles and sub-processing chains
  • underdeveloped DPIAs or no DPIA refresh at all
  • insufficient clarity on what personal data is being used, where, and for what purpose
  • treating AI as an innovation topic first and a governance topic second

A common issue is that organisations may have sensible general privacy controls but have not yet adapted them to AI-related realities. For example, vendor diligence may not yet ask the right questions about model training, retention, downstream use or human oversight. Similarly, internal teams may not yet distinguish clearly between use of an AI-enabled tool and development or deployment of an AI system with a materially different risk profile.

What to do? Map where AI is already present:

  • internal tools
  • external vendors
  • product features
  • workflow automation
  • model-assisted decisions

Then ask:

  • is this captured in our privacy governance?
  • has lawful basis been assessed properly?
  • do our notices and internal records reflect reality?
  • do we know what our vendors are doing with personal data?

AI is now part of mainstream privacy governance. Both the DPC and EDPB treated AI model development and deployment as core regulatory topics in 2024. AI-related processing should therefore be governed through existing privacy, risk and accountability structures rather than treated as a separate informal innovation stream.

DPIAs, role clarity and processor accountability remain highly practical issues

Even where the annual reports do not dwell on DPIAs as a standalone theme, they reinforce the wider accountability expectations that make DPIAs and role clarity so important.

The DPC’s access-related case studies show that role confusion still arises, particularly around controller and processor responsibilities. In one case, the DPC accepted that an organisation was acting as a processor and had complied with its obligations by referring the request to the controller, supported by a detailed written agreement setting out roles and instructions. This is a useful reminder that outsourcing does not remove responsibility. Rather, it increases the need for clear role definition and operational coordination.

The DPC’s annual report also shows how much emphasis continues to fall on practical explanations, evidence, and ability to justify decisions. That same logic applies to DPIAs and other risk assessments. It is no longer sufficient to have a template completed somewhere in the project file. The question is whether risk has been assessed at the right time, whether alternatives have been considered, whether decisions can be followed, and whether safeguards are reflected in actual controls.

In practice, role clarity and risk assessment still cause difficulty. DPOs commonly see:

  • processor agreements that exist, but do not really support day-to-day rights handling or breach response
  • unclear internal understanding of who is controller, processor or joint controller in more complex service chains
  • DPIAs drafted too late in the project lifecycle
  • risk assessments that identify issues but do not clearly drive design or operational change
  • mitigations that are described, but not obviously tied to real controls

These issues often become more acute in AI-related, vendor-heavy or fast-moving projects. Where several parties are involved, or where technology adoption is proceeding quickly, the temptation is often to finalise role allocation and risk analysis after the main decisions have already been made.

Useful review points are:

  • whether processor/controller roles are clearly documented and understood
  • whether key agreements support rights handling, incident management and accountability
  • whether DPIAs are being carried out early enough and updated where processing changes
  • whether risk assessments are functioning as decision tools rather than paperwork

The common message is visible, measurable and auditable accountability

Taken together, the DPC and EDPB material points in a common direction. Privacy programmes are increasingly expected to be visible to decision-makers, measurable in practice and capable of withstanding scrutiny.

The DPC’s values include accountability, fairness, consistency and transparency. The EDPB strategy places emphasis on harmonisation, enforcement, practical guidance and technological readiness. Both sets of material suggest that regulators are looking beyond the existence of policies. The more important question is whether organisations can show how privacy governance actually works.

That means showing:

  • how rights are handled
  • how incidents are learned from
  • how high-risk processing is assessed
  • how senior management is informed
  • how improvements are tracked
  • how accountability is evidenced over time

In practice, board and executive visibility remains uneven. Many organisations do report privacy issues upwards, but the reporting is not always sufficiently management-focused. Common weaknesses include:

  • narrative-heavy reporting with limited metrics
  • updates that describe activity but do not clearly show trend or risk
  • breach reporting without repeat-incident analysis
  • rights reporting without process-health indicators
  • DPO reporting lines that technically exist but do not create real organisational visibility

A recurring issue is that privacy becomes visible to leadership after an incident, but less visible in advance of one. That makes it harder for the organisation to demonstrate proactive accountability.

In your organisation, ask the following:

  • What does the board actually see on privacy?
  • Are privacy metrics meaningful and decision-useful?
  • Can the organisation show trends, not just isolated events?
  • Is accountability visible before regulatory or reputational issues arise?

Current expectations increasingly favour privacy programmes that are visible, measurable and auditable. Regulators are looking beyond policies to whether organisations can show functioning governance, practical control, clear ownership and evidence of remediation.

Summary

The DPC and EDPB annual reports are useful not only because they describe the last year of regulatory activity. They are useful because they show where pressure continues to build and what kinds of organisational weakness remain most likely to matter.

Many of the issues that continue to generate complaints, breaches and supervisory attention are not new. They are recurring weaknesses in access handling, explanation, operational discipline, accountability, role clarity and governance visibility. What is changing is the environment in which those weaknesses are being judged. It is becoming more coordinated, more structured and more technologically alert.

For many organisations, the real challenge is not understanding GDPR in principle. It is embedding that understanding into ordinary governance, processes, decision-making and reporting. That is what the DPC and EDPB annual reports help to illuminate, and that is why they remain worth reading carefully.

This article is intended to support the learning covered in Hour 1 of our XpertAcademy CPD programme. The relevant CPD certificate is issued for completion of the full one-hour session on XpertAcademy, rather than for reading this article on its own. You can return to the course here: CPD Event A: Full-Day Regulatory Privacy Training.

Ready to start your Data Protection journey with us?

Data Protection Officer Services