Module 20 of 26 · Applied

Risk, ethics, and strategic value

15 min read · 3 outcomes · Interactive + drag challenge · 5 standards cited

By the end of this module you will be able to:

  • Explain the right to erasure under GDPR
  • Connect data ethics to business risk and reputation
  • Evaluate data-related risk scenarios
Technology office with screens showing data privacy compliance

Real-world consequence · January 2023

Meta fined €390 million for using personal data without valid consent.

In January 2023, the Irish Data Protection Commission fined Meta €390 million for processing personal data for behavioural advertising without valid consent. Meta had relied on "contractual necessity" as the legal basis, arguing that personalised ads were part of the service users agreed to.

The DPC ruled that advertising was not necessary for the performance of the contract (providing a social media service). This module connects the ethics principles from Foundations to applied risk assessment and strategic decision-making.

Meta argued that agreeing to Terms of Service constituted consent for personalised advertising. The Irish DPC disagreed. What is the difference between contractual necessity and consent under GDPR?

Data risk is not just about breaches and fines. It includes reputational damage from unethical use, strategic loss from poor data quality, and operational failure from governance gaps. This module integrates the ethical principles from Module 11 with the practical governance and architecture knowledge from the Applied stage.

With the learning outcomes established, this module begins by examining the right to erasure in depth.

20.1 The right to erasure

GDPR Article 17 grants data subjects the right to request deletion of their personal data (the "right to be forgotten"). The controller must comply without undue delay when: the data is no longer necessary for its original purpose, consent is withdrawn, the data was unlawfully processed, or erasure is required by law.

Implementing erasure is harder than it sounds. Module 9 established that data exists in production databases, backups, data warehouses, audit logs, and derived datasets. Erasing from one location while copies persist elsewhere does not satisfy the requirement. Organisations need a deletion inventory that maps every location where personal data resides.
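A deletion inventory of this kind can be represented very simply. The sketch below is illustrative only: the system names and data categories are hypothetical, and a real inventory would be maintained in a data catalogue rather than in code.

```python
# Hypothetical deletion inventory: every system that holds a given
# category of personal data. An erasure request is complete only when
# the data is gone from all of them, including backups and derived sets.
DELETION_INVENTORY = {
    "customer_profile": ["crm", "data_warehouse", "backups", "ml_training_set"],
    "order_history": ["orders_db", "data_warehouse", "backups"],
}

def outstanding_erasures(category: str, erased_from: list[str]) -> list[str]:
    """Return the systems still holding this category; empty means complete."""
    required = set(DELETION_INVENTORY.get(category, []))
    return sorted(required - set(erased_from))
```

Run against the scenario in section 20.3 (CRM and warehouse deleted), this check would immediately flag the backups and the training dataset as outstanding.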

The controller shall have the obligation to erase personal data without undue delay where the data subject withdraws consent and where there is no other legal ground for the processing.

GDPR Regulation (EU) 2016/679 - Article 17(1)(b)

The right to erasure is not absolute: it must be balanced against other legal obligations (tax records must be retained regardless of erasure requests), freedom of expression, and public health interests. But when it applies, the obligation is immediate and thorough.

Common misconception

We can anonymise the data instead of deleting it to satisfy erasure requests.

Anonymisation can satisfy erasure requirements only if the anonymisation is irreversible. If the data can be re-identified using available external datasets (as Latanya Sweeney demonstrated with just three demographic fields), it remains personal data. Pseudonymisation (replacing identifiers with tokens) does not satisfy erasure because the data remains identifiable via the token mapping.
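The distinction is easy to see in code. This minimal sketch (the record fields and function names are illustrative, not a real library) shows why pseudonymised data remains personal data: as long as the token mapping exists, the original identity can be recovered.

```python
import secrets

def pseudonymise(records: list[dict]) -> tuple[list[dict], dict]:
    """Replace email identifiers with random tokens.

    Returns the tokenised records and the token mapping. Because the
    mapping allows re-identification, the output is still personal data.
    """
    mapping = {}  # token -> original identifier; must be strictly protected
    out = []
    for rec in records:
        token = secrets.token_hex(8)
        mapping[token] = rec["email"]
        out.append({**rec, "email": token})
    return out, mapping

def reidentify(record: dict, mapping: dict) -> dict:
    """Reverse pseudonymisation using the token mapping."""
    return {**record, "email": mapping[record["email"]]}
```

Deleting only the mapping does not automatically anonymise the tokenised records: if they can still be linked back to individuals via other fields or external datasets, they remain personal data.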

With an understanding of the right to erasure in place, the discussion can now turn to data risk and reputation, which builds directly on these foundations.

Digital network visualisation representing the complexity of data deletion across interconnected systems
Erasing personal data requires identifying every copy across production, backups, archives, and derived datasets. The complexity scales with system interconnection.

20.2 Data risk and reputation

Data risk operates at three levels: regulatory (fines and enforcement actions), reputational (loss of customer trust and media scrutiny), and strategic (missed opportunities from poor data quality or governance gaps).

The Meta fine illustrates regulatory risk. The Cambridge Analytica scandal (2018) illustrates reputational risk: Facebook lost approximately $100 billion in market capitalisation in the ten days following the revelation that 87 million users' data was harvested without consent. The NHS mortality statistics error (Module 8) illustrates strategic risk: incorrect data led to incorrect conclusions about hospital performance.

Data protection is not a barrier to innovation. It is a framework for responsible innovation that maintains public trust.

Elizabeth Denham, UK Information Commissioner (2020) - ICO Annual Report foreword

Denham's framing is important for practitioners: compliance is not the enemy of data value. Organisations that build trust through responsible data practices can do more with data, not less, because customers, regulators, and partners are willing to share more when trust is established.

Common misconception

GDPR compliance is just a cost centre with no business value.

Organisations with strong data governance report faster decision-making (because trusted data requires less verification), lower customer acquisition costs (because trust enables data sharing), and reduced regulatory exposure. A 2022 Cisco study found that organisations investing in privacy saw an average return of 1.8x on their privacy spending. Compliance is an investment, not a cost.

Corporate boardroom session reviewing data governance dashboards tracking regulatory fines, reputational damage, and strategic losses
Data risk encompasses regulatory fines, reputational damage, and strategic losses from poor governance. Board-level oversight across all risk categories is essential for responsible data management.
20.3 Check your understanding

A customer submits a GDPR erasure request. Your team deletes their record from the CRM and the data warehouse. A month later, an audit discovers the customer's data still exists in backup tapes and a machine learning training dataset. Is the erasure complete?

Cambridge Analytica harvested 87 million Facebook users' data in 2018. Facebook lost approximately $100 billion in market capitalisation in ten days. Which type of data risk does this primarily illustrate?

A company pseudonymises customer data by replacing names with random tokens, then stores the token mapping in a separate database. A GDPR erasure request arrives. Can the company satisfy the request by deleting only the token mapping?


Key takeaways

  • GDPR Article 17 grants the right to erasure. Implementation requires identifying every copy of personal data across production, backups, archives, and derived datasets. Pseudonymised data remains personal data under GDPR.
  • Data risk operates at three levels: regulatory (fines), reputational (trust loss, market cap decline), and strategic (poor decisions from bad data). Reputational risk typically produces the largest single-event financial impact.
  • Meta's €390 million fine (2023) and the Cambridge Analytica scandal (2018) demonstrate that data ethics failures have concrete financial consequences far beyond regulatory penalties.
  • Privacy investment generates returns. Cisco's 2022 study found an average 1.8x return on privacy spending. Compliance is an investment in trust, not merely a cost.

Standards and sources cited in this module

  1. GDPR Regulation (EU) 2016/679

    Article 17 (Right to erasure), Recital 26 (Pseudonymisation)

    Legal basis for erasure rights and the distinction between pseudonymisation and anonymisation.

  2. Irish Data Protection Commission, Decision on Meta (January 2023)

    Full decision

    Source for the €390 million fine and the ruling that contractual necessity does not cover behavioural advertising.

  3. Cisco, 'Data Privacy Benchmark Study' (2022)

    ROI section

    Source for the 1.8x average return on privacy spending figure.

  4. UK ICO Annual Report 2020

    Commissioner's foreword

    Elizabeth Denham's framing of data protection as enabling responsible innovation.

  5. FTC Settlement with Facebook (2019)

    Full order

    The $5 billion FTC settlement following the Cambridge Analytica scandal. Context for reputational vs regulatory risk discussion.
