Module 23 of 25 · Practice & Strategy

Privacy, ethics, and auditability

30 min read · 4 outcomes · GDPR data mapper + drag challenge · 5 standards cited

By the end of this module you will be able to:

  • Apply the seven UK GDPR Article 5 data protection principles to a system design
  • Implement privacy by design under GDPR Article 25 using data minimisation and PETs
  • Describe the Computer Misuse Act 1990 provisions relevant to authorised security testing
  • Design a responsible disclosure programme that protects both researchers and the organisation
Privacy and data protection

British Airways data breach, August to September 2018

British Airways: Magecart JavaScript injection, 500,000 customers, £20 million ICO fine

In September 2018, British Airways disclosed a data breach affecting approximately 500,000 customers. An attacker had injected malicious JavaScript into the British Airways booking website, which silently harvested payment card details and personal information as customers entered them during checkout. The code was active for two weeks before detection.

In October 2020, the Information Commissioner's Office issued British Airways a fine of £20 million under GDPR Article 83, reduced from an initial notice of £183 million. The ICO found that British Airways had failed to implement adequate technical and organisational measures to protect personal data: specifically the absence of multi-factor authentication on the systems accessed, insufficient access control, and inadequate event logging.

The attack exploited the intersection of Article 5(1)(f) (the integrity and confidentiality principle) and Article 25 (privacy by design): adequate controls were available but had not been implemented.

UK GDPR Article 5: the seven principles

The UK GDPR (the EU GDPR as retained in UK domestic law by the European Union (Withdrawal) Act 2018, enforced by the ICO) applies to any processing of personal data about UK data subjects regardless of where the controller is located. Article 5 defines seven principles governing all personal data processing:

  1. Lawfulness, fairness, and transparency (5(1)(a)): processing must have a lawful basis; data subjects must be informed.
  2. Purpose limitation (5(1)(b)): data collected for one purpose may not be repurposed without a fresh legal basis.
  3. Data minimisation (5(1)(c)): collect only what is adequate and necessary for the stated purpose.
  4. Accuracy (5(1)(d)): personal data must be accurate and kept up to date.
  5. Storage limitation (5(1)(e)): data retained no longer than necessary for the purpose.
  6. Integrity and confidentiality (5(1)(f)): appropriate technical and organisational security measures required against unauthorised or unlawful processing, accidental loss, destruction, or damage.
  7. Accountability (5(2)): the controller must be able to demonstrate compliance with all other principles.

Article 25 specifies what "appropriate" means in design terms: data protection measures must be implemented at the time of system design, not retrofitted. This is privacy by design. Privacy by default means protective settings are the default state; users must opt in to additional data collection rather than opting out.
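
Privacy by default can be expressed directly in code: the protective settings are the constructor defaults, and additional data collection requires an explicit opt-in action. A minimal sketch in Python (the field names and values are illustrative, not from any real product):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Protective values are the defaults, per Article 25(2)."""
    share_analytics: bool = False    # no analytics sharing without explicit consent
    personalised_ads: bool = False   # no profiling by default
    retention_days: int = 30         # shortest retention that serves the purpose

# A new account starts in the most protective state:
default = PrivacySettings()
assert not default.share_analytics and not default.personalised_ads

# Opting in is an explicit, recordable action by the data subject:
opted_in = PrivacySettings(share_analytics=True)
```

The point is structural: a reviewer can verify privacy by default by reading the type definition, rather than auditing every code path that constructs a settings object.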


Privacy Enhancing Technologies (PETs)

PETs are technical controls that minimise personal data use or protect it in ways that enable useful processing while reducing privacy risk. They implement privacy by design at the technical layer.

  • Pseudonymisation: replace identifiers with pseudonyms; store the mapping separately. Enables analytics without direct identification; data remains personal data under UK GDPR.
  • Anonymisation (k-anonymity, l-diversity): remove or generalise data until re-identification is not feasible. Anonymised data falls outside UK GDPR. Suitable for releasing aggregate research datasets.
  • Differential privacy: add calibrated statistical noise to query results. Used for publishing statistics without revealing individual records (deployed at national scale by the US Census Bureau for the 2020 census).
  • Homomorphic encryption: compute on encrypted data without decrypting. Enables cloud analytics on sensitive data.
  • Tokenisation: replace sensitive values (payment card numbers, national insurance numbers) with tokens that have no mathematical relationship to the original value; reversal requires the separately secured token vault. PCI DSS requires cardholder data at rest to be rendered unreadable, for example via tokenisation or strong encryption.
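
To make the first technique concrete, pseudonymisation can be sketched with a keyed hash: the pseudonym is deterministic (so analytics joins still work) but cannot be reversed without the secret key, which stands in for the separately stored mapping. This is a sketch only; the key value and its storage location are illustrative.

```python
import hashlib
import hmac

# Whoever holds this key can link pseudonyms back to identifiers, which is
# why pseudonymised data remains personal data under UK GDPR. In practice
# the key lives in a separate key-management system. (Illustrative value.)
PSEUDONYM_KEY = b"held-in-a-separate-key-management-system"

def pseudonymise(identifier: str) -> str:
    """Deterministic keyed hash: same input -> same pseudonym, enabling joins."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The same customer always maps to the same pseudonym...
assert pseudonymise("customer-1042") == pseudonymise("customer-1042")
# ...while distinct customers get distinct pseudonyms:
assert pseudonymise("customer-1042") != pseudonymise("customer-1043")
```

An HMAC rather than a plain hash matters here: an unkeyed hash of a low-entropy identifier (a phone number, say) can be reversed by brute force, which would undermine the pseudonymisation entirely.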

The UK NHS COVID-19 contact tracing app (2020) used a decentralised architecture as a privacy by design choice: proximity detection exchanged anonymous random identifiers via Bluetooth Low Energy; matches were computed on the user's own device. The NHS never received a list of who had been in proximity to whom. The privacy trade-off was explicitly documented: lower epidemiological data utility in exchange for stronger privacy guarantees and higher voluntary adoption.
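
The decentralised matching described above reduces to a local set intersection: the device keeps the random identifiers it has heard over Bluetooth and compares them against a published list of identifiers from confirmed cases, never uploading its own contact history. A deliberately simplified sketch (real protocols such as the Google/Apple Exposure Notification framework rotate identifiers and derive them cryptographically; the identifier values here are made up):

```python
# Identifiers this device has heard over Bluetooth Low Energy (stored locally):
heard_nearby = {"a9f3", "77c2", "d01e", "5b6a"}

# Identifiers published centrally for confirmed positive cases:
published_positive = {"d01e", "ffff", "0123"}

# Exposure is computed entirely on the device; the contact list never leaves it.
exposure_matches = heard_nearby & published_positive
at_risk = len(exposure_matches) > 0
```

The privacy property falls out of the data flow: the central service only ever learns the identifiers of people who chose to report a positive test, never the social graph of who met whom.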


The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.

UK GDPR, Article 25(2): Data Protection by Default

Data protection by design and by default requires you to put in place appropriate technical and organisational measures to implement data protection principles effectively and safeguard individual rights. You must do this both at the time you are determining the means for processing and at the time of the processing itself.

ICO Guide to the UK GDPR, Article 25: Data Protection by Design and by Default (2021)

Computer Misuse Act 1990 and responsible disclosure

The Computer Misuse Act 1990 is the primary UK legislation criminalising unauthorised computer access. It creates three main offences: Section 1 (unauthorised access to computer material), Section 2 (unauthorised access with intent to commit further offences), and Section 3 (unauthorised acts with intent to impair the operation of a computer). The word "unauthorised" is critical: security testing performed with explicit written authorisation from the system owner is not an offence.

Lawful penetration testing requires: a scope agreement in writing specifying which systems, IP ranges, and test types are authorised; time-bounded authorisation; rules of engagement; and separate consideration of third-party services. Authorisation from the target system's owner does not extend to third-party SaaS platforms, cloud providers, or payment processors that share infrastructure. Bug bounty scope boundaries are legally binding: a researcher who tests systems outside a stated scope may be prosecuted under the Computer Misuse Act even if they find a genuine vulnerability and report it responsibly.
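
The scope agreement is best enforced in tooling as well as on paper: a test harness can refuse to send any payload to a target outside the authorised host list and IP ranges. A hedged sketch, reusing this module's worked-example hostnames (the IP range is from the documentation block and is illustrative):

```python
import ipaddress

# Written authorisation covers explicit hosts and IP ranges only.
AUTHORISED_HOSTS = {"api.bankname.com"}
AUTHORISED_RANGES = [ipaddress.ip_network("203.0.113.0/24")]  # illustrative range

def in_scope(target: str) -> bool:
    """Return True only for targets the written scope agreement covers."""
    if target in AUTHORISED_HOSTS:
        return True
    try:
        addr = ipaddress.ip_address(target)
    except ValueError:
        return False  # unrecognised hostname: out of scope by default
    return any(addr in net for net in AUTHORISED_RANGES)

assert in_scope("api.bankname.com")
# The proxied third-party credit check API is never in scope:
assert not in_scope("api.creditcheck-vendor.com")
```

Note the default-deny posture: anything not explicitly authorised is out of scope, mirroring how the Computer Misuse Act treats authorisation.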

A well-designed bug bounty programme includes: clear written scope with explicit in-scope and out-of-scope asset lists; safe harbour language stating the organisation will not pursue legal action against researchers acting within scope and in good faith; a reward structure with minimum and maximum bounty amounts per severity tier; a response SLA (triage within X days, status update within Y days, payment within Z days of patch release); and an acknowledgement policy. Responsible disclosure (coordinated vulnerability disclosure) allows researchers to set a disclosure deadline (typically 90 days as per Google Project Zero policy) after which details are published regardless of patch status.
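
The disclosure timeline reduces to simple date arithmetic: from the report date, the programme's SLA milestones and the researcher's disclosure deadline are fixed points both sides can compute. A sketch using the 90-day convention cited above (the triage figure is illustrative, standing in for the programme-specific SLA values):

```python
from datetime import date, timedelta

def disclosure_schedule(reported: date, triage_days: int = 5,
                        disclosure_days: int = 90) -> dict:
    """Key dates for a coordinated disclosure, derived from the report date."""
    return {
        "reported": reported,
        "triage_due": reported + timedelta(days=triage_days),
        # Published on this date regardless of patch status:
        "public_disclosure": reported + timedelta(days=disclosure_days),
    }

schedule = disclosure_schedule(date(2024, 1, 10))
assert schedule["public_disclosure"] == date(2024, 4, 9)
```

Computing the deadline up front, rather than negotiating it after the fact, is part of what makes the safe harbour credible to researchers.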

Common misconception

Security logs and GDPR accountability records must be maintained as separate systems with separate processes.

Maintaining separate 'GDPR logs' and 'security logs' as distinct systems with different retention and access controls is expensive and creates inconsistency when both records need to support an ICO investigation. Design a single structured logging pipeline satisfying both security detection requirements (who accessed what, when, from where) and GDPR accountability requirements (Article 5(2) evidence of compliance) from the same source events. Apply appropriate access controls to ensure personal data in logs is accessible only to authorised personnel. This approach reduces duplication and ensures the two records are never inconsistent.
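
A single structured event can carry both sets of fields: the security detection questions (who, what, when, from where) alongside the GDPR accountability fields (lawful basis, purpose) in one record. A minimal sketch of such an event; the field names are illustrative, not a standard schema:

```python
import json
from datetime import datetime, timezone

def access_event(actor: str, resource: str, source_ip: str,
                 lawful_basis: str, purpose: str) -> str:
    """One JSON event serving security detection and Article 5(2) accountability."""
    event = {
        # Security detection: who accessed what, when, from where
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "resource": resource,
        "source_ip": source_ip,
        # GDPR accountability: demonstrable lawful basis and purpose
        "lawful_basis": lawful_basis,
        "purpose": purpose,
    }
    return json.dumps(event)

record = access_event("analyst-17", "customer/1042/profile",
                      "10.0.4.9", "contract", "billing-enquiry")
```

Because both records derive from the same source event, a security investigation and an ICO accountability request can never contradict each other.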

Common misconception

Encrypting personal data with a strong cipher (AES-256) satisfies the GDPR Article 5(1)(f) integrity and confidentiality principle.

Encryption addresses confidentiality at rest. Article 5(1)(f) also requires protection against unauthorised or unlawful processing, accidental loss, destruction, or damage. A well-encrypted database that is accessible to every authenticated employee with no access controls, no access logging, and no data minimisation satisfies the encryption requirement but violates Article 5(1)(f) in every other dimension. The British Airways ICO enforcement action specifically cited absent MFA, insufficient access controls, and inadequate logging as separate failures, not a single deficiency. Compliance requires the full combination of controls, not any single technical measure.

Privacy compliance requires technical controls (encryption, access logs) combined with procedural controls (DPIAs, retention policies) to meet GDPR and UK DPA 2018 requirements.
Check your understanding

A health technology company collects continuous heart rate data from wearable devices to detect atrial fibrillation. The data science team wants to retain data indefinitely to train future models. The medical team requires 8-year retention for clinical records. Which UK GDPR Article 5 principles are most directly in tension with indefinite retention for model training, and how should the conflict be resolved?

A freelance security consultant is engaged to test a fintech company's mobile banking API at api.bankname.com. During testing, the consultant discovers that api.bankname.com proxies requests to a third-party credit check provider at api.creditcheck-vendor.com. The consultant wants to send SQL injection payloads to the credit check API. Is this within scope, and what is the legal position under the Computer Misuse Act 1990?

A B2B SaaS company processes personal data of EU and UK data subjects under contracts with enterprise customers. The CTO proposes pseudonymising all personal data at rest: replacing customer identifiers with opaque tokens before storing in the analytics warehouse, keeping the mapping table in a separate database. The CTO argues this removes all GDPR obligations because the analytics data is anonymised. Is the CTO correct, and what is the accurate legal position?

Privacy is not just a legal requirement. It is a design principle that builds trust with users and reduces regulatory risk.

Key takeaways

  • UK GDPR Article 5 imposes seven binding data protection principles on all processing of personal data. The integrity and confidentiality principle (5(1)(f)) directly requires appropriate technical security measures against unauthorised processing and accidental loss.
  • Article 25 requires privacy to be built into system design by default. PETs such as pseudonymisation, differential privacy, and tokenisation implement this at the technical layer. The NHS COVID-19 app is a documented case of privacy by design trade-off analysis.
  • The Computer Misuse Act 1990 criminalises unauthorised computer access. Security testing without explicit written authorisation from the system owner is potentially unlawful regardless of intent or outcome. Bug bounty scope boundaries are legally binding.
  • A well-designed bug bounty programme provides safe harbour language, clear scope, defined response SLAs, and a reward structure. Responsible disclosure typically gives organisations 90 days to patch before public disclosure.
  • GDPR accountability (Article 5(2)) requires documentary evidence. Security audit logs and GDPR access records can and should be unified in a single structured logging pipeline, reducing duplication and ensuring consistency for ICO investigations.

You now understand the legal and ethical framework that governs security practice. The final technical module examines the non-functional requirements that determine whether a system can be secured at all. What are system quality attributes, and why do they constrain every security decision? Module 24 covers system quality attributes.

Standards and sources cited in this module

  1. UK GDPR: Articles 5 and 25

    Article 5: seven data protection principles. Article 25: privacy by design and by default requirements.

  2. ICO British Airways Monetary Penalty Notice (October 2020)

    GDPR Article 5(1)(f) enforcement: £20 million fine for failure to implement adequate technical security measures.

  3. NIST Privacy Framework 1.0

    Privacy risk management framework: Protect-P function and privacy control mapping.

  4. Computer Misuse Act 1990 (UK legislation)

    Sections 1, 2, and 3: unauthorised access, access with intent, and unauthorised acts offences.

  5. ISO/IEC 29100:2011: Privacy Framework

    International privacy engineering principles and PET categories mapped to privacy requirements.