Before I touch a keyboard, I ask one question: “Could I justify this to a lawyer
tomorrow?” If the answer is anything other than an easy yes, the test isn’t ready. Below is my practical crib sheet of the UK legislation that governs how we plan and execute security testing: a short, plain-English summary of each law, why it matters on an engagement, and a recent real-world hook.
Computer Misuse Act 1990 (CMA)
What it is: The backbone of UK cybercrime law. It criminalises unauthorised access and interference with computer systems—even if the intent is “helpful”.
Why it matters in testing: You must have explicit, written authorisation for everything in scope. No scope, no touch. Keep signed rules of engagement and an evidence trail.
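A minimal sketch of what that looks like in tooling terms, assuming the signed rules of engagement are exported to a machine-readable file (the file name, JSON fields and helper names below are my own illustration, not any standard):

```python
# scope_gate.py - illustrative only: refuse to run tooling against anything
# that is not listed in the signed, client-approved scope.
import ipaddress
import json
import sys
from datetime import datetime, timezone

def load_scope(path: str) -> dict:
    """Load the machine-readable copy of the signed rules of engagement."""
    with open(path, encoding="utf-8") as fh:
        return json.load(fh)

def in_scope(target_ip: str, scope: dict) -> bool:
    """True only if the target sits inside an approved range and the
    engagement window (ISO 8601 timestamps with UTC offsets) is open."""
    now = datetime.now(timezone.utc)
    start = datetime.fromisoformat(scope["window_start"])
    end = datetime.fromisoformat(scope["window_end"])
    if not (start <= now <= end):
        return False
    addr = ipaddress.ip_address(target_ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in scope["approved_cidrs"])

if __name__ == "__main__":
    scope = load_scope("engagement_scope.json")   # exported from the signed RoE
    target = sys.argv[1]
    if not in_scope(target, scope):
        sys.exit(f"{target} is out of scope or outside the approved window - stopping.")
    print(f"{target} confirmed in scope; authorised by {scope['authorised_by']}.")
```

Wiring a gate like this into every script means an out-of-scope target stops the tool, rather than relying on memory under deadline pressure.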
Recent example: In late 2023 a jury found that LAPSUS$ member Arion Kurtaj had committed multiple Computer Misuse Act offences and related crimes (he was deemed unfit to stand trial and received an indefinite hospital order), highlighting that “I was just testing” isn’t a defence without authorisation (City of London Police).
Police and Justice Act 2006 (amendments to CMA)
What it is: This Act strengthened the CMA—explicitly criminalising denial-of-service and broadening “unauthorised acts” against computers.
Why it matters in testing: Load and stress tests can stray into DoS territory. If a client wants resilience testing, make sure the method, timing, and thresholds are written into scope and approved.
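A rough sketch of keeping a resilience test inside the agreed envelope, assuming the client has signed off a specific requests-per-second ceiling and test window (the endpoint and numbers below are placeholders):

```python
# throttled_load.py - illustrative: a load probe that hard-enforces the
# request rate and stop time agreed in the written scope.
import time
import urllib.request

APPROVED_RPS = 5            # ceiling taken from the signed scope, not a guess
APPROVED_DURATION_S = 300   # agreed test window in seconds
TARGET = "https://staging.example.com/health"  # placeholder: in-scope endpoint only

def run() -> None:
    interval = 1.0 / APPROVED_RPS
    deadline = time.monotonic() + APPROVED_DURATION_S
    sent = errors = 0
    while time.monotonic() < deadline:
        started = time.monotonic()
        try:
            with urllib.request.urlopen(TARGET, timeout=5) as resp:
                _ = resp.status
        except Exception:
            errors += 1   # rising errors means back off, not push harder
        sent += 1
        # Sleep whatever is left of this request's slot so we never exceed the cap.
        time.sleep(max(0.0, interval - (time.monotonic() - started)))
    print(f"sent={sent} errors={errors} (cap {APPROVED_RPS} rps for {APPROVED_DURATION_S}s)")

if __name__ == "__main__":
    run()
```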
Recent example: Unauthorised denial-of-service remains a clear offence; prosecutors routinely rely on the 2006 amendments where an attack causes service disruption.
General Data Protection Regulation (GDPR – UK version)
What it is: GDPR sets the rules for collecting, processing, and protecting personal data across Europe. Since Brexit, the UK has its own version known as “UK GDPR” alongside the Data Protection Act 2018.
Why it matters in testing: Pen testers often see personal data in live environments. You are legally bound to minimise exposure, use data responsibly, and secure any artefacts you collect. Mishandling test data could trigger hefty fines for your client and liability for you.
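One practical way to honour the minimisation principle is to redact obvious personal identifiers before any captured output reaches the evidence store; a simple sketch (the patterns are deliberately basic and illustrative, not exhaustive):

```python
# redact.py - illustrative: strip obvious personal data from captured output
# before it goes into the evidence store, so reports carry proof, not people.
import re

EMAIL = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
UK_PHONE = re.compile(r"\b(?:\+44\s?\d{4}|\(?0\d{4}\)?)\s?\d{3}\s?\d{3}\b")

def redact(text: str) -> str:
    """Replace personal identifiers with placeholders; keep enough to show the flaw."""
    text = EMAIL.sub("[REDACTED-EMAIL]", text)
    text = UK_PHONE.sub("[REDACTED-PHONE]", text)
    return text

if __name__ == "__main__":
    sample = "Exported row: jane.doe@example.com, 07700 900123, order 4432"
    print(redact(sample))  # -> Exported row: [REDACTED-EMAIL], [REDACTED-PHONE], order 4432
```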
Recent example: In 2023, TikTok was fined £12.7m by the UK ICO for misusing children’s data—illustrating that GDPR enforcement is alive and well (ICO).
Data Protection Act 2018
What it is: The UK statute that sits alongside and tailors the UK GDPR. It creates criminal offences (e.g., s.170) for unlawfully obtaining or disclosing personal data, and sets out how the GDPR regime applies domestically.
Why it matters in testing: Even if a client consents, testers must treat data responsibly—secure handling, storage, and destruction are a must.
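A rough sketch of handling, storage, and destruction in code, assuming the third-party cryptography package is available (and noting that overwrite-then-delete is only best effort on SSDs and journalling filesystems):

```python
# artefact_store.py - illustrative: encrypt engagement artefacts at rest and
# destroy them when the retention period agreed with the client expires.
import os
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_artefact(path: str, key: bytes) -> str:
    """Encrypt a collected artefact and remove the plaintext copy."""
    f = Fernet(key)
    with open(path, "rb") as fh:
        token = f.encrypt(fh.read())
    enc_path = path + ".enc"
    with open(enc_path, "wb") as fh:
        fh.write(token)
    os.remove(path)
    return enc_path

def destroy(path: str) -> None:
    """Best-effort destruction: overwrite once, then unlink.
    Not a guarantee on SSDs; full-disk encryption is the real safety net."""
    size = os.path.getsize(path)
    with open(path, "r+b") as fh:
        fh.write(os.urandom(size))
        fh.flush()
        os.fsync(fh.fileno())
    os.remove(path)

if __name__ == "__main__":
    key = Fernet.generate_key()   # in practice, keep this in the team's vault
    enc = encrypt_artefact("db_export_sample.csv", key)
    # ...engagement ends, agreed retention period expires...
    destroy(enc)
```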
Recent example: 2024 prosecutions under s.170 saw individuals fined for unauthorised access to customer records.
Human Rights Act 1998 (Article 8 – Privacy)
What it is: Protects the right to respect for private and family life. While it binds public authorities rather than private firms, its privacy principles shape how organisations (and their testers) are expected to operate.
Why it matters in testing: Design tests to avoid excessive intrusion into individuals’ private data. Keep activity proportionate and documented.
Recent example: Courts continue to scrutinise investigatory practices where privacy is engaged; Article 8 is regularly argued alongside data protection and interception claims.
Police and Criminal Evidence Act 1984 (PACE)
What it is: Sets police powers and evidence rules (search, seizure, interviews). It underpins chain-of-custody and admissibility standards.
Why it matters in testing: If you handle client evidence (e.g., insider threat reviews or sensitive logs), mirror PACE-style practices: seal, label, record handlers, and preserve integrity.
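A minimal sketch of mirroring that in software: hash each artefact on receipt and append a custody record every time it changes hands (the log format and field names are my own convention, not a PACE requirement):

```python
# custody.py - illustrative: record who handled an artefact, when, and prove
# it has not changed by re-hashing it at every hand-over.
import csv
import hashlib
from datetime import datetime, timezone

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def log_handover(log_path: str, artefact: str, handler: str, action: str) -> None:
    """Append one custody entry: timestamp, artefact, current hash, handler, action."""
    with open(log_path, "a", newline="", encoding="utf-8") as fh:
        csv.writer(fh).writerow([
            datetime.now(timezone.utc).isoformat(),
            artefact,
            sha256_of(artefact),
            handler,
            action,
        ])

if __name__ == "__main__":
    log_handover("custody_log.csv", "mail_server_logs.zip", "A. Tester", "collected from client")
    log_handover("custody_log.csv", "mail_server_logs.zip", "B. Analyst", "received for analysis")
    # Any change to the file between the two entries shows up as a hash mismatch.
```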
Recent example: In July 2024 a case was referred due to a possible breach of PACE procedures, with the court considering exclusion of evidence under s.78.
Regulation of Investigatory Powers Act (RIPA) 2000
What it is: The older framework governing interception and surveillance. It restricts interception of communications without lawful authority.
Why it matters in testing: Do not intercept live communications in tests (email, VoIP, network traffic) unless the client has lawful authority and the mechanism is explicitly scoped and approved.
Recent example: RIPA is still cited in prosecutions for unlawful interception, although its interception provisions have largely been superseded by, and now sit alongside, the Investigatory Powers Act 2016.
Investigatory Powers Act 2016 (“Snooper’s Charter”)
What it is: Modernised interception and communications data powers. Section 3 creates an offence of unlawful interception; the Act also sets general privacy duties.
Why it matters in testing: Red teams must avoid ad-hoc packet capture of third-party communications on live links. Use synthetic traffic or properly authorised capture on controlled segments.
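Where traffic is needed to exercise capture or detection tooling, generate it yourself between hosts you control rather than tapping a live link; a rough sketch for an isolated lab segment (the address and port are placeholders):

```python
# synthetic_traffic.py - illustrative: generate our own traffic between lab
# hosts we control, so no third-party communications are ever intercepted.
import socket
import time

LAB_RECEIVER = ("10.99.0.20", 9000)  # placeholder: a host we own on a controlled segment

def send_synthetic(messages: int = 100, delay_s: float = 0.1) -> None:
    """Emit clearly labelled, self-generated UDP messages for capture exercises."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for i in range(messages):
            payload = f"SYNTHETIC-TEST message {i} - no real user data".encode()
            sock.sendto(payload, LAB_RECEIVER)
            time.sleep(delay_s)

if __name__ == "__main__":
    send_synthetic()
```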
Recent example: The Act is actively enforced and provides for penalties around unlawful interception.
Online Safety Act 2023
What it is: Regulates user-to-user and search services, imposing duties to mitigate illegal and harmful content. Ofcom is the regulator; implementation is ongoing.
Why it matters in testing: If you test platforms that host user-generated content (UGC), flaws that bypass safety or moderation controls can expose clients to serious regulatory liability.
Recent example: In August 2025 the High Court dismissed Wikimedia’s challenge to aspects of the regime, underlining the trajectory of enforcement.
On the Horizon: Cyber Security & Resilience Bill
What it is: A forthcoming Bill to modernise the UK’s cyber regime, updating the NIS Regulations 2018 and likely expanding duties to managed service providers and data centres.
Why it matters in testing: Expect more clients—especially MSPs and hosting providers—to require assurance against statutory duties. Build compliance-ready test plans now.
Recent context: Announced against a backdrop of supply-chain incidents; keep an eye on scope rules and incident reporting once the Bill lands.
Quick Reference
- Always get written authorisation and clear scope.
- Minimise and protect personal data at all times.
- Avoid live interception without lawful authority.
- Follow chain-of-custody practices for sensitive artefacts.
- Document decisions. If it’s not written down, it didn’t happen.