This week marks a significant moment in the UK’s evolving online safety landscape. The Crime and Policing Bill has now passed its final stages in Parliament and awaits Royal Assent – the final formality before it becomes law.
Once enacted, the Bill will introduce new criminal prohibitions on certain categories of pornographic content, alongside strengthened requirements around performer consent. For platforms, producers, and distributors, this represents another step toward a more structured regulatory environment with clearer expectations – with safeguarding children and protecting users at its core.
What the Bill covers
The legislation expands existing criminal law to include several new categories of prohibited content. In particular, it will make it an offence to possess or publish pornographic material depicting:
- Strangulation or suffocation (“choking”) in a sexual context
- Adults portraying children, including “age-play” or “barely legal” scenarios
- Incest involving blood relatives
- “Step” familial content, where one performer is portrayed as a child
Alongside this, the Bill lays the groundwork for strengthened requirements around the verification of performer age and consent, with implementation details to follow.
Notably, proposals to introduce a legal right for performers to withdraw consent from previously published content were not included in the final legislation. This is distinct from non-consensual intimate image abuse, which is already illegal where consent was never given. It does, however, highlight an area where expectations may continue to evolve – particularly around the lifecycle management of content and performer rights.
Understanding how these provisions will be applied
One of the defining characteristics of this legislation is the breadth of interpretation likely to be applied – particularly in relation to strangulation or suffocation.
Drawing on existing UK “extreme pornography” frameworks, enforcement is expected to focus less on technical definitions and more on how content is portrayed and perceived. This means:
- Simulated acts may still fall within scope
- Consent or absence of injury may not be determinative
- Even brief or stylised depictions could be captured, depending on context
In practice, these factors point toward a conservative approach to compliance: platforms will need to assess not just intent, but presentation and potential impact.
What happens next?
While the Bill sets the legal foundation, implementation details will follow.
Ofcom is expected to provide further guidance on:
- Enforcement timelines
- Compliance expectations for platforms
- How these new prohibitions interact with existing duties under the Online Safety Act
As with previous regulatory developments, there will likely be a period of clarification as industry and the regulator align on interpretation and practical application.
A continued shift in regulatory direction
These changes sit within a broader trajectory: the UK is moving beyond traditional thresholds of illegality and toward a framework that increasingly considers societal harm, perception, and the risk of normalisation – particularly where content may influence or expose younger audiences.
For the adult sector in particular, this signals:
- Expansion of criminal liability into previously lawful categories
- Greater scrutiny on how content is framed, not just what it depicts
- Increased expectations on platforms to act proactively, not reactively – with a clear responsibility to safeguard children and protect users from harm
Where technology comes in
As regulatory expectations evolve, so too does the role of technology in enabling compliance.
Meeting these new requirements at scale will depend on systems that can:
- Detect and moderate prohibited content categories with nuance and context
- Verify and manage performer consent across production and distribution workflows
- Apply jurisdiction-specific controls, ensuring content is handled appropriately for UK users, particularly where there is a risk of exposure to underage audiences
Crucially, the above measures sit alongside the expectation for platforms to implement highly effective age assurance when hosting or publishing pornographic content – forming part of a holistic, joined-up approach to safeguarding children online.
Importantly, these are not separate challenges. Content moderation and consent management need to work hand in hand – combining detection, verification, and auditability into a cohesive approach that not only meets regulatory expectations, but actively complements the safeguarding of children and users.
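To make that joined-up approach concrete, here is a minimal sketch of how these checks might compose into a single publication gate. Everything below – the interfaces, category labels, and decision rules – is a hypothetical stand-in chosen for illustration; it does not reflect the statutory wording, any regulator guidance, or a particular product.

```typescript
// Minimal sketch of a publication gate combining the checks described
// above. All names and rules are hypothetical stand-ins.

type ProhibitedCategory =
  | "strangulation_or_suffocation"
  | "adult_portraying_child"
  | "incest"
  | "step_familial_child_portrayal";

interface ModerationResult {
  flaggedCategories: ProhibitedCategory[]; // output of an automated classifier
  confidence: number;                      // 0..1
}

interface ConsentRecord {
  performerId: string;
  ageVerified: boolean;   // performer age verified at production
  consentOnFile: boolean; // documented consent for this distribution
}

interface PublicationRequest {
  contentId: string;
  viewerJurisdiction: string; // e.g. "GB"
  viewerAgeAssured: boolean;  // result of a highly effective age check
  moderation: ModerationResult;
  consentRecords: ConsentRecord[];
}

type Decision = { allow: true } | { allow: false; reason: string };

// Append-only audit trail, so every decision is reviewable later.
const auditLog: { contentId: string; decision: Decision; at: string }[] = [];

function gatePublication(req: PublicationRequest): Decision {
  const decision = decide(req);
  auditLog.push({ contentId: req.contentId, decision, at: new Date().toISOString() });
  return decision;
}

function decide(req: PublicationRequest): Decision {
  // 1. Age assurance gates access to pornographic content at all.
  if (!req.viewerAgeAssured) {
    return { allow: false, reason: "age assurance not completed" };
  }
  // 2. Every performer needs verified age and documented consent.
  const missing = req.consentRecords.some((r) => !r.ageVerified || !r.consentOnFile);
  if (missing || req.consentRecords.length === 0) {
    return { allow: false, reason: "performer age/consent not verified" };
  }
  // 3. For UK users, any flagged prohibited category is withheld and
  //    escalated to human review rather than published – a conservative
  //    default, given the breadth of interpretation discussed above.
  if (req.viewerJurisdiction === "GB" && req.moderation.flaggedCategories.length > 0) {
    return { allow: false, reason: "flagged UK prohibited category – escalate to human review" };
  }
  return { allow: true };
}
```

The design choice worth noting is the conservative default: flagged content is withheld and routed to human review rather than auto-published, mirroring the compliance posture outlined earlier, and every decision is logged so that moderation and consent checks remain auditable end to end.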
The direction of travel is clear: regulation is becoming more detailed, more dynamic, and more enforcement-led – driven by a shared objective to reduce harm and better protect those most at risk online. The good news is that the technology to support this shift is already here, and will be critical in helping platforms navigate what comes next.