The Online Safety Act
The Online Safety Act is a new law designed to protect both children and adults online. It places a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms. The Act mandates that companies implement systems and processes to reduce the risk of underage users accessing their services, prevent illegal activity and take down illegal content when it appears.
The Act became law in 2023, with the first phase of implementation beginning on 16 December 2024.
The Act’s strongest protections focus on children. Platforms must prevent minors from accessing harmful or inappropriate content and provide clear and accessible mechanisms for parents and children to report problems online when they do arise.
For adults, the Act enhances transparency by requiring major platforms to disclose the types of content they allow, remove illegal content when it appears and provide users with greater control over what they see.
Ofcom, the UK’s communications regulator, will oversee compliance with the Act. It will issue codes of practice detailing how providers can meet their safety obligations and enforce regulations proportionate to a platform’s risk level, size, and capacity. Platforms are also required to respect users’ rights while implementing safety measures.
The Act applies to social media platforms (broadly defined as “user-to-user services”), search engines and pornographic websites. This could include, for example, messaging apps, video-sharing platforms, online forums, dating apps and commercial pornography sites.
It also extends to companies based outside of the UK if they have a UK user-base, target the UK market, or pose a material risk to UK users.
Platforms must take strong measures against illegal activities, including priority offences such as terrorism, child sexual exploitation and abuse, fraud, the sale of illegal drugs or weapons, and encouraging or assisting suicide.
The Act categorises harmful content into:
Primary Priority Content (must be completely inaccessible to children): pornography and content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders.
Priority Content (requires age-appropriate restrictions): content such as bullying, abusive or hateful material, content depicting serious violence, and content encouraging dangerous stunts or challenges.
The Online Safety Act became law on 26 October 2023, and work is underway to bring its protections into effect as quickly as possible. Ofcom is taking a phased approach to introducing the Act’s duties, with the government also responsible for passing secondary legislation to support the framework.
On 17 October 2024, Ofcom published an updated roadmap detailing its plans for enforcement and compliance. As part of its role, Ofcom is developing guidance and codes of practice to help online platforms meet their obligations. These codes undergo public consultation before being finalised and must be approved by Parliament before taking effect.
Protecting children from harmful content
Platforms in scope must ensure their services enforce age restrictions and offer safe, age-appropriate experiences for children.
Ofcom has outlined highly effective age assurance methods to ensure that children are protected from harmful and age-inappropriate content online. These methods help online platforms comply with their legal obligations under the Act by preventing underage users from accessing restricted content while respecting user privacy.
Ofcom’s guidance on highly effective age assurance, and how it should be implemented in practice, applies consistently across all parts of the online safety regime. In summary, Ofcom’s position is that weaker approaches such as self-declaration of age are not highly effective, while methods capable of being highly effective include photo-ID matching, facial age estimation, mobile network operator checks, credit card checks, digital identity services, open banking and email-based age estimation.
Ofcom considers that this approach will secure the best outcomes for protecting children online during the early years of the Act being in force. While Ofcom has decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g., 99% accuracy), it acknowledges that such thresholds may complement its four criteria in the future, subject to developments in testing methodologies, industry standards, and independent validation.
Ofcom has outlined four key criteria to determine whether an age assurance method is highly effective. The solutions implemented must be technically accurate, robust, reliable and fair.
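As an illustration of how a platform might track these four headings internally, the sketch below maps a hypothetical evaluation result onto them. The class, function names and numeric thresholds are assumptions chosen purely for illustration; as noted above, Ofcom has not set numerical thresholds for highly effective age assurance.

```python
from dataclasses import dataclass


@dataclass
class EvaluationResult:
    accuracy: float             # share of test checks where the age decision was correct
    challenge_pass_rate: float  # resistance to simple circumvention attempts
    completion_rate: float      # proportion of checks completed without technical failure
    group_accuracy_gap: float   # largest accuracy difference between demographic groups


def meets_four_criteria(result: EvaluationResult) -> dict[str, bool]:
    """Map an internal evaluation onto Ofcom's four headings
    (technically accurate, robust, reliable, fair).

    The thresholds below are placeholders for illustration only;
    they are not taken from the Act or Ofcom's guidance.
    """
    return {
        "technically_accurate": result.accuracy >= 0.95,
        "robust": result.challenge_pass_rate >= 0.95,
        "reliable": result.completion_rate >= 0.99,
        "fair": result.group_accuracy_gap <= 0.05,
    }


if __name__ == "__main__":
    sample = EvaluationResult(
        accuracy=0.97,
        challenge_pass_rate=0.96,
        completion_rate=0.995,
        group_accuracy_gap=0.03,
    )
    print(meets_four_criteria(sample))
```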
Alongside traditional methods like identity document checks, Ofcom has recognised email address-based age estimation – a method pioneered by Verifymy – as highly effective.
Ofcom’s official guidance states that email-based age estimation is capable of being highly effective for the following reasons:
Implementation and Compliance
By using these highly effective age assurance methods, online platforms can better protect children while ensuring compliance with the Online Safety Act.
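To make the compliance flow concrete, here is a minimal, hypothetical sketch of deny-by-default content gating behind an age assurance check. The provider call, category labels and age thresholds are illustrative assumptions, not requirements taken from the Act, Ofcom’s codes or any real vendor API.

```python
from enum import Enum
from typing import Optional

# Minimum age applied to Priority Content; a policy choice shown for illustration only.
PRIORITY_MIN_AGE = 18


class ContentCategory(Enum):
    PRIMARY_PRIORITY = "primary_priority"  # e.g. pornography, suicide or self-harm content
    PRIORITY = "priority"                  # e.g. bullying, dangerous challenges
    GENERAL = "general"


def provider_check_age(user_id: str) -> Optional[int]:
    """Placeholder for a call to an age assurance provider.

    Returns a verified or estimated age, or None if the check was inconclusive.
    """
    raise NotImplementedError("integrate a highly effective age assurance method here")


def may_serve(user_id: str, category: ContentCategory) -> bool:
    """Deny-by-default gating: children must not reach Primary Priority Content,
    and Priority Content is only shown where the age check supports it."""
    if category is ContentCategory.GENERAL:
        return True
    age = provider_check_age(user_id)
    if age is None:
        return False  # inconclusive checks fail closed
    if category is ContentCategory.PRIMARY_PRIORITY:
        return age >= 18  # must be inaccessible to children
    return age >= PRIORITY_MIN_AGE  # apply the service's own age-appropriate policy
```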
Ofcom has broad enforcement powers, including the ability to fine companies up to £18 million or 10% of their qualifying worldwide revenue (whichever is greater), to apply to the courts for business disruption measures such as blocking access to a service in the UK, and, in the most serious cases, to pursue criminal liability for senior managers.
Ofcom can take action against non-UK companies with significant UK user-bases to ensure that overseas platforms comply with safety regulations.
The Online Safety Act marks a significant shift in digital regulation. It holds platforms accountable for user safety while balancing free speech considerations. With Ofcom’s oversight, phased implementation, and strong enforcement measures, the Act aims to create a safer online environment for all users in the UK.
Verifymy can supply ‘highly effective’ age assurance, which is required by the Act to prevent minors from seeing Primary Priority Content such as pornography and content promoting suicide or self-harm.
We can also estimate users’ ages to protect them from other Priority Content, which includes bullying and harmful challenges. Additionally, our services include content moderation for sites to detect and remove illegal or harmful content.
Get in touch to talk to a regulatory expert and understand how to protect your business and ensure compliance.
Verifymy’s age assurance solution for any online product, service or business, features the widest range of age verification and age estimation methods to ensure the highest pass rates possible with minimal business disruption.
Back / Age verification & age estimation
Back / Identity verification & content moderation