Regulation

December 4, 2025

On November 25th 2025, Ofcom published its new guidance on improving online safety for women and girls – marking a significant shift in how platforms are expected to design, govern and operate their services. While the Online Safety Act already holds platforms legally responsible for protecting UK users from illegal content and content that is harmful to children, this new guidance goes further. It focuses specifically on the harms that disproportionately affect women and girls, recognising the unique risks they face across social media, gaming services, dating apps, discussion forums and search.

Ofcom has already set out its Codes of Practice for illegal content and harms to children. But under the Act, it is also required to produce tailored guidance for protecting women and girls and ensuring tech companies take proactive steps, not just reactive ones. As Ofcom explains, the guidance identifies nine areas where platforms are expected to take responsibility, design their services to prevent harm, and provide better support for their users.

Why this guidance matters

Women and girls experience a distinct set of online harms that are both widespread and deeply damaging. Ofcom highlights four categories in particular: misogynistic abuse, coordinated harassment, stalking and persistent unwanted contact, and image-based sexual abuse. These behaviours often escalate quickly online, affecting users’ safety, well-being, and ability to participate in digital spaces.

Dame Melanie Dawes, Ofcom’s Chief Executive, captured the severity of these harms: “When I listen to women and girls who’ve experienced online abuse, their stories are deeply shocking. Survivors describe how a single image shared without their consent shattered their sense of self and safety.”

On the day of the release, she added:
“That’s why today we are sending a clear message to tech firms to step up and act in line with our practical industry guidance, to protect their female users against the very real online risks they face today.”

The nine priority action areas

Central to the guidance is a set of nine priority areas where tech companies can and should make meaningful improvements. These represent a holistic approach, covering governance, product design, user controls, reporting processes and enforcement. Ofcom categorises these recommendations into three overarching themes: taking responsibility, preventing harm, and supporting women and girls, to help platforms identify where action is needed across their systems and services.

1. Strengthen governance and accountability around gender-based harm

Leadership teams should embed responsibility for ensuring the safety of women and girls at a senior level, supported by clear policies, internal expertise, and regular oversight.

2. Conduct risk assessments focused on harms to women and girls

Platforms should systematically assess how their features, algorithms and community dynamics may expose users to specific gendered harms, ideally informed by survivor insight and expert consultation.

3. Increase transparency about online safety for women and girls

Tech companies should publish meaningful information on the prevalence of gender-based abuse and the effectiveness of their safety measures, improving accountability and public trust.

4. Carry out product testing and “abusability evaluations”

Before launching new features, platforms should test whether they could be misused for harassment, stalking, or other harms, and address any identified risks before release.

5. Set safer default settings to minimise risk

Privacy and safety settings, particularly for younger users, should default to the most protective options, reducing exposure without requiring users to self-navigate complex menus.

6. Reduce the circulation of gender-based abusive content

Platforms should proactively identify, restrict and remove harmful content, including non-consensual intimate images, using effective detection technologies and robust moderation.

7. Provide better user controls to manage and limit abuse

Users should be able to block or mute multiple accounts at once, limit unwanted contact, manage pile-ons, and take back control during episodes of targeted harassment.

8. Improve reporting routes for survivors of online abuse

Reporting systems must be clear, accessible, trauma-informed and designed specifically to support victims – particularly those experiencing domestic abuse, coercive control, or sexual exploitation online.

9. Take swift and effective action when harm occurs

Once abuse is reported, platforms should remove content quickly, sanction or ban perpetrators, prevent ban evasion, and provide affected users with meaningful support and follow-up.

These nine areas reflect a simple truth: preventing online harm requires systemic action, not individual fixes.

Government support underscores the stakes

The guidance has been welcomed by the Government. Technology Secretary Liz Kendall emphasised the urgency for platforms to act, stating:

“Tech companies have the ability and the technical tools to block and delete online misogyny. If they fail to act, they’re not just bystanders, they’re complicit in creating spaces where sexism festers and a society where abuse against women and girls becomes normalised.”

The technology exists, and the moment for deployment is now

Ofcom’s guidance sets a new benchmark for platform responsibility. But perhaps most importantly, the solutions needed to meet these expectations already exist.

From low-friction age checks to robust AI-driven content moderation, image safety, and consent management tools, the technology to protect users and safeguard online spaces is available today and ready to be deployed at scale.

Creating a safer online world for women and girls is not a question of invention; it is a question of implementation. With clear regulatory expectations and proven tools on the market, now is the moment for platforms to act.

About the author

Verifymy

Verifymy is a safety technology provider on a mission to safeguard children and society online.
