At Verifymy, we care deeply about making the internet a safer place – especially for those who are most at risk. We work every day to help platforms build trust, protect their communities, and reduce the real-world impact of harmful content, particularly the kind that targets women and girls. That’s why we welcomed the opportunity to respond to Ofcom’s consultation on its draft guidance, A Safer Life Online for Women and Girls.
Our submission supports Ofcom’s direction and ambition, while offering practical insights on how the guidance can be made more effective for services of all sizes and risk profiles.
1. Clearer expectations on harmful content
We welcome Ofcom’s focus on four key types of harm – online misogyny, pile-ons, online domestic abuse, and image-based sexual abuse. However, we recommend that the guidance more clearly highlight that intimate image abuse is illegal under UK law, including when it takes the form of deepfakes. Services need to understand not just the harm, but also the legal and ethical obligation to act swiftly.
We also note the importance of addressing both illegal and legal-but-harmful content, especially where technologies such as deepfakes blur the boundaries of consent. Clearer expectations, combined with practical tools like content moderation and participant consent verification, are critical to help platforms respond effectively.
2. Supporting practical implementation of the nine actions
We welcome the structured nature of the nine proposed actions and their emphasis on prevention and design-led safety. However, we raise a concern: while the actions are broad enough to allow flexibility, they may be too high-level to guide real-world implementation.
In our work across sectors, from social media to adult entertainment, we’ve seen the effectiveness of:
- Participant consent verification: Platforms can require documented consent from everyone depicted before explicit content is uploaded, reducing risk and improving takedown response times.
- Age assurance: Accurately identifying underage users is essential to prevent exposure to harmful or exploitative content.
- Content moderation: Automated detection tools significantly reduce the time taken to detect and remove illegal content like CSAM.
We encourage Ofcom to give greater visibility to tools and practices such as these, which are already operational, rather than treating them as future aspirations.
3. Strengthening good practice guidance
The draft guidance includes useful case studies and practice examples, particularly around hash matching, consent nudging, and uploader verification. But we believe it needs to go further in specifying:
- What “good” implementation looks like;
- How to assess the effectiveness of safety practices (e.g., user feedback, removal times);
- How to distinguish between emerging innovations and widely used, effective solutions.
To prevent uneven adoption, we also suggest that services explain when and why they diverge from good practice, fostering a more transparent and accountable culture.
4. Encouraging adoption through transparency and recognition
Recognising the challenge Ofcom faces in promoting non-mandatory guidance, we support the proposal to publish assessments of how platforms are addressing the safety of women and girls. We believe this can drive accountability and improvement without the need for new legislation.
We suggest that Ofcom could take this further by:
- Outlining clear criteria for these assessments;
- Highlighting positive examples and innovations;
- Supporting proportionate implementation based on platform size and risk;
- Considering a recognition scheme or kite mark for platforms meeting certain standards;
- Publishing interim six-month updates instead of waiting 18 months for a progress report.
5. Privacy and rights: Not a trade-off
Finally, we emphasise that effective safety and robust privacy protections are not mutually exclusive. In fact, they must go hand-in-hand if platforms are to build and maintain trust. We support a privacy-by-design approach that ensures user rights are protected even as stronger safety measures are implemented.
Done well, safety and privacy reinforce each other: users feel safer in environments where harmful content is tackled decisively, and they feel more empowered when they understand and control how their data is handled. Future Ofcom guidance should highlight examples of privacy-preserving safety innovation.
Conclusion
Ofcom’s guidance is a crucial step toward safer digital spaces for women and girls. At Verifymy, we support its ambition and offer proven tools to help platforms meet these expectations today. With clearer guidance, practical support, and transparent accountability, the sector can make real progress in reducing online harms.
We are proud to contribute to this conversation and ready to support platforms in turning principles into action.
If you would like to learn more, please get in touch.