A Duty of Care: Why it’s time to Enhance Age Assurance and Content Moderation to Protect Children from Harmful Online Content
In today’s digital-first environment, children are being routinely exposed to harmful, age-restricted and even illegal content online. The continued proliferation of smart devices and easy, anywhere access to user-generated content (UGC) through social media and chat platforms is only exacerbating the issue. 

At the tap of a button, individuals can now be exposed to more adult, extreme and illegal content than ever before. And in the UK, it has been revealed that children themselves are now the biggest perpetrators of sexual abuse against other children, accounting for 52% of reported cases.

The UK’s National Police Chiefs’ Council (NPCC) has directly linked this trend to the ease of access young people have to the internet via smartphones. Meanwhile, the Internet Watch Foundation (IWF) has warned that new technologies such as artificial intelligence (AI) will only further exacerbate the creation, distribution and accessibility of illegal content such as Child Sexual Abuse Material (CSAM).

Safer Internet Day brings together thousands of young people, parents and organisations each year to raise awareness of how children can stay safe online. It provides an opportunity to bring the conversation, and the enforcement of safer practices, into the mainstream, helping to protect children from inappropriate or illegal content and from straying onto the dark web.

This year, it must also become a springboard for businesses to enhance their online age assurance and content moderation to ensure they are doing their utmost to protect the young and vulnerable. 

Ease of Access in an online world

While the internet’s easy accessibility has brought numerous benefits, the truth is that as a result of its rapid, and often unregulated development, children now have unprecedented access to age-restricted and illegal online content.

This is naturally a leading concern for parents, with research from Ofcom revealing that 75% worry about their children seeing inappropriate content online, and 73% specifically citing adult or sexual content.

Not only is exposure to this content harmful in itself, but it can also have a detrimental impact on children’s longer-term wellbeing and mental health, warping young people’s views of sex and appropriate behaviour.

At the same time, many social media sites, where age-restricted or even illegal content can be shared via chat or feed functions, permit users to create accounts from just 13 years of age. Coupled with the proliferation of camera-equipped smart devices, this makes it incredibly easy to produce, upload and consume content within seconds.

The digital world therefore needs to catch up with the offline world, where checking an individual's documentation and physical appearance is the norm, making it far easier and more practical for authorities to restrict access to age-restricted products and content.

Age Assurance and Content Moderation

To overcome these challenges, organisations must rapidly deploy and enhance their age assurance and content moderation infrastructure. The advent of technologies such as AI has made it easier and more practical than ever to identify and stamp out illegal content online, at scale, accurately and at low cost.

With regulators historically slow to adapt to tech innovation, some organisations have done the bare minimum, or even turned a blind eye altogether. But the protection of young people is at stake, and the onus must now fall on business leaders, tech companies and the wider ecosystem that enables commerce to foster a culture of responsibility, prioritising the safety and wellbeing of users in their online practices.

With a plethora of age assurance solutions now available, such as email address age estimation, businesses have the tools at their disposal to verify the age of customers with minimal friction. At the same time, content moderation tools can analyse uploaded or live-streamed content in real time, before it is published, instantly flagging or removing illegal material.

Furthermore, proactive solutions such as uploader and participant verification help to confirm consent and reduce the risk of intimate image abuse (including so-called revenge porn), exploitation, slavery and sex trafficking.

Safety and Privacy

Despite these available tools, work still needs to be done to protect children online and remove illegal content from sites. This is especially important given that a recent survey in the US revealed that 56% of people who encounter CSAM are unlikely to report it.

This challenge is often exacerbated by the ongoing safety-versus-privacy debate, in which technology companies and social media firms stress the importance of encryption in keeping user data safe. The problem, however, is that encryption can also be abused by bad actors to distribute and circulate age-restricted or illegal content online.

This is a debate which will only grow in prominence over the coming months, especially now that the UK’s Online Safety Act has passed into law. Fortunately, as new privacy-preserving authentication tools emerge, such as email address age estimation, organisations will be able to mitigate this concern. This is something which was specifically called out in recent guidance from the Information Commissioner's Office (ICO), the body that regulates both privacy (GDPR in the UK) and online content.

While there will likely be pushback against any legislation that could be viewed as compromising user privacy, nothing, however, should be considered more important than the protection of young and vulnerable people.

It’s only logical, therefore, that while regulators roll out these robust new laws, and social media and technology firms adapt to the changes, businesses take responsibility for their own actions. This means taking the lead on implementing age assurance and content moderation tools that drive meaningful change across their industries. Safer Internet Day provides the perfect opportunity to start.

Michal Karnibad, Co-CEO, VerifyMy