Digital Safeguarding
In this blog post we explore why robust age verification will become the new cornerstone of digital safeguarding and help to create safer online experiences for young people.
3 min read
June 7, 2021

Last week, a BBC investigation revealed how an oversight in the age verification solution used by OnlyFans opened up a route for underage users to sell and appear in explicit videos on the site.

Transforming internet culture

OnlyFans gained heightened popularity with sex workers and other adult content creators who were left otherwise unemployed by the COVID-19 pandemic. The Essex-based content subscription service allows its users (creators) to earn money by uploading exclusive content for their fans to view as part of a subscription model. While the site has transformed the way people consume content online, it has come under scrutiny for recent failures to correctly identify underage users. In a number of instances, girls as young as 13 were reported to have used IDs belonging to older relatives to sign up and share content.

OnlyFans’ online age verification system

Creators who want to set up an account on the site are asked to prove their identity by providing a ‘selfie’ in which they hold a form of ID up to the camera. According to the OnlyFans terms of service, users must be 18 years or older. However, in one particular case, a 14-year-old girl was able to use her grandmother’s passport and bank details to create an account on the site, highlighting the failure of the site’s age verification technology to identify the age gap between her selfie and the passport photo.

Age verification law – moving on from the Digital Economy Act

Currently going through Parliament is the draft version of a new Online Safety Bill, a proposed follow-up to the Digital Economy Act 2017. Part 3 of that Act included a requirement to protect children from online pornography, but the Government announced in October 2019 that it would not commence Part 3 (which contained the age verification mandate), and Clause 131 of the Online Safety Bill now repeals it entirely.

Ofcom warns of huge fines and criminal sanctions

Once enacted, the Online Safety Bill will require websites hosting user-generated content to assess the likelihood of children accessing their services and, where access is likely, to provide additional protections, e.g. age verification software. Ofcom, the media and communications regulator, will have the power to fine companies up to 10% of their “qualifying worldwide revenue” or £18 million, whichever is higher. The draft bill also sets out possible criminal offences for individuals and senior managers of services which do not comply, although the Government has said it intends to hold these in reserve.
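The “whichever is higher” formula means the £18 million figure acts as a floor, not a ceiling. A minimal sketch of the calculation (the function name and revenue figures are illustrative, not taken from the bill):

```python
def max_osb_fine(qualifying_worldwide_revenue_gbp: float) -> float:
    """Maximum fine under the draft Online Safety Bill: the greater of
    10% of qualifying worldwide revenue and a fixed GBP 18 million."""
    return max(0.10 * qualifying_worldwide_revenue_gbp, 18_000_000)

# A large platform turning over GBP 500m faces a cap of GBP 50m,
# while a small provider turning over GBP 1m still faces the
# GBP 18m floor - which is why smaller providers worry about
# proportionate enforcement.
large_cap = max_osb_fine(500_000_000)  # 10% of revenue applies
small_cap = max_osb_fine(1_000_000)    # GBP 18m floor applies
```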

The AVMSD – Rolling out the European standard for age verification

The EU is already setting the tone with the Audiovisual Media Services Directive (AVMSD). Brought into European law in September 2020, the AVMSD requires video service providers in Member States to ensure the protection of minors from “content and advertising that may impair their physical, mental or moral development”. The Commission is already taking action against 23 Member States for failing to implement the Directive into national law by last autumn’s deadline. As age verification becomes the foundation of internet safety across Europe, we’re likely to see even bolder regulatory enforcement.

How does age verification work in a best-case scenario?

As technology continues to evolve, there are now many ways to identify and age-verify internet users, which could certainly help to reduce instances of under-18s accessing unsuitable content. At Verifymy, we offer several different age verification methods, including third-party database checks, AI-powered age estimation, mobile phone verification and government ID scans. Having verified millions of transactions through the most used eCommerce platforms and marketplaces, our software allows businesses to frictionlessly age-verify their customers and users. Furthermore, as specialists in age verification and compliance, we utilise methodologies independently certified as meeting the requirements of PAS 1296:2018 – Code of Practice for Online Age Checking.
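In practice, methods like these are often layered: a low-friction check runs first, and inconclusive results escalate to stronger, higher-friction checks. The sketch below is a hypothetical illustration of that pattern, not Verifymy’s actual API; every function name, field and threshold is an assumption for the example.

```python
# Hypothetical layered age-verification flow. Each checker returns
# True (verified 18+), False (confirmed under 18), or None
# (inconclusive, fall through to the next, higher-friction method).

def database_check(user: dict):
    """Low-friction: a third-party database match verifies 18+."""
    return True if user.get("on_credit_file") else None

def age_estimation(user: dict):
    """AI age estimation with a buffer above 18 to absorb error."""
    est = user.get("estimated_age")
    if est is None:
        return None
    if est >= 25:          # comfortably over 18: accept
        return True
    if est < 18:           # clearly under 18: reject
        return False
    return None            # 18-24: too close to call, escalate

def id_scan(user: dict):
    """Highest friction: age taken from a scanned government ID."""
    doc_age = user.get("document_age")
    return None if doc_age is None else doc_age >= 18

def verify_age(user: dict, checkers: list) -> bool:
    for check in checkers:
        result = check(user)
        if result is not None:
            return result
    return False  # fail closed: unverified users are treated as minors

pipeline = [database_check, age_estimation, id_scan]
```

The key design choice is failing closed: if every method is inconclusive, the user is not verified, which is the posture regulators expect for adult content.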

Regulatory concerns for adult content providers

In a recent Verifymy survey, 81% of businesses providing content online said their main priority for strengthening their age verification processes is brand reputation. Providers want regulation that delivers a level playing field across the industry, and a landscape that is actively policed to ensure this is the case. The survey also highlighted that fines or potential criminal sanctions for non-compliance should be enforced proportionately, no matter the size of the provider.

Digital safeguarding

As internet accessibility expands and children consume even more content online, the safeguarding conversation will continue down a digital path. Policymakers will focus on facilitating safe online experiences and promoting the welfare of children. Companies operating in the adult sector, especially platforms like OnlyFans, will need to take big steps to ensure their age verification processes are robust enough to avoid the reputational damage and financial penalties associated with non-compliance.

More information on our age verification solution can be found here. Alternatively, click here to speak to a member of our sales team.

About the author

Verifymy

Verifymy is a safety technology provider on a mission to safeguard children and society online.

