Safer Internet Day

3 min read
February 6, 2024

New research, launched on Safer Internet Day, reveals that shockingly almost one in ten (9%) young people have come across Child Sexual Abuse Material (CSAM) on the internet. This highlights the prevalence of harmful and illegal content that demands immediate action.

Findings also show that children as young as eight years old have viewed or purchased age-restricted (18+) content or products, including alcohol and pornography, which are clearly harmful and illegal for such young audiences.

Social media channels account for over a third (35%) of all encounters with content young people would consider unsafe, age-restricted (18+), or illegal.

Exposed to illegal content

Almost one in ten (9%) young people across the UK and the US have been exposed to illegal content, including Child Sexual Abuse Material (CSAM), when going online. This is one of the distressing findings from research launched by online safeguarding tech leader Verifymy on Safer Internet Day.

The survey, which interviewed 2,000 individuals aged 16-19 from the UK and the US, also reveals that young people encounter content they would consider unsafe, age-restricted (18+), or illegal within 10 minutes of going online. Almost a third (29%) report seeing content related to self-harm, a quarter (25%) have witnessed extreme violence, and over one in ten (12%) have been exposed to extreme pornography.

The research found that only half of respondents (49%) encountered safety measures that should be in place to restrict their access to 18+ content. Of these, 53% report having been asked to self-declare their date of birth, 32% have encountered facial age estimation, and 27% have been asked to provide some form of identification, such as a passport or driving license.

A pervasive issue

Launched at a time when parents are growing more concerned about their children seeing inappropriate content, the research provides startling insights into the frequency with which young people encounter age-restricted (18+), unsafe, or illegal materials online. Over one in ten (11%) report encountering such content once a week, while 10% experience it 2-3 days a week and 7% are exposed to it every single day they go online. This suggests harmful and illegal content is a pervasive issue that needs urgent attention.

Regarding the origin of such content, over a third (35%) of respondents indicated that their encounters took place via social media platforms, which the NSPCC also cites as contributing to an increase in grooming crimes. Almost one in five (19%) have been exposed to materials shared by strangers, 14% by someone they consider a friend, and 13% by someone they interacted with online through channels such as online gaming or chat. 

Harmful and illegal content continues to be underreported

Despite respondents revealing that their schools/teachers (62%) and parents/caregivers (48%) have taken the time to teach them about online safety, young people are unsure what they should do if they encounter materials they identify as unsafe, age-restricted, or illegal. While 36% report the content to the website where it was found, 38% simply close the site, 35% ignore the material, and only 11% inform their parents or caregivers, with a mere 9% reporting it to the authorities. Harmful and illegal content continues to be underreported despite its significant impact.

Viewing this content was also found to have a detrimental short- and long-term impact on young people's wellbeing, with 69% stating it affects them negatively. Additionally, 44% of respondents reported a direct impact on their mental health.

“At the click of a button, our young people can be exposed to age-inappropriate content or even the most horrendous online content imaginable. Despite the best efforts of websites and platforms, schools, parents, caregivers and awareness days to guide online best practice, our findings show more needs to be done.

However, rather than pointing fingers, now is the time to act and implement pragmatic solutions to the issue of how we best protect children online. Businesses should be engaging and partnering with subject matter experts in this area, including regulators and safety tech providers. This way we can build an ecosystem of effective solutions that, when implemented, truly protect young people when they go online.

Websites must ensure they have robust content moderation technology in place which can identify and remove any illegal material before it is published. At the same time, they must invest in age assurance technologies to ensure those accessing their platforms are the correct age and only see age-appropriate content.”

Michal Karnibad, Co-CEO, Verifymy

Methodology 

For this research 2,000 consumers aged between 16 and 19 were interviewed across the UK and US (1,000 per market) in January 2024. 

The research was launched on Safer Internet Day, an awareness event that brings together young people, parents and organisations to drive better online safety practices. Verifymy is using the findings as a call to action, hoping they act as a springboard for businesses to enhance their age assurance and content moderation capabilities. This includes turning to new, effective, privacy-preserving technologies such as email address age estimation, which can be implemented easily, cost-effectively and with minimal friction.

About Verifymy

Verifymy creates safety tech solutions designed to safeguard children and society online. It provides frictionless, trustworthy age assurance, identity authentication and content moderation solutions that help online platforms maintain their integrity, protect their reputation and safeguard their customers. For more information, please visit: https://verifymy.io/.

About the author

Verifymy

Verifymy is a safety technology provider on a mission to safeguard children and society online.
