Young People Encounter Harmful, Age-Restricted and Illegal Content Within Minutes of Going Online

New research, launched on Safer Internet Day, reveals the shocking finding that almost one in ten (9%) young people have encountered Child Sexual Abuse Material (CSAM) online.
Findings also show that children as young as eight years old have viewed or purchased age-restricted (18+) content or products, including alcohol and pornography.
Social media channels account for over a third (35%) of all encounters with content young people consider unsafe, age-restricted (18+) or illegal.

Almost one in ten (9%) young people across the UK and the US have been exposed to illegal content, including Child Sexual Abuse Material (CSAM), when going online. This is one of the distressing findings from research launched by online safeguarding tech leader VerifyMy on Safer Internet Day.

The survey, which interviewed 2,000 individuals aged 16-19 from the UK and the US, also reveals that young people encounter content they consider unsafe, age-restricted (18+) or illegal within 10 minutes of going online. Almost a third (29%) report seeing content related to self-harm, a quarter (25%) have witnessed extreme violence, and over one in ten (12%) have been exposed to extreme pornography.

The research found that only half of respondents (49%) encountered safety measures that should be in place to restrict their access to 18+ content. Of these, 53% report having been asked to self-declare their date of birth, 32% have encountered facial age estimation, and 27% have been asked to provide some form of identification, such as a passport or driving license.

Launched at a time when parents are growing more concerned about their children seeing inappropriate content, the research provides startling insights into how often young people encounter age-restricted (18+), unsafe or illegal material online. Over one in ten (11%) report encountering such content once a week, 10% experience it two to three days a week, and 7% are exposed to it every single day they go online.

Regarding the origin of such content, over a third (35%) of respondents said their encounters took place on social media platforms, which the NSPCC also cites as contributing to a rise in grooming crimes. Almost one in five (19%) were exposed to material shared by strangers, 14% by someone they consider a friend, and 13% by someone they interacted with online through channels such as gaming or chat.

Despite respondents revealing that their schools and teachers (62%) and parents and caregivers (48%) have taken the time to teach them about online safety, young people are unsure what to do when they encounter material they identify as unsafe, age-restricted or illegal. While 36% report the content to the website where it was found, 38% simply close the site, 35% ignore the material, only 11% inform their parents or caregivers, and a mere 9% report it to the authorities.

Viewing this content was also found to have a detrimental short- and long-term impact on young people's wellbeing: 69% said it affected them negatively, and 44% reported a direct impact on their mental health.

“At the click of a button, our young people can be exposed to age-inappropriate content or even the most horrendous online content imaginable. Despite the best efforts of websites and platforms, schools, parents, caregivers and awareness days to guide online best practice, our findings show more needs to be done.

“However, rather than pointing fingers, now is the time to act and implement pragmatic solutions to the problem of how best to protect children online. Businesses should engage and partner with subject matter experts in this area, including regulators and safety tech providers. In this way, we can build an ecosystem of effective solutions that, when implemented, truly protect young people when they go online.

“Websites must ensure they have robust content moderation technology in place that can identify and remove illegal material before it is published. At the same time, they must invest in age assurance technologies to ensure those accessing their platforms are the correct age and only see age-appropriate content.”


Michal Karnibad, Co-CEO, VerifyMy

Methodology 

For this research, 2,000 consumers aged between 16 and 19 were interviewed across the UK and US (1,000 per market) in January 2024.

The research has been launched on Safer Internet Day, an awareness event that brings together young people, parents and organisations to drive better online safety practices. VerifyMy is using the findings as a call to action, hoping they act as a springboard for businesses to enhance their age assurance and content moderation capabilities. This includes adopting new, effective, privacy-preserving technologies such as email-address-based age estimation, which can be implemented easily, cost-effectively and with minimal friction.

About VerifyMy

VerifyMy creates safety tech solutions designed to safeguard children and society online. It provides frictionless, trustworthy age assurance, identity authentication and content moderation solutions that help online platforms maintain their integrity, protect their reputation and safeguard their customers. For more information, please visit: https://verifymy.io/.