Dangerous shifts in the rapidly evolving world of online child sexual abuse  


As new threats continue to emerge, the rapidly evolving world of online child sexual abuse demands more urgent and global action

In the digital age, technology has revolutionised the way we connect and communicate. Yet, alongside its benefits, it has also fuelled a deeply disturbing crisis: the rise of online child sexual abuse and exploitation. This problem is no longer confined to shadowy corners of the internet. It is growing rapidly, evolving in both scale and severity.  

If you’re not already alarmed, you should be. The threats facing children online are increasing and becoming more extreme. Take the recent warning from the Australian Federal Police: sadistic online groups are now targeting children as young as 12 on social media, coercing them into producing explicit content. Once the material is created, offenders extort their victims by threatening to share the images with family or friends unless more content is provided. This vicious cycle often escalates into ever more degrading and violent demands including specific live sex acts, animal cruelty, serious self-harm, and live online suicide.  

Or consider recent research from the International Policing and Public Protection Research Institute (IPPPRI), which uncovered a worrying trend: offenders are increasingly turning to artificial intelligence (AI) to create child sexual abuse material.  

These online predators are teaching themselves how to use AI to generate child sexual abuse material, drawing on freely available resources shared in dark web forums. As the technology advances, they are moving towards more graphic and extreme content.  

Why this matters now more than ever 

Some might argue that AI-generated abuse material is not “real,” but this is dangerously simplistic. AI makes it possible to create child sexual abuse material on an industrial scale, and as it proliferates, so does the risk of normalising and escalating abuse.  

Research from the Internet Watch Foundation (IWF) suggests that most AI-generated CSAM now being found is realistic enough to be treated as ‘real’ CSAM. The most convincing examples are visually indistinguishable from genuine material, even to trained IWF analysts.  

The evolution of this technology is happening at such a pace that regulatory frameworks and law enforcement are struggling to keep up. Encryption, designed to protect user privacy, is often misused to shield criminal activity, making it harder for authorities to track down offenders. Right now, the balance between privacy and child protection is skewed, and it’s children who are paying the price. 

This abuse leaves lifelong scars on its victims. Beyond the immediate trauma, survivors often struggle with depression, PTSD, and long-term emotional damage. The continued circulation of their abuse online means they are re-victimised again and again. For many survivors, knowing that these images or videos will exist indefinitely is a source of ongoing distress. 

What must be done: a call to action 

The escalating severity of online child sexual abuse and exploitation demands an urgent, coordinated response from governments, tech companies, law enforcement, and civil society. Here’s what needs to happen: 

  1. Strengthen legislation, regulation and international cooperation: Governments must pass and enforce stronger laws to hold offenders to account. This is a global crisis which cuts across borders – international partnerships between governments and law enforcement agencies can help dismantle these online criminal networks. Independent oversight of technology platforms by regulators is critical to ensure they are doing enough to protect children and deter offenders.  
  2. Increase accountability for tech companies: Social media platforms and messaging apps must take greater responsibility for the content shared on their services. This means investing in safety by design and in AI-driven tools to detect and remove CSAM before it spreads, and cooperating with law enforcement in cases of criminal activity. 
  3. Improve reporting mechanisms: Many victims or witnesses don’t know how to report abuse or get images taken down, or fear their reports won’t be taken seriously. Governments, NGOs and tech companies must create user-friendly reporting tools and ensure that victims receive the support they need. 
  4. Expand support services for survivors: More resources are needed to provide long-term support for survivors of child sexual abuse, including counselling, legal assistance and other social services.  
  5. Raise public awareness and education: Educating parents, teachers and children about the risks of online exploitation can help stop abuse before it happens. Schools should also teach digital literacy, focusing on safe internet practices and online risks.  

As we prepare for the first-ever Global Ministerial on Ending Violence Against Children in Colombia this November, the worldwide growth of online violence is a key area that governments must discuss.  

Online child sexual abuse and exploitation is an issue we have the power to address—if we move swiftly and decisively. Governments hold a critical key to bringing together tech and civil society solutions to protect the most vulnerable among us.  

The future depends on what we do next, and the time for action is now. Children deserve nothing less.  
