Child sexual abuse material

The issue:
Understanding Child Sexual Abuse Material (CSAM)

What is Child Sexual Abuse Material?

Child Sexual Abuse Material (CSAM) refers to sexually explicit content involving a child. This can include photographs, videos, or computer-generated images that depict a minor in a sexually explicit manner.

How has CSAM distribution evolved?

In the past, child abuse images were distributed through direct exchanges among offender networks, illegal storefronts, or mail services. Today, advances in technology mean that abuse imagery can be uploaded and shared worldwide within seconds.

The volume of CSAM online is increasing rapidly, and discovering it can take just three clicks.

The impact of CSAM

CSAM creates a permanent record of a child’s abuse that not only serves the abuser but also fuels the fantasies of collectors worldwide. Unfortunately, the exploitation does not end there. Once distributed, these images can be weaponized to manipulate the child further, whether to obtain more images, arrange a physical meeting, or extort money. The material can also circulate widely among those seeking specific content, and predators may use it to groom other potential victims.

How perpetrators interact with CSAM

There are three main ways abusers interact with CSAM:

  1. Producing CSAM
    • Creating CSAM by capturing photos, videos, or audio recordings.
    • Producing textual or non-photographic visual material.
    • Manipulating existing material to generate new imagery.
  2. Searching for and viewing CSAM
    • Actively seeking CSAM on the internet.
    • Viewing or attempting to view this material.
  3. Sharing and storing CSAM
    • Sharing or storing CSAM, perpetuating the cycle of abuse.
    • Re-victimizing the abused by keeping the material in circulation.

Addressing the growing threat of CSAM requires concerted efforts from all sectors of society, including law enforcement, technology companies, and the public, to protect children and stop the spread of this harmful material.

Prevalence of CSAM

The number of reports of Child Sexual Abuse Material (CSAM) worldwide has been increasing significantly. In 2023, the US National Center for Missing & Exploited Children (NCMEC) received over 32 million reports related to online child sexual exploitation, including CSAM. Data from Childlight’s Into the Light Index suggests that one case of child sexual abuse online is reported every second.

This data reflects the growing prevalence and reporting of CSAM, driven by increased internet access, the widespread use of digital platforms, and enhanced efforts by tech companies and law enforcement to identify and report such content.

However, these figures likely represent only a fraction of the actual extent of CSAM due to underreporting and the hidden nature of these crimes. The full scale of child sexual abuse and exploitation online, and the volume of imagery it produces, is difficult to quantify; the material discovered so far likely only scratches the surface.

Who is affected?

The Internet Watch Foundation (IWF) reported that in 2021, 62% of the CSAM it assessed involved children aged 11-15 years, with a notable increase in self-generated content among teenagers. In January 2024, the IWF reported that children under ten are increasingly targeted by groomers, with a growing number of younger children coerced into performing sexually online. The National Center for Missing & Exploited Children (NCMEC) has observed similar trends, highlighting the vulnerabilities of teenagers in online spaces. Over half of the identified child victims in widely circulated CSAM are prepubescent, and many are too young to speak.

Impacts of CSAM

Children

  • Victims: physical and emotional harm
  • Survivors: ongoing trauma as abuse material remains online

Families

  • Parents and guardians: deep distress and helplessness
  • Siblings: emotional impact affecting family dynamics

Society

  • Communities and schools: need for awareness and support
  • Public health systems: long-term mental health care

Law & legal systems

  • Law enforcement: increased reports, identifying victims, apprehending perpetrators, stopping the spread of CSAM
  • Legal systems: handling complex cases, prosecuting offenders, protecting victims

Technology platforms

  • Tech companies: improving detection and removal of CSAM
  • Content moderators: psychological toll of exposure to harmful material

An emerging threat: legal content of interest to predators

Our 2023 Global Threat Assessment noted ‘legal’ content of interest to predators as a new challenge in responding to CSAM. ‘Content of interest to predators’ (COITP) refers to material such as footage of children playing or exercising, or content innocently produced by children, that is consumed by predators for sexual gratification. Unlike child sexual abuse material, COITP is not illegal.

Offender groups are using this content to try to evade current platform policies and protections. Offender communities curate it for consumption, building collections that can be shared and accessed by wider offender groups via social media or other fringe services.

Curation of this type of content is a strong signal that some form of intervention is warranted. In certain circumstances, it may also indicate a wider interest in child sexual abuse material.

In 2020, an AI ‘bot’ on Telegram generated 100,000 ‘deepfakes’ depicting real women and girls engaged in sexual acts, illustrating the scale of this issue.

Meta reported that over 90% of its reports to the National Center for Missing & Exploited Children (NCMEC) between October and November 2020 concerned shares or reshares of previously detected content.

In just one month of 2020, 8.8 million attempts to access CSAM were tracked by three of the Internet Watch Foundation’s member organizations.

The response:
Addressing the threat

Responding to CSAM requires a comprehensive approach involving various stakeholders.

Prevention and education

  • Educating children about online safety and the importance of privacy.
  • Maintaining open communication between parents and children.
  • Offering educational programs in schools and communities about CSAM dangers.

Laws and enforcement

  • Implementing strict laws against CSAM creation, distribution, and possession.
  • Collaborating globally to track and prosecute offenders.
  • Imposing severe penalties, including imprisonment and fines.

Support and resources

  • Providing victim support services, including counseling and legal aid.
  • Offering hotlines and online resources for victims and families.

Reporting mechanisms

  • Utilizing reporting features on platforms to quickly remove CSAM.
  • Key organizations such as NCMEC and the IWF play vital roles in processing these reports.

Role of technology companies

  • Developing advanced tools like AI and machine learning to detect CSAM.
  • Collaborating with law enforcement to dismantle CSAM networks.
  • Implementing robust content moderation policies to prevent CSAM spread.

User responsibility

  • Encouraging users to report suspicious activities.
  • Educating users on privacy settings and account protection.

Global collaboration

  • Coordinating international efforts to create and enforce laws.
  • Collaborating between governments, NGOs, and tech companies.
  • Advocating for increased awareness and resources to combat CSAM globally.

What our members are doing

Page last updated on 24th November 2024