AI-produced child sexual abuse material: Insights from Dark Web forum discussions 


By Dr Deanna Davy, Senior Research Fellow with the International Policing and Public Protection Research Institute. 

September 2024

Newspaper headlines and social media feeds are abuzz with stories about the potential positive effects of Artificial Intelligence (AI) on society, but scant attention is paid to its detrimental effects. A core area of concern, and one that requires much more scholarly attention, awareness-raising, and deterrence, is the use of AI to create child sexual abuse material (CSAM).

Agencies such as the WeProtect Global Alliance and the Internet Watch Foundation (IWF) sounded the alarm about AI CSAM in 2023, highlighting it as an area of concern for governments, civil society organisations, private sector agencies, and parents and children. IWF's 2023 research found that offenders are taking images of children, often famous children, and applying deep learning models to create AI CSAM.

There are currently two major categories of AI CSAM: (1) AI-manipulated CSAM, in which images and videos of real children are altered into sexually explicit content, and (2) AI-generated CSAM, in which entirely new sexual images of children are manufactured (Krishna et al., 2024). IWF reported in 2023 that both types of AI CSAM are prevalent.

Researchers at the International Policing and Public Protection Research Institute (IPPPRI) wanted to understand more about what Dark Web forum members were saying and doing with regard to AI CSAM. To do this, we examined forum members' posts and discussions about AI CSAM. We collected the data through Voyager, an open-source intelligence (OSINT) platform that collects, stores, and structures content from publicly accessible online sources, including Dark Web forums where CSAM-related discussions take place. Data collection was conducted in January 2024 using a keyword search of the 'Atlas' dataset of Dark Web child sexual exploitation (CSE) forums in Voyager. Searching for "AI" OR "Artificial intelligence" across a 12-month date range covering 2023 returned 19,951 results: 9,675 hyperlinks, 9,238 posts, 1,021 threads, and 17 forum profiles. We then analysed a sample of these results to gain a preliminary picture of what forum members were saying and doing regarding AI CSAM.

What we discovered is deeply worrying. First, there is a real appreciation of, and appetite for, AI CSAM. Forum members refer to those who create AI CSAM as 'artists', prizing their ability to take an image of, for example, a favourite child film character and generate a plethora of AI CSAM depicting that child. At present, forum members are particularly interested in AI CSAM of famous children, such as child actors and child sports stars.

We discovered that the forum members creating AI CSAM are not IT, AI, or machine learning experts; they are teaching themselves. Tutorials and manuals on how to create AI CSAM are widely shared in Dark Web forums and easily accessed. Members then reach out to others who already have experience creating AI CSAM, asking how to 'train' the software and how to overcome problems they encounter (such as the child in an image having too many limbs or digits). As part of this effort to improve their production skills, forum members actively ask others to share CSAM that they can use for 'practice'.

We also found evidence that, as forum members develop their skills in producing AI CSAM, they actively encourage others to do the same. This is of particular concern because it can feed demand and create a continual upskilling loop: as more forum members view AI CSAM and become interested in creating it themselves, they hone their AI skills, share their AI-produced CSAM, and encourage others to create and share more.

We also discovered that some forum members are already moving from creating what they describe as 'softcore' AI CSAM to more 'hardcore' material. This pattern may be driven by normalisation of, and desensitisation to, the material, and by the search for increasingly explicit and violent content.

It was also clear that forum members hope AI will continue to develop rapidly, so that in the near future it will be impossible to tell whether a sexual image of a child is real. They also hope that AI will advance to the point where they can create increasingly hardcore and interactive material, such as interactive videos in which they can instruct a video character to perform sexual acts.

On the very day we published these findings, a man was convicted in a landmark UK case of creating more than 1,000 sexual images of children using AI. Our analysis of Dark Web discussions of AI CSAM suggests that this convicted individual is just one of many committing such crimes.

This is not a niche area; on the contrary, the creation of AI CSAM is heading towards the mainstream. That is why we need a rapid and unwavering response. The cat is already out of the bag, so to speak: offenders are adding this tool to their toolkit.

Our task now is to limit the expansion of this phenomenon through legal reform, robust deterrence measures, further evidence generation, and awareness-raising.

Dr Deanna Davy is a Dawes Senior Research Fellow with the International Policing and Public Protection Research Institute. Deanna has worked in research on trafficking and child sexual exploitation for a number of government agencies, international organisations, and non-government organisations, including the United Nations Office on Drugs and Crime, the International Organization for Migration, the United Nations Children's Fund, and ECPAT International. Prior to joining the IPPPRI team, Deanna was employed as a Research Fellow (modern slavery) at the University of Nottingham.
