Self-generated sexual material
Understanding self-generated material
What is ‘self-generated’ sexual material?
Self-generated sexual material includes a wide range of images or videos created and shared by adolescents themselves. This can happen consensually between peers or under coercion, involving grooming, pressure, or manipulation.
While the term “self-generated” is widely used in current policy discussions, it is not universally understood or accepted. Some experts suggest using the term “image-based sexual exploitation and abuse of children” to avoid confusion with adult-produced abuse material. We place “self-generated” in quotation marks to avoid implying willingness on the part of the child or young person involved.
When does harm occur?
Self-generated sexual material isn’t inherently harmful. Adolescents might share such material as part of a normal developmental exchange. However, harm arises in situations such as:
- coercion into producing sexual material
- sharing material against someone’s wishes
- misusing non-sexual material for sexual purposes.
Economic factors and exploitation
There is evidence that some young people produce sexual imagery to escape poverty. In research conducted in Ghana, children cited financial motivation as a primary reason for creating and selling sexual material. Although the overall rate of economically motivated sexual exploitation is low, ongoing economic hardship, worsened by crises such as the COVID-19 pandemic, suggests this trend may persist.
Additionally, inadequate sexuality and healthy relationship education in many countries leaves children seeking information from unreliable sources, such as social media or pornography, which can distort their understanding of sexuality and relationships.
Changing norms and increasing detection
Changing societal norms partially explain the rise in detected self-generated sexual material. Diverse motives, from voluntary to coerced production, make this a complex issue. Most studies, including the Alliance’s research, indicate that voluntary motives are common, while fewer instances involve threats, grooming or financial gain.
Barriers to seeking help
Research we conducted with Praesidio Safeguarding in Ghana, Ireland, and Thailand revealed that fear of legal repercussions prevents many children from seeking help when dealing with self-generated sexual imagery. Children often refer to this content as a ‘pic’ or ‘selfie’ without acknowledging its explicit nature.
Reasons for sharing sexualized images
Children and adolescents share sexualized images for various reasons, including:
- experimentation and exploring their identity
- as part of a romantic relationship
- pressure from others
- online grooming.
In some instances, younger children, out of curiosity, might innocently share images without understanding the implications. More serious cases involve exploitation by adults or peers, where children are groomed, deceived, or extorted into producing and sharing further content. Understanding these dynamics is crucial for addressing the issue effectively and supporting affected children and adolescents.
The response: addressing the threat
Educational programmes
Addressing self-generated child sexual abuse material requires a multifaceted approach involving education, legal frameworks and robust support systems.
Educational programmes should emphasise the importance of consent, online safety, and the potential risks of sharing explicit content. These programmes must also encourage open communication between parents and children, helping young people understand the consequences of their online actions.
Schools and communities can play a pivotal role by providing comprehensive sexuality and healthy relationship education that reflects children’s lived experiences and equips them with the knowledge to navigate digital spaces safely.
Legal frameworks
Legal frameworks need to be robust and clear, targeting those who coerce, manipulate, or distribute self-generated child sexual abuse material while safeguarding the victims. Law enforcement agencies should be trained to handle such cases sensitively, ensuring that children feel safe to seek help without fear of criminalisation.
Technological solutions
Technological solutions, including advanced content moderation, age verification systems, and anonymous reporting mechanisms, are crucial in identifying, preventing, and removing self-generated material from online platforms.
Collaboration and public awareness
Collaboration among governments, tech companies, civil society and international organisations is essential to develop and enforce policies that protect children. Public awareness campaigns can help shift societal attitudes, making it clear that any form of child exploitation is unacceptable. Additionally, accessible mental health services and support networks are vital for helping affected children recover and rebuild their lives.
Page last updated on 24th November 2024