Apple’s fresh approach to child protection can be a game-changer
Chloe Setter, Head of Policy for WeProtect Global Alliance, looks at Apple’s newly announced plans to boost its child protection efforts and explains why it feels like a pivotal moment in child safety.
Apple, a long-standing member of our Alliance, has announced major steps to prevent child exploitation and abuse on its devices and services.
This news is of particular significance because Apple’s planned approach will allow for the detection of child sexual abuse material (CSAM) in an encrypted environment – something that has posed a challenge to child protection actors for some time.
Data encryption is used widely across the world, providing greater security and privacy for internet users. But for all of its benefits to adults and children alike, encryption also poses a challenge to those seeking to protect children, in particular those trying to detect and remove child sexual abuse content online and prevent grooming – and to law enforcement attempting to prosecute offenders.
The company’s newly announced approach is significant because Apple has adopted device-level safety measures (alongside server-side technology) – a key recommendation from a recent WeProtect Global Alliance expert roundtable that explored solutions for detecting child sexual exploitation and abuse whilst balancing internet users’ right to privacy.
Apple’s expanded protections
The company plans to introduce new child safety features in the United States across three key areas:
- New communication tools will warn children and parents when they receive or send sexually explicit photos. Apple’s Messages app will use on-device machine learning to analyse image attachments and determine whether a photo is sexually explicit. Because the analysis happens entirely on the device, Apple itself never sees the messages, preserving end-to-end encryption in iMessage.
- To improve detection of CSAM, iOS and iPadOS will use a new hashing technology, NeuralHash, allowing Apple to detect known CSAM images stored in iCloud Photos (a simplified sketch of this kind of hash matching appears after this list). According to Apple, this detection will help the company provide information to the National Center for Missing and Exploited Children (NCMEC), which acts as a reporting centre for CSAM and works in collaboration with law enforcement agencies.
- Finally, updates to its Siri and Search functions will give parents and children additional information and help if they encounter “unsafe situations”. Additionally, Siri and Search will “intervene when users try to search for CSAM-related topics”.
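For readers curious about the mechanics, here is a minimal sketch of the general technique the second bullet describes: hashing images and matching them against a database of hashes of known material. It is emphatically not Apple’s NeuralHash, whose design Apple has not published in full and which is wrapped in additional cryptographic safeguards (for example, a threshold number of matches is required before anything is surfaced for human review). The sketch uses a simple average hash built on the Pillow imaging library; the file names and the match threshold are hypothetical.

```python
# Illustrative only: a generic perceptual-hash lookup, NOT Apple's NeuralHash.
# Perceptual hashes change little when an image is resized or re-encoded,
# so near-duplicates of a known image can still be matched.
from PIL import Image  # Pillow: pip install Pillow


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to a tiny grayscale grid, then set one bit per
    pixel depending on whether it is brighter than the grid's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_database(path: str, known_hashes: set[int],
                           threshold: int = 5) -> bool:
    """Report a match if the image's hash is within `threshold` bits of
    any hash in the known-material database."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)


# Hypothetical usage: in a real system the hash list would come from NCMEC
# and matching would run under cryptographic protections, not in the clear.
# known = {average_hash(p) for p in ["known_image_1.jpg", "known_image_2.jpg"]}
# print(matches_known_database("new_upload.jpg", known))
```

The design choice worth noting is that matching is done against hashes of specific known images, not by judging image content, which is what allows detection to work without the provider inspecting every photo directly.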
Why is this important for child safety?
Some have raised concerns that these new technologies may pose a threat to privacy, but the company maintains that it can deploy them in a way that preserves user privacy. This is critical: privacy is a fundamental right for everyone using the internet, including victims and survivors of sexual abuse, who face re-traumatisation each time material depicting their abuse is shared online.
Balancing privacy and child safety in an increasingly encrypted online world is not simple – and so it is hugely encouraging to see Apple recognise and respond to the challenge, including through the use of device-level safety measures.
As an Alliance, we’ll be sharing the lessons from this development with our 200+ member organisations, companies and governments, and encouraging others to explore and invest in solutions that work for their own platforms, devices and services – many of which have already been discussed in a thread on our new Protectors’ Portal for members.
We’re yet to see the impact of these tools. But we do know that, across the board, more can be done, in particular to enable detection of previously unseen CSAM and grooming attempts, and to ensure tools achieve wide geographical reach and uptake.
However, with this latest announcement Apple is making a bold statement: child protection matters.
It’s a powerful message and one that needs to be heard across the world. With more than 21.7 million reports of suspected CSAM made to NCMEC in 2020 alone, finding solutions to create a digital world that is safer for children should be a top priority for everyone.
The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of WeProtect Global Alliance or any of its members.