Turning principles into action
Following an industry-only webinar for tech members of the Alliance, Liz Thomas (Regional Digital Safety Lead in Asia-Pacific for Microsoft) gives a brief overview of some of the key questions and themes that arose at the event.
A year has now passed since the launch of the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse. Addressing this issue remains as critical as ever: unfortunately, the US National Center for Missing and Exploited Children received its highest-ever number of reports over the last year. Reversing this trend will require a multi-stakeholder effort, including work by governments, technology companies, and civil society.
The Voluntary Principles aim to provide a flexible high-level framework for technology companies to combat online child sexual exploitation and abuse.
To assist members of the technology industry in considering how they might operationalize the Voluntary Principles, the six companies that initially supported the Principles (Facebook, Google, Microsoft, Roblox, Snap and Twitter) have developed “A guide for tech companies considering supporting the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse”.
This guide is available on the WePROTECT Global Alliance website.
To mark the launch of the Guide, these six companies partnered with the Alliance to host an industry-only “Q&A” session on operationalizing the Voluntary Principles on 25 February 2021. Held under the Chatham House Rule, the session provided a forum for in-depth conversation on considerations for companies of all sizes when thinking about how they might implement the Principles. Policy, legal, and operational experts from the six companies were on hand to answer questions and share information about existing tools and resources.
The session included an overview of the new Guide and a joint presentation from the six companies, focusing on three of the 11 Voluntary Principles:
- Principle 2, on preventing the dissemination of new child sexual abuse material (CSAM)
- Principle 3, on combating grooming, and
- Principle 6, on search.
Following an overview of the Guide from Microsoft, experts from Facebook, Roblox, and Google shared insights on how their respective companies approach these issues. In addition, Twitter shared reflections from the perspective of a mid-sized company.
Emerging themes and practical advice
The Q&A segment took center stage, with questions about the anti-grooming tool, staying ahead of the evolving threat, and sharing hashes and other signals garnering significant attention. Other discussion topics included resources for new companies in building their moderation functions, supporting staff who deal with CSAM, and managing accounts and other consequences for perpetrators.
The wide-ranging conversation highlighted the value of cross-industry collaboration and learning from others, as well as the need for flexibility, given the unique challenges each platform faces.
‘Flash polls’ conducted at the start and conclusion of the event showed that about a quarter of participants (24%) had limited or no knowledge of the Principles prior to the webinar, and 95% said they had come away with new information, tools or resources to leverage in their CSAM content moderation efforts. In addition, more than three-quarters (76%) of participants said it was “quite likely” or “very likely” that their companies would consider operationalizing the Voluntary Principles over the next 12 months.
On behalf of the six companies, thank you to WePROTECT Global Alliance for their partnership in holding the Q&A session and in hosting the Guide, and to all session attendees for their interest and engagement!
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of WePROTECT Global Alliance or any of its members.