Outcomes briefing – technology, privacy and rights roundtable


ECPAT International, WeProtect Global Alliance – Briefing

May 12, 2021

This briefing records the outcomes of the expert roundtable meeting ‘Technology, privacy and rights: keeping children safe from child sexual exploitation and abuse online’.

Context

  1. This roundtable, in partnership with ECPAT International, was the first in a new series of gatherings hosted by WeProtect Global Alliance. It emerged in response to the controversy surrounding the European Commission’s proposal for a temporary derogation from the ePrivacy Directive and the Electronic Communications Code. This has shone a spotlight on the challenges inherent in balancing privacy with child protection, as well as the need for consensus on the proportionate use of innovative technology by private companies to proactively identify children at risk of or experiencing exploitation and abuse.
  2. This debate has also highlighted the need for careful consideration of proposals for long-term legislation and structures that ensure a robust system for preventing and responding to child sexual exploitation and abuse (CSEA) in digital environments, including through regulation of digital service providers, without undermining fundamental rights.
  3. A group of experts representing various sectors, including data protection, privacy, AI and technology, child rights and victim support, was invited to explore the legal basis for the use of tools to detect CSEA online, and to discuss the privacy and child safety implications of these tools from different perspectives.
  4. The primary objectives were to identify common ground and solutions that sufficiently balance the rights of all users of the internet with the specific rights of children, in particular victims of CSEA online.
  5. Three key discussion questions guided the debate:
    1. Does existing legislation in Europe (and the US) provide a legal basis for the use of detection tools?
    2. What can we learn from the implementation of existing cybersecurity tools?
    3. Is there identifiable common ground between privacy and child protection advocates?
  6. The discussion was also built on the premise that arguments need to be evidence-based in terms of both the legislation and the functionality of the tools, specifically and generically. The discussion was therefore grounded in a legal opinion on the basis in the GDPR for the use of technology to detect known child sexual abuse material (CSAM).
  7. While acknowledging that legal opinions can be made to support different sides of this debate, there are strong arguments to support the position that service providers can base the processing of personal data in the context of detecting and reporting CSAM either on a task carried out in the public interest (Art. 6.1 (e) GDPR) or on legitimate interest (Art. 6.1 (f) GDPR). The former legal basis requires a provision in Union or Member State law in which this public task is set forth, or on which it can at least be based. The national transposition of Article 16.2 of the CSA Directive may provide such a provision. That same provision can then be used to invoke reasons of substantial public interest (Art. 9.2 (g) GDPR) to obtain an exemption from the prohibition on processing special categories of personal data.
  8. Because the ePrivacy Directive prevails over the GDPR with regard to the processing of confidential information by providers of Number-Independent Interpersonal Communications Services (NI-ICS), a derogation enables service providers that offer a range of online services to use a more holistic, cross-service approach to ensure compliance with data protection law while taking efficient action against CSEA online.
  9. This responds in part to the opinion of the European Data Protection Supervisor on the interim derogation, which states that: “Confidentiality of communications is a cornerstone of the fundamental rights to respect for private and family life. Even voluntary measures by private companies constitute an interference with these rights when the measures involve the monitoring and analysis of the content of communications and processing of personal data. In order to satisfy the requirement of proportionality, the legislation must lay down clear and precise rules governing the scope and application of the measures in question and imposing minimum safeguards, so that the persons whose personal data is affected have sufficient guarantees that data will be effectively protected against the risk of abuse.”
