A week ahead of the Senate Judiciary Committee hearings with Big Tech CEOs in Washington, the Canadian Centre for Child Protection (C3P) is standing with child sexual abuse material (CSAM) survivor advocates. The Phoenix 11, a powerful group of survivors, are asking members of the Senate Committee to question Mark Zuckerberg on the record about the harm that children will endure because of Meta’s December 6, 2023 decision to fully implement end-to-end encryption (E2EE) on Messenger and Instagram.
For six years, these CSAM survivors have been sharing their voices to help the world understand the traumatic, lifelong impacts of this crime. CSAM is spread on tech platforms, including Meta’s, every minute of the day, meaning a survivor’s abuse never ends. The Phoenix 11 have sent a letter to the members of the Senate Judiciary Committee asking that Mark Zuckerberg be questioned about Meta’s decision to move to E2EE without implementing proper safeguards to detect the sharing of known CSAM.
“If Mark Zuckerberg is going to stand by his business decision that prioritizes profit over children and survivors, decline accountability for the egregious consequences, and refuse to address concerns raised by survivors and child safety experts, we will not let him do so easily. We will not be silenced,” said the Phoenix 11. “We were raped and tortured while being photographed and filmed. We had no way of knowing as children that our perpetrators would one day come to include internet platforms that serve to facilitate the sharing, uploading, and downloading of our most horrendous moments. We grew up being told by our abusers that abuse is inevitable. Meta is telling us the same through their decision to fully implement E2EE; we need them to take responsibility for that.”
Meta’s choice to fully implement E2EE means the millions of reports that drive global law enforcement responses against child sexual abuse and exploitation will soon cease to exist. According to the National Center for Missing and Exploited Children, “in 2022 alone, Meta reported more than 20 million incidents of offenders sharing these unimaginable images via Facebook and Messenger.”1
C3P and the Phoenix 11 are also launching a website today that documents the survivors’ years of advocacy work and features their powerful advocacy statement, along with resources and messages for other survivors.
“Survivors shouldn’t have to call for accountability of the tech companies who help facilitate the ongoing crimes and harms against them, but for years this brave group of women have had to do exactly that,” said Lianna McDonald, C3P Executive Director. “From Australia to the White House, the Phoenix 11 have been giving a voice to CSAM victims who have, for so long, been silenced and shut out of the conversations that impact their daily lives, their rights and their safety.”
To view the Phoenix 11’s full letter, please visit thephoenix11.com/senate2024.
Questions submitted to the Senate Judiciary Committee for Mark Zuckerberg:
- Meta’s own consulting firm, Business for Social Responsibility (BSR), produced a Human Rights Impact Assessment in 2022. Not only did BSR advise risk mitigations if Meta were to fully implement E2EE, but BSR also had access to internal estimates from Meta’s risk assessment modelling related to that decision. What is Meta’s internal estimate of the reduction in child sexual abuse material reports with the rollout of default E2EE? How many children sexually abused and exploited on Meta’s platforms does Meta consider an acceptable number?
- The National Center for Missing & Exploited Children (NCMEC) warns that 70% of the reports it receives from Meta each year could be lost with the implementation of E2EE in the absence of appropriate risk mitigations. Modelling by the United Kingdom’s National Crime Agency indicates that as many as 92% of messenger reports from Facebook and 85% from Instagram will be lost to law enforcement. Does Meta contest that reports of child sexual abuse and exploitation will drastically decline on its messenger platforms? If so, by which metrics?
- Meta has stated that they are introducing default E2EE to protect privacy. What is Meta’s plan to prioritize the privacy of children and survivors whose child sexual abuse material lives on their platforms, exposing the worst moments of their lives to strangers every day?
- What child safety organizations that have not received financial support from Meta support Meta’s E2EE rollout on its messenger platforms?
- Did Meta consult their Safety Advisory Council prior to this decision? When was this consultation done, and what were their responses collectively and individually?
- Does Meta have measures in place to protect consumers’ messaging content from malware, viruses, and potentially fraudulent URLs? If so, can Meta explain how this software differs from the client-side scanning proposed by numerous child safety organizations using PhotoDNA, pioneered by Dr. Hany Farid? What research and data is that analysis based on, and what research and data has Meta relied upon to prioritize malware and virus protection while ignoring the detection and removal of child sexual abuse material?