Letter to Senate Judiciary Committee


On January 11 we sent the Senate Judiciary Committee a letter requesting that Mark Zuckerberg be asked six questions from survivors during the January 2024 Hearing with Five Big Tech CEOs on their failure to protect children online. This letter was made public on January 24, 2024.


Phoenix 11

January 11, 2024

The Honorable Dick Durbin, Chair
Senate Judiciary Committee
711 Hart Senate Office Building
Washington, D.C. 20510

The Honorable Lindsey Graham, Ranking Member
Senate Judiciary Committee
211 Russell Senate Office Building
Washington, D.C. 20510

Dear Chairman Durbin and Ranking Member Graham,

The Phoenix 11 commends the Senate Judiciary Committee on requiring the attendance of five of Big Tech’s most prominent CEOs to testify at your hearing on Online Child Sexual Exploitation. We are a group of survivors whose child sexual abuse was recorded, and in most cases, is known to be distributed online. We appreciate all efforts the Senate Judiciary Committee has made to engage the voices of survivors in these critical discussions. We write to you today to respectfully request that the following questions be asked of Meta’s CEO, Mark Zuckerberg, while under oath. We further ask that our questions be submitted for the record.

As survivors, we bear the consequences when decisions are made that prioritize profit over children. We were raped and tortured while being photographed and filmed. We had no way of knowing as children that our perpetrators would one day come to include internet platforms that serve to facilitate the sharing, uploading, and downloading of our most horrendous moments. We had no control over what was done to us as children. If Meta no longer reports these crimes against us, we alone suffer the consequences.

We grew up being told by our abusers that abuse is inevitable. Meta is telling us the same through their December 6 decision to fully implement end-to-end encryption (E2EE); we need them to take responsibility for that. When Meta announced their choice, many of us spent the following day in the offices of our therapists because without Meta being willing to be a part of the solution, our abuse happens over and over again on its platforms.

Mark Zuckerberg has refused to address concerns expressed by survivors, as well as global experts, regarding Meta’s rollout of default E2EE on its messenger platforms. Meta has created a safe haven for child predators. First, Meta allows users to opt in to E2EE in group chats. This affords child predators the opportunity to trade and request child sexual abuse material and enables perpetrators to exchange “best practices” on how to sexually abuse and exploit children. Offenders use imagery of other children being abused to groom victims and tell them this is what kids and adults do. We know this because it happened to us, and we are sickened every time we think about our abuse material being used in this way to hurt other children. Today children can be groomed, abused, and exploited by predators in group chats on Meta’s platform with a simple opt-in for encryption.

Now, not only will Meta be deploying default E2EE across all of its messenger platforms, it is also offering users the ability to send disappearing messages. It is not enough that Meta is going to mask the actual crime scenes of the sexual abuse of children; it is also going to help predators delete evidence. In masking crime scenes and allowing predators to delete evidence, Meta will be helping to facilitate an unparalleled demand for images and videos of the rapes of babies and children.

The questions we respectfully submit to the Senate Judiciary Committee for Mark Zuckerberg, CEO and Chairman of Meta, are as follows:

  1. Meta’s own consulting firm, Business for Social Responsibility (BSR), generated a Human Rights Impact Assessment in 2022. Not only did BSR advise risk mitigations if Meta were to fully implement E2EE, but BSR also had access to internal estimates from Meta’s risk assessment modelling as it relates to that decision. What is Meta’s internal estimate of the reduction in child sexual abuse material reports with the rollout of default E2EE? What number of children sexually abused and exploited on Meta’s platforms is an acceptable number for Meta?
  2. The National Center for Missing & Exploited Children (NCMEC) warns that 70% of reports they receive from Meta each year could be lost with implementation of E2EE in the absence of appropriate risk mitigations. The United Kingdom’s National Crime Agency modelling reflects that as many as 92% of messenger reports from Facebook and 85% from Instagram will be lost to law enforcement. Does Meta contest that reports of child sexual abuse and exploitation will drastically decline on their messenger platforms? If so, by which metrics?
  3. Meta has stated that they are introducing default E2EE to protect privacy. What is Meta’s plan to prioritize the privacy of children and survivors whose child sexual abuse material lives on their platforms, exposing the worst moments of their lives to strangers every day?
  4. What child safety organizations, that have not received financial support from Meta, support Meta’s E2EE rollout on messenger platforms?
  5. Did Meta consult their Safety Advisory Council prior to this decision? When was this consultation done, and what were their responses collectively and individually?
  6. Does Meta have measures in place to protect consumers’ messaging content from malware or viruses as well as possible fraudulent URL links? If so, can Meta explain how this software differs from the client-side scanning proposed by numerous child safety organizations using PhotoDNA pioneered by Dr. Hany Farid? What research and data is such an analysis based on? What research and data has Meta relied upon to prioritize malware and virus protection while ignoring the detection and removal of child sexual abuse material?
“There are real safety concerns to address before we can implement end-to-end encryption across all of our messaging services. Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion. We have a responsibility to work with law enforcement and to help prevent these wherever we can.”
— Mark Zuckerberg, March 6, 2019

To say that we are devastated over Meta’s decision would be an understatement. Mr. Zuckerberg clearly had his doubts in 2019 about implementing E2EE without the necessary safeguards to mitigate child exploitation on his platforms. The research and data that we have seen from everyone except Meta affirms that E2EE will remove a crucial tool needed to protect children. The Phoenix 11 stands in solidarity with organizations in various fields of expertise who are advocating for the best interests of our society’s most vulnerable. However, we do not wish for our voices as survivors of these crimes to become lost in the crossfire between Big Tech and these organizations, or between privacy for law-abiding adults and safety for children, which we know can co-exist. Survivors’ voices remain the least heard in discussions and decisions regarding horrific sexual abuse. Yet, we continue to be exploited because of imagery that lives on platforms such as Meta’s. Mark Zuckerberg must be held accountable for his decisions. We know too well what lies ahead for future generations if we do not act with urgency and demand transparency.

The Phoenix 11 continues to be grateful to the Senate Judiciary Committee for considering our voices, and we are eager to be included in this ongoing conversation. We thank you for your generous attention, and for your pursuit of critical answers in the fight against online child sexual exploitation and abuse.

Sincerely,

Phoenix 11