STOP CSAM Act leaves loophole for tech: Statement from the Phoenix 11 to U.S. Senators
On March 12 we sent a statement listing our concerns about the STOP CSAM Act to the offices of key U.S. Senators. This statement was made public on March 14.
We, the Phoenix 11, are very concerned about how Section 2255A of the STOP CSAM bill can be used as a loophole for tech companies to avoid accountability and action to remove CSAM from their online platforms. We want to hold accountable everyone who is part of the process of creating, distributing, hosting, facilitating the transfer of, and viewing CSAM. Every aspect of the sharing of CSAM adds to the harm survivors experience. When CSAM images are shared, hosted, or transferred across online platforms, it contributes to the digital rape and digital sex trafficking of CSAM survivors. It contributes to their continued victimization and makes it harder for them to heal and feel whole.
The first generation of online CSAM survivors are now in their 30s and early 40s, and yet their CSAM images are still circulating online, and they continue to be harmed by it. We are part of that generation, and we have seen the problem grow rather than shrink. There are now so many more ways for children to be harmed online and for CSAM to be shared without safeguards in place to stop it. Technology changes so fast that legislation struggles to keep up with it. There are now three generations of children’s sexual abuse images online, and we are grieved by how little has changed in the online landscape to prevent and combat the issue. It should not take three generations of a growing number of victims to motivate companies and policymakers to make much-needed changes.
When survivors’ CSAM images are circulated online, their human rights are violated. Images of their bodies are used by perpetrators in illegal ways. The loss of bodily autonomy, and of control as others use their images for sexual purposes, creates intense distress and ongoing trauma for survivors. Tools are available for tech companies to screen for and remove CSAM if they choose to utilize them.
The Phoenix 11 believes that tech companies need to be responsible for detecting and removing CSAM on their platforms rather than allowing options for it to be hosted and shared without consequence. We do not support any loopholes for tech companies to avoid responsibility for the CSAM that is facilitated by their platforms. These kinds of loopholes reward pedophilic behavior, aid the further distribution of CSAM images, and contribute to the growth of the issue.
There is so much opportunity for technology to be used for good in the fight against CSAM. There are already good tools available to flag and remove CSAM, and there is room for the creation of more tools to detect and remove CSAM across digital platforms without requiring humans to review the images. We ask for legislation that motivates tech companies to use and create these kinds of tools rather than legislation that removes the responsibility from them or even allows them to profit from users sharing CSAM. We would hope that tech companies would strive to be known for the protection of children and victims rather than the facilitation of further victimization. We will continue to hold tech companies to this standard, and we ask that policymakers do as well.
Sincerely,
Phoenix 11