Big Tech companies enabling the circulation of child sexual abuse material (CSAM) could be held accountable under the "STOP CSAM Act," a bipartisan bill soon to be reintroduced by Sens. Josh Hawley (R-MO) and Dick Durbin (D-IL).
Hawley, chairman of the Senate Crime and Terrorism Subcommittee, and ranking member Durbin held a hearing Tuesday on advancing the STOP CSAM Act, which would crack down on the online proliferation of child pornography and allow victims to sue social media platforms that host this harmful material.
Survivor advocate Taylor Sines, a victim of online enticement and child sexual exploitation, testified about the challenges of getting CSAM removed from social media sites such as Facebook and Instagram. According to Sines, Facebook personnel told her as a teenager in April 2016 that the hundreds of images and videos her abuser had posted of her would not be taken down because they did not violate the platform's guidelines at the time.
"While I no longer have content circulating the public web or public platforms, my images could continue being traded on the dark web," Sines said, further asking, "And what happens if they are posted publicly again?"
John Tanagho, executive director of the International Justice Mission's Center to End Online Sexual Exploitation of Children, raised concerns about companies failing to address the livestreaming of child sexual abuse.
"Why would you ever look the other way when there's an active crime scene happening on your platform and through your services?" Tanagho questioned. "You have the technology to detect it in real-time."
Hawley, citing Tanagho, noted that companies frequently complain that "it's just too expensive" to deter such abuse. Yet in 2023, the senator pointed out, Meta had $23 billion in profits, Google $60 billion, Apple $97 billion, and Amazon $27 billion. "So don't tell me they don't have the wherewithal financially to disrupt, detect, and report," Hawley retorted.
John Pizzuro, retired commander of New Jersey State Police's Internet Crimes Against Children (ICAC) Task Force and now the CEO of Raven, a child advocacy organization, told the subcommittee that Big Tech platforms are only required to provide contact information under the current CyberTipline statute.
These platforms often possess a trove of information on a particular offender, Pizzuro said, yet "they simply choose not to report it" to the National Center for Missing and Exploited Children (NCMEC). This lack of coordination impedes law enforcement, often leaving investigators without enough information to act on a tip and open an investigation.
"You've heard the statement: 'garbage in, garbage out.' In many instances, that is what we get," Pizzuro said. "Some providers even submit blank reports, which is all they currently need to do to comply with their legal obligations."
If passed, the STOP CSAM Act would compel electronic service providers to include additional data in their reports to NCMEC, such as offenders' names, account identifiers, and IP addresses, giving law enforcement the information needed to open actionable investigations and make arrests.
"We can no longer trust Big Tech to submit what is needed when so many do the bare minimum required," Pizzuro declared.
Platforms are also currently required to report only an "apparent" violation. The revised language would mandate that providers report "apparent, planned, and imminent" violations. Social media sites often detect grooming behavior or other signs of child exploitation, Pizzuro explained, but they have total discretion over whether to report information about offenses that are about to occur.
"This revised language will undoubtedly result in saving children from abuse before it happens," Pizzuro said. "As a former ICAC Commander, I can say that our best day is when we can rescue a child before they are abused."
First introduced in 2023, the STOP CSAM Act never received a vote on the Senate floor last Congress, despite being unanimously advanced by the Senate Judiciary Committee.
As written in 2023, the comprehensive child protection measure contains several provisions: removing Section 230 immunity for civil CSAM claims against tech companies; establishing a private right of action so that victims can sue platforms that fail to remove CSAM in a timely fashion; creating a Child Online Protection Board within the Federal Trade Commission (FTC), responsible for enforcing the removal of CSAM and authorized to fine those that fail to comply; and stripping corporate discretion over whether to report child sexual exploitation offenses.
In January 2024, the CEOs of Discord, Meta, Snapchat, TikTok, and X faced questioning from the Senate Judiciary Committee over how their sites had become hubs for child sexual exploitation and whether they should be held civilly liable for knowingly distributing CSAM.
Snapchat, long used to send sexually explicit content, began as a sexting app in which photographs disappeared within seconds by design. The popular application has since become a go-to tool for sexual predators targeting children.
In one high-profile case, an online predator groomed a 12-year-old girl over Snapchat and coerced her into sending him sexually explicit content, knowing "the chats [would] go away." When the child, identified in court as L.W., sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act.