Tipsheet

Second Victim Joins Lawsuit Accusing Twitter of Allowing Sex Trafficking

AP Photo/Matt Rourke

Last week, a second accuser joined a lawsuit against Twitter alleging the tech giant illegally profited from the sharing of sexually trafficked content.

Filed in January, the original lawsuit was brought by a single plaintiff, known as John Doe #1, who alleged that sexually explicit content of him, recorded when he was a young teenager, was monetized and shared across Twitter, and that the platform denied requests to have the explicit images taken down. 


As of last week, a second alleged victim, referred to as John Doe #2, has joined the federal lawsuit alleging he was likewise harmed by the irresponsible actions of Twitter. 

“The National Center on Sexual Exploitation Law Center (NCOSE), The Haba Law Firm, and The Matiasic Firm are suing Twitter on behalf of a second survivor of child sexual abuse who was trafficked on the social media platform,” said a statement released by NCOSE. 

“Both plaintiffs,” the statement continues, “were harmed by Twitter’s distribution of material depicting their sexual abuse and trafficking, and by Twitter’s knowing refusal to remove the images of their sexual abuse (child pornography) when notified.”

John Doe #1, who is now 17, says he was no older than 14 when he first exchanged sexually graphic images over Snapchat with someone he believed to be a teenage girl. In reality, the supposed girl was an adult sex trafficker, and shortly thereafter Doe #1 found himself being blackmailed into sending more explicit content. 

Doe #1 complied with the trafficker’s demands at first, but he ultimately blocked the blackmailers and hoped to put the traumatic incident behind him. That was not the end of it, however: in 2019, some of the extorted videos surfaced on Twitter. As the videos spread online, Doe #1 only learned of them when his classmates began to tease and bully him.


Doe #1 and his family reported the child pornography to Twitter on multiple occasions, but ultimately, the company refused to take down the explicit posts until law enforcement got involved. 

In fact, according to the lawsuit, Twitter’s original response to Doe #1’s report that underage and explicit videos of himself were being housed on their platform was stunningly underwhelming. 

“Thanks for reaching out,” wrote a Twitter support agent, “We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” This response might very well be laughable if not for its horrid practical consequence of allowing child pornography to proliferate across Twitter. 

For its part, Twitter has argued that it did not willfully neglect the requests to remove the explicit content; rather, the company says it simply takes time for such reports to be addressed on a platform as large as Twitter. 

A Twitter spokesperson clarified the company’s position in earlier reporting, saying, “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy.” 


Going forward, the legal battle will likely center on Section 230 of the Communications Decency Act, which shields social media companies from liability for content posted to their platforms by users. 

Twitter has filed a motion to dismiss the case based on Section 230, and the court will hear arguments beginning on June 4. 
