After three members of Twitter's so-called Trust & Safety Council resigned and claimed council members had taken great care in "protecting users on the platform," Elon Musk fired back, tweeting in response: "It is a crime that they refused to take action on child exploitation for years!"
When former CEO Jack Dorsey responded that Musk's claim was "false," Musk shot back: "When Ella Irwin, who now runs Trust & Safety, joined Twitter earlier this year, almost no one was working on child safety."
Musk added that Irwin, trust and safety chief since Nov. 18, had previously raised her concerns with former CEO Parag Agrawal and ex-chief financial officer Ned Segal, but "they rejected her staffing request."
Musk concluded that he has made child safety his "top priority," before adding that Segal had "super messed up priorities."
Three members of Twitter's Trust & Safety Council resigned on Thursday. The departing members are Anne Collier, founder and executive director of The Net Safety Collaborative; Eirliani Abdul Rahman, co-founder of Youth, Adult Survivors & Kin In Need; and Lesley Podesta, an adviser to the Young and Resilient Research Center at Western Sydney University.
In their resignation letter, the three said: "The establishment of the Council represented Twitter's commitment to move away from a US-centric approach to user safety," stressing the council's global approach before Musk's takeover.
They claimed that Musk's reliance on "automated content moderation" will create more "abuse and hate speech" on the platform.
The statistics they cited were from the debunked Anti-Defamation League and the Center for Countering Digital Hate.
"Three of us resigned from Twitter's Trust & Safety Council today: @eirliani @podesta_lesley and me. Here's why." — annecollier (@annecollier) December 8, 2022
According to a lawsuit filed in January 2021, Twitter knowingly hosted individuals who used the platform to exchange child sexual abuse material, and profited from it by interspersing ads between tweets advertising or requesting the material.
The federal lawsuit stated that Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies.”
The lawsuit, filed by the victim and his mother in the Northern District of California, alleges Twitter made money off the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn.
At some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.
The victim said he filed a complaint with Twitter, stating that two tweets depicted child pornography of himself and needed to be removed because they were illegal, harmful and violated the site's policies.
On Jan. 28, 2021, Twitter said it wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets, the suit states.
“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response reads, according to the lawsuit.
In his response, published in the complaint, the victim states: "What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down."
The videos were not removed until Jan. 30, 2021, when Twitter was contacted by the Department of Homeland Security.