Former members of Twitter's safety council voice concerns over Musk's acquisition

RACHEL MARTIN, HOST:

The turmoil continues at Twitter after billionaire Elon Musk acquired the social media company. Three members of Twitter's Trust and Safety Council resigned last week, saying that Musk has undermined the platform's ability to protect its users. In a resignation letter, the former council members say Musk is embracing automated content moderation, relying more on algorithms than on humans to shield users from hate speech and other threats. Two of those former council members are with me now. Eirliani Abdul Rahman and Anne Collier, thanks to you both for being here.

EIRLIANI ABDUL RAHMAN: Pleasure.

ANNE COLLIER: Good to be here.

MARTIN: May I start with you, Eirliani? Can you explain what the Trust and Safety Council does?

RAHMAN: Yeah. We are a body that was invited by Twitter to join and help make sure that the conversations fostered on the platform are healthy.

MARTIN: Why did you not see this council as being an effective way to moderate content on Twitter?

RAHMAN: I think, for me - I mean, I've also talked about this elsewhere - it was really watching the whole negotiation process take place when Elon Musk was negotiating the purchase. At that time, I wrote down some commitments to myself that I would resign should he cross certain red lines. And cross the red lines he did. You know, we've seen data from the Anti-Defamation League and the Center for Countering Digital Hate. The numbers are actually terrifying. In terms of the average number of tweets per day, in the first two weeks antisemitic posts went up by 61%; for slurs against gay men, the corresponding number is 58%. I find that highly unacceptable.

MARTIN: And all this data you're quoting is in the time period since Musk took over?

RAHMAN: Completely correct. And these are the data that my former peers - who are still part of the council - put forward. The other red line that he crossed was when previously banned accounts were reinstated - so, for example, the ones that led up to what happened on January 6 here in the U.S. For me, all of this was highly, highly problematic. We were hoping that our resignations would prompt a rethinking within the council, but also more generally within Twitter headquarters, to reconsider what's happening with content moderation and to make the platform a safer space for the public.

MARTIN: Anne, in your resignation letter, you said Twitter is moving toward automated content moderation. Why is that risky in your view?

COLLIER: Content moderation is very complex and highly nuanced. It's also very contextual. It's very, very hard for algorithms to determine what truly is harmful without any context whatsoever. Human beings are needed to do that. And we know that Twitter's staff has been massively reduced, so Twitter has to rely more on automated content moderation - and that's an announcement directly from Twitter itself.

MARTIN: I mean, do you believe he intends to disband the Trust and Safety Council and replace it with something else, his own appointees?

COLLIER: We don't know. But it's quite possible, because if he didn't form us, then perhaps he wants to be the one to form a new group. It's just really hard to know. You know, we're really seeing progress in the industry. There's more regulatory scrutiny. There are more laws. The companies are responding. But Twitter's doing the exact opposite. And something has to be done about that.

MARTIN: Anne Collier and Eirliani Abdul Rahman, former members of Twitter's Trust and Safety Council. Thanks so much for your time.

RAHMAN: Thank you.

COLLIER: Thanks for having us.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.