Elon Musk has only just taken control of Twitter, but he is already making sweeping changes. Musk fired several key executives on day one, including Twitter CEO Parag Agrawal, but in a new tweet he claims he will slow down when it comes to making content moderation decisions.
Musk hasn’t said much since he took over Twitter, but he will apparently form some sort of policy advisory body to oversee content moderation decisions. Musk said the group will reflect “different points of view,” though we’ll have to wait and see what that means in practice. “To be clear, we have not made any changes to Twitter’s content moderation policy yet,” Musk tweeted later Friday night.
Importantly, Musk says he will not make any major decisions or reinstate any banned accounts (including, presumably, former President Donald Trump) until the council is established. Because it’s Musk, that could happen in a matter of hours or not at all; it’s hard to say. Hours later, Musk undercut his own claims of a formalized policy decision-making system by using his own authority to make a major content moderation call.
Responding to the daughter of controversial right-wing academic and self-help author Jordan Peterson, Musk made the sweeping claim that “anyone suspended for minor and questionable reasons will be released from Twitter jail.” Peterson’s Twitter account was restricted after he went on a transphobic rant about actor Elliot Page earlier this year, so in this context, tweeting hate about a trans person and his doctor counts as “minor and questionable” in Musk’s book, apparently.
On Thursday, Musk also fired Vijaya Gadde, a respected top policy executive who helped the company navigate complex legal and moderation issues for more than 11 years. Getting rid of Gadde signaled that a new era of different decision-making is dawning, for better and for worse.
The tweet is probably intended mostly as balm for skittish advertisers wary of Musk immediately turning the platform into an all-encompassing mess of harassment, hatred and misinformation. While Twitter arguably already fits that description even with its current degree of moderation, advertisers are watching for major shifts in the type of content allowed on the platform and how those shifts could adversely affect their brands.
Musk may think this is an original idea, but Twitter already has a trust and safety council advising its product and policy decisions. The council (which, yes, is already called a council) initially consisted of 40 organizations and experts who advised the company on challenging policy areas. That group had more of an advisory role and, unlike Meta’s Oversight Board, was not designed to make binding decisions.
First announced in 2016, the council was expanded by Twitter in 2020 into groups dedicated to specific difficult topics, including safety and online harassment, digital rights, child sexual exploitation, and suicide prevention. “A lot of what we are currently doing, such as ongoing meetings with NGOs, activists and other organizations, is always part of our process, but we haven’t done enough to share that externally,” Twitter wrote at the time.
It’s possible that Musk has something more like Meta’s Oversight Board in mind when it comes to making content moderation decisions, but everything, from who would serve on a hypothetical board to the nature of the group’s power, remains up in the air.