Meta’s Oversight Board has issued a comprehensive report on Facebook and Instagram’s controversial cross-check system, calling on Meta to make the program “radically” more transparent and to strengthen its resources.
The semi-independent Oversight Board cited “several flaws” in cross-check, which provides a special moderation queue for high-profile public figures, including former President Donald Trump before his suspension from Facebook. It pointed to Meta’s failure to make clear when accounts are protected by special cross-check status, as well as cases where rule-breaking material — particularly one case of non-consensual pornography — was left up for an extended period. And it criticized Meta for not tracking moderation statistics that could assess the accuracy of the program’s results.
“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the report said. “The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be quickly removed to remain up for a longer period, potentially causing harm.”
“It protected a limited number of people who didn’t even know they were on the list.”
The report comes more than a year after The Wall Street Journal made details of cross-check public. Following its revelations, Meta asked the Oversight Board to evaluate the program, but the board complained that Meta had failed to provide important information about it, like details about its role in moderating Trump’s posts. Today’s report follows months of back-and-forth between Meta and the Oversight Board, including the review of “thousands” of pages of internal documents, four briefings from the company, and a request for answers to 74 questions. The resulting document includes diagrams, statistics, and statements from Meta that help illuminate how it organized a multi-layered review program.
“It’s a small part of what Meta does, but I think by spending so much time and looking into this [much] detail, it exposed something that’s a bit more systemic within the company,” Oversight Board member Alan Rusbridger told The Verge in an interview. “I sincerely believe that there are a lot of people at Meta who do believe in the values of free speech and the values of protecting journalism and protecting people working in civil society. But the program they had crafted wasn’t doing those things. It was protecting a limited number of people who didn’t even know they were on the list.”
Cross-check is designed to prevent inappropriate takedowns of posts from a subset of users by sending those decisions through a set of human reviews instead of the normal AI-heavy moderation process. Its members (who, as Rusbridger notes, are not told they’re protected) include journalists reporting from conflict zones and civic leaders whose statements are particularly newsworthy. It also covers “business partners,” including publishers, entertainers, companies, and charitable organizations.
According to Meta statements cited in the report, the program favors under-enforcing the company’s rules to avoid a “perception of censorship” or a bad experience for people who bring significant money and users to Facebook and Instagram. Meta says it can take more than five days on average to reach a verdict on a piece of cross-checked content. A moderation backlog sometimes delays decisions even further – at its longest, one piece of content remained in the queue for over seven months.
The Oversight Board has regularly criticized Meta for overzealously removing posts, particularly ones with political or artistic expression. But in this case, it expressed concern that Meta was letting its business partnerships overshadow real harm. A cross-check backlog, for instance, delayed a decision when Brazilian footballer Neymar posted nude pictures of a woman who had accused him of rape – and after the incident, despite the post being a clear violation of Meta’s rules, Neymar didn’t suffer the usual penalty of having his account deleted. The board notes that Neymar later signed an exclusive streaming deal with Meta.
Conversely, part of the problem is that ordinary users don’t get the same hands-on moderation, thanks to the sheer scale of Facebook and Instagram. Meta told the Oversight Board that, as of October 2021, it was performing about 100 million enforcement actions on content every day. Many of these decisions are automated or given very cursory human review, because the volume is far too large for a purely human-driven moderation system to handle. But the board says it’s unclear whether Meta tracks or tries to analyze the accuracy of the cross-check system compared with ordinary content moderation. If it did, the results could indicate that a lot of ordinary users’ content was probably being inaccurately flagged as violating the rules, or that Meta was under-enforcing its policies for high-profile users.
“I hope Meta will keep its spirits up.”
The board made 32 recommendations to Meta. (As usual, Meta must respond to the recommendations within 60 days but isn’t obligated to adopt them.) The recommendations include hiding posts that are flagged as “very serious” violations while a review is underway, even when they’re posted by the company’s business partners. The board asks Meta to prioritize improving content moderation for “expression that is important for human rights,” adopting a dedicated queue for this content that is separate from Meta’s business partners. And it asks Meta to set out “clear, public criteria” for who is on cross-check lists — and in some cases, like government actors and business partners, to publicly mark that status.
Some of these recommendations, like the public marking of accounts, are policy decisions that likely wouldn’t require significant extra resources. But Rusbridger acknowledges that others — like eliminating the cross-check backlog — would require a “substantial” expansion of Meta’s moderation capacity. And the report arrives during a period of austerity for Meta; last month, the company laid off around 13 percent of its workforce.
Rusbridger expresses hope that Meta will still prioritize content moderation alongside “harder” technical programs, even as it tightens its belt. “I hope Meta will keep its spirits up,” he says. “As tempting as it is to sort of cut the ‘soft’ areas, I think in the long term they must realize that’s not a very wise thing to do.”