Meta’s Oversight Board has published an in-depth analysis of the contentious cross-check program used by Facebook and Instagram, urging Meta to make the program more transparent and to devote more resources to it.
Cross-check, which gives prominent public figures (including former president Donald Trump before his suspension from Facebook) a separate moderation queue, was criticized by the semi-independent Oversight Board for having “many problems.” The board pointed to instances where prohibited content, in one case non-consensual pornography, was left up for an extended period, and to Meta’s failure to make clear when accounts are protected by special cross-check status. It also faulted Meta for failing to keep moderation statistics that could be used to judge the accuracy of the program’s outcomes.
“While Meta told the board that cross-checks aim to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” the report says. “The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm.”
“It was protecting a limited number of people who didn’t even know that they were on the list.”
The report comes more than a year after The Wall Street Journal publicly revealed details about cross-check. Following its revelations, Meta asked the Oversight Board to evaluate the program, but the board complained that Meta had failed to provide important information about it, like details about its role in moderating Trump’s posts. Today’s announcement follows months of back-and-forth between Meta and the Oversight Board, including the review of “thousands” of pages of internal documents, four briefings from the company, and a request for answers to 74 questions. The resulting document includes diagrams, statistics, and statements from Meta that help illuminate how it organized a multi-layered review program.
“It’s a small part of what Meta does, but I think that by spending this amount of time and looking into this [much] detail, it exposed something a bit more systemic within the company,” Oversight Board member Alan Rusbridger tells The Verge. “I sincerely believe that there are a lot of people at Meta who do believe in the values of free speech and the values of protecting journalism and protecting people working in civil society. But the program that they had crafted wasn’t doing those things. It was protecting a limited number of people who didn’t even know that they were on the list.”
Cross-check is designed to prevent inappropriate takedowns of posts from a subset of users, sending those decisions through a set of human reviews instead of the normal AI-heavy moderation process. Its members (who, as Rusbridger notes, aren’t told they’re protected) include journalists reporting from conflict zones and civic leaders whose statements are particularly newsworthy. It also covers “business partners” that include publishers, entertainers, companies, and charitable organizations.
According to statements from Meta that are quoted in the report, the program favors under-enforcing the company’s rules to avoid a “perception of censorship” or a bad experience for people who bring significant money and users to Facebook and Instagram. Meta says that on average it can take more than five days to make a call on a piece of content. A moderation backlog sometimes delays the decisions even further — at the longest, one piece of content remained in the queue for over seven months.
The Oversight Board has frequently criticized Meta for overzealously removing posts, particularly ones with political or artistic expression. But in this case, it expressed concern that Meta was allowing its business partnerships to overshadow real harm. A cross-check backlog, for instance, delayed a decision when Brazilian soccer player Neymar posted nude pictures of a woman who accused him of rape — and after the post, which was a clear violation of Meta’s rules, Neymar didn’t suffer the typical penalty of having his account deleted. The board notes that Neymar later signed an exclusive streaming deal with Meta.
Conversely, part of the problem is that ordinary users don’t get the same hands-on moderation, thanks to Facebook and Instagram’s massive scale. Meta told the Oversight Board that in October of 2021, it was performing 100 million enforcement actions on content every day. Many of those decisions are automated or given only cursory human review, since the volume is far too large to handle with a purely human-powered moderation system. But the board says it’s not clear that Meta tracks or attempts to analyze the accuracy of the cross-check system compared with ordinary content moderation. If it did, the results could show whether a large share of ordinary users’ content was being inaccurately flagged as violating the rules, or whether Meta was under-enforcing its policies for high-profile users.
“I hope that Meta will hold its nerve.”
The board made 32 recommendations to Meta. (As usual, Meta must respond to the recommendations within 60 days but is not bound to adopt them.) The recommendations include hiding posts that are marked as “high severity” violations while a review is underway, even when they’re posted by business partners. The board asks Meta to prioritize improving content moderation for “expression that is important for human rights,” adopting a special queue for this content that is separate from the one for Meta’s business partners. It also asks Meta to set out “clear, public criteria” for who is included on cross-check lists, and in some cases, like state actors and business partners, to publicly mark that status.
Some of these recommendations, like the public marking of accounts, are policy decisions that likely wouldn’t require significant extra resources. But Rusbridger acknowledges that others — like eliminating the backlog for cross-checking — would require a “substantial” expansion of Meta’s moderation force. And the report arrives amid a period of austerity for Meta; last month, the company laid off around 13 percent of its workforce.
Rusbridger expresses hope that Meta will still prioritize content moderation alongside “harder” technical programs, even as it tightens its belt. “I hope that Meta will hold its nerve,” he says. “Tempting as it is to sort of cut the ‘soft’ areas, I think in the long term, they must realize that’s not a very wise thing to do.”