Inside Meta’s Oversight Board: 2 Years of Pushing Limits
Ultimately, the issue came up in a March 2022 meeting with Clegg, who seemed taken aback by the board members’ frustration. He promised to break the logjam, and a few weeks later the board finally got the tool it should have had from the start. “We had to fight them to get it, which was baffling,” says Michael McConnell, a Stanford law professor who is one of the board’s cochairs. “But we did it.”
No sooner had that skirmish been resolved than another incident roiled the waters. When Russian troops invaded Ukraine in February 2022, Facebook and Instagram were quickly overwhelmed with questionable, even dangerous content. Posts promoting violence, such as “death to the Russian invaders,” were in clear violation of Meta’s policies, but banning them might suggest the company was rooting for those invaders. In March, Meta announced that in this narrow instance, it would temporarily allow such violent speech. It turned to the board for backup and asked for a policy advisory opinion. The board accepted the request, eager to ponder the human rights conundrum involved. It prepared a statement and set up appointments to brief reporters on the upcoming case.
But just before the board announced its new case, Meta abruptly withdrew the request. The stated reason was that an investigation might put some Meta employees at risk. The board formally accepted the explanation but blasted it in private meetings with the company. “We made it very clear to Meta that it was a mistake,” says Stephen Neal, the chair of the Oversight Board Trust, who noted that if safety were indeed the reason, that would have been apparent before Meta requested the policy advisory opinion.
When I asked whether Neal suspected that the board’s foes wanted to prevent its meddling in a hot-button issue, he didn’t deny it. In what seemed like an implicit return blow, the board took on a case that addressed the very issues raised by Meta’s withdrawn advisory opinion. It involved a Russian-language post from a Latvian user that showed a body, presumably dead, lying on the ground and quoted a famous Soviet poem that reads, “Kill the fascist so he will lie on the ground’s backbone … Kill him! Kill him!”
Board members have noticed the mixed feelings inside Meta. “There are plenty of people in the company for whom we’re more of an irritation,” McConnell says. “Nobody really likes people looking over their shoulders and criticizing.”
Since the board members are accomplished people who were probably chosen in part because they aren’t bomb throwers, they’re not the type to declare outright war on Meta. “I don’t approach this job thinking that Meta is evil,” says Alan Rusbridger, a board member and former editor of The Guardian. “The problem that they’re trying to crack is one that nobody on earth has ever tried to do before. On the other hand, I think there has been a pattern of dragging them screaming and kicking to give us the information we’re seeking.”
There are worse things than no information. In one case, Meta gave the board the wrong information—which may soon lead to its most scathing decision yet.
During the Trump case, Meta researchers had mentioned to the board a program called Cross Check. It essentially gave special treatment to certain accounts belonging to politicians, celebrities, and the like. The company characterized it to the board as a limited program involving only “a small number of decisions.” Some board members saw it as inherently unfair, and in their recommendations in the Trump case, they asked Meta to compare the error rates in its Cross Check decisions with those on ordinary posts and accounts. Basically, the members wanted to make sure this odd program wasn’t a get-out-of-jail-free card for the powerful.