‘Facebook, once resistant to the notion of being a media company, has to make many more decisions now based on optics and PR.’ Illustration: Matt Kenyon

Facebook has beefed up its ‘oversight board’, but any new powers are illusory

Emily Bell

The social media giant is still trying to navigate controversial content, yet the problem remains the platform itself

Any day now the Facebook Oversight Board, the social media company’s review mechanism for decisions on deletion of content and users, will tell the world whether Donald Trump should remain banned from Facebook indefinitely. Ahead of that decision, Facebook announced an expansion of the Oversight Board’s remit: initially users could only appeal to have content restored after moderators removed it – now the board will also examine appeals over content that has been left up on Facebook or Instagram following a review by moderators.

With the Oversight Board, the uncomfortable decisions about which controversial material to leave up and which to take down are partly removed from the hands of Facebook policymakers, the executive leadership and the low-paid human moderation staff, and given to a panel of respected experts who become the editorial tone-setters for global media. Now Facebook will make decisions on what it keeps and what it deletes in step with the decisions of an elite board, paid and protected by the company but operating “independently”.

The expansion of the board’s power comes at a time when it has barely cleared its throat on operational matters: it has made seven decisions, with a big one pending. Hardly a track record. The shift is not unexpected, though: legal scholar Kate Klonick wrote in the New Yorker that an expansion of this kind would happen in “mid 2021”, but the timing, just ahead of the Trump decision, seems more than coincidental.

The Facebook Oversight Board was set up as a “supreme court”, but its role in refining what can and cannot be left on the site is far more like that of an editorial board at a newspaper. It sits separately from the main business of the organisation – advertising. It shifts focus and responsibility for all key moderation decisions to “experts”. It acts as a taste-making panel for the rest of the company, and as cover for the more granular issues that will continue to dog any social platform. There is not much transparency around the internal machinations between Facebook executives and the Oversight Board.

Facebook’s content decisions are increasingly its brand. In the Oversight Board it has a backstop for the kind of decisions the company cannot scale and smoothly enact across the world: which politicians’ accounts to take down and which to leave up; how to deal with certain persistent harassers who are ruled not to breach policy; whether to take down offensive material in one country but leave it up in another. And, most importantly, it is a mechanism through which to respond to the kind of public pressure that is a drain on management time, provokes congressional hearings, and upsets employees. To date, Facebook’s attempts to produce a “scalable” content moderation strategy for global speech have been a miserable failure, as they were always doomed to be, because speech is culturally sensitive and context specific.

After a decade of denying Facebook was responsible for, or even capable of, making content decisions beyond the broadest sweep of generalised rules, Mark Zuckerberg has shifted the company more into the territory of all historic media powers: making arbitrary decisions on hot topics in step with the prevailing cultural and political forces of the time. Ultimately, there is no other way that Facebook can operate, but accepting the position means abandoning some core beliefs.

Facebook is not a news company – it employs no reporters – but it is a news-driven company. Two years ago, I asked a Facebook executive who was actually responsible for coming in every morning and worrying about the global news cycle, the election pressures, the trending stories, the regional sensitivities. I got a long answer that essentially boiled down to: parts of many departments, led by policy. Any pre-emptive alarm for sensitive situations was, however, unusual at Facebook until the pandemic and the US election changed attitudes.

The case of Sophie Zhang, a former data scientist at Facebook, reads like the stories of many of the whistleblowers who have left the tech industry. Zhang had been hired in January 2018 to work on a new team combating “fake engagement”, but found herself often “emptying the ocean with an eyedropper”. Her concern that the company was not acting quickly or consistently enough to stop politicians abusing the Facebook platform to make themselves appear more popular than they actually were led to frustration, clashes with her superiors and, eventually, Zhang’s dismissal. Her concerns, outlined in a farewell post, were elucidated in her first interview with the Guardian’s Julia Carrie Wong.

What makes Zhang’s testimony so important is not necessarily what Facebook disputes – that it hasn’t dedicated enough resources to the problem of rooting out coordinated inauthentic behaviour – but the undisputed details of how Facebook sets its priorities. Instances of changing policies in response to, or in anticipation of, public reaction or regulatory difficulty point to the fact that Facebook, once resistant to the notion of being a media company, has to make many more decisions now based on optics and PR.

Addressing a Facebook summit on civic integrity in 2020, Zhang told colleagues about her discomfort at the company’s slow response to evidence of banned behaviour from Azerbaijani accounts attacking opposition politicians and independent journalists. “I’ve been told directly by leadership that I should ignore these cases because if they are impactful, we’ll eventually receive PR flak over it and motivate a change,” noted Zhang at the time. “The assumption is that if a case does not receive media attention, it poses no societal risk … What is our responsibility when societal risk diverges from PR risk?”

Within these sentences lies an explanation of how Facebook is slowly and somewhat painfully re-engineering itself, and in doing so forging a template for new media gatekeeping which is not a million miles away from old media gatekeeping. Facebook has found itself repeatedly responding to a press cycle it dominates far more than it would like.

At least some of this shift can be credited to Britain’s former deputy prime minister Nick Clegg, who is now Facebook’s head of global communications. He has brought a mitigation strategy that looks very familiar to anyone with a background in British journalism: one seemingly focused on the creation of a circle of trusted journalists (and non-journalists) who are drip-fed access, with favoured sources given off-the-record briefings; meanwhile, pressure is applied and access restricted to editors and journalists who disappoint. And if you cannot beat the media, you can now at least be the media. In March, Clegg wrote an enormously long piece advancing Facebook’s PR talking points: namely that it is human behaviour, not platform design, that causes political division. To prove it, he cites numerous studies without mentioning that a number of them come from academics and institutions that have received either Facebook funding or privileged access to Facebook data in the past.

The post was not published on the Facebook news blog, or in a Facebook post, but on a separate platform entirely: Medium. Despite such efforts, Clegg’s separation from the platform, like the Oversight Board’s independence, is illusory. Still, it demonstrates a new truth for Facebook: the company is tackling the impact problem first, because its design problem is unsolvable.

  • Emily Bell is director of the Tow Center for Digital Journalism at Columbia University’s Graduate School of Journalism and a Guardian columnist
