MENLO PARK, Calif. — Ten months after Facebook announced the creation of the "Oversight Board," an international entity charged with making decisions about free speech and content moderation on its platforms, including Instagram, one of the board's 20 original members has admitted that the quasi-judicial body is “frustrated” and has yet to gain access to information about the company’s curation algorithm.
He also suggested that the board might want to create an area called “the sin bin” for content Facebook and Instagram find objectionable.
Alan Rusbridger, the former editor of liberal Anglo-American news organization The Guardian, told the U.K. House of Lords' communications and digital committee on Tuesday that “we’re already a bit frustrated by just saying ‘take it down’ or ‘leave it up,’” referring to a high-profile review submitted to the board concerning former U.S. President Donald Trump and his followers.
According to Rusbridger, the Oversight Board, whose creation Facebook announced without much fanfare last May while the world was preoccupied with adapting to the new challenges of post-COVID-19 reality, felt limited by its mandate to “simply assess Facebook’s decisions to remove or retain content.”
“What happens if you want to make something less viral?” asked Rusbridger. “What happens if you want to put up an interstitial? What happens if, without commenting on any high-profile current cases, you didn’t want to ban someone for life but wanted to put them in a ‘sin bin’ so that if they misbehave again you can chuck them off?”
It is unclear how Rusbridger’s “sin bin” proposal would fit into the currently known content-moderation structure of Facebook and other social media platforms.
“These are all things that the board may ask [of] Facebook in time,” Rusbridger added. “But we have to get our feet under the table first, and prove that we can do what we want. At some point we’re going to ask to see the algorithm, I feel sure, whatever that means. Whether we’ll understand when we see it is a different matter.”
Admitting Ignorance About How Platforms Work
Rusbridger also told the House of Lords that these changes in the way the Oversight Board operates would come only after it expanded to 40 members.
The Oversight Board, named simply that rather than "Facebook's Oversight Board," is funded by Facebook, which appointed its four co-chairs. Those co-chairs then appointed 20 members, with 20 more yet to be appointed almost a year after the board's formation.
The board, according to the Guardian, is "in the process of finding another 20 board members without Facebook’s direct involvement."
"I think we need more technology people on the board who can give us independent advice from Facebook,” Rusbridger told the British Lords, further admitting the lack of technical expertise in the current body assembled by Facebook. “Because it is going to be a very difficult thing to understand how this artificial intelligence works.”
“People say to me, ‘Oh, you’re on the board, but it’s well-known that the algorithms reward emotional content that polarizes communities because that makes it more addictive.’ Well, I don’t know if that’s true or not,” Rusbridger candidly admitted.
“As a board we’re going to have to get to grips with that. Even if that takes many sessions with coders talking very slowly so that we understand them, I think we need to understand what these machines are,” he concluded.
A Self-Regulating Group
Last October 23, shortly before the U.S. presidential election, the non-governmental, quasi-judicial entity declared that “from today, if your content is removed from Facebook or Instagram and you have exhausted the company’s appeal process, you can challenge this decision by appealing to the Oversight Board.”
“Similarly, Facebook can now refer cases for a decision about whether content should remain up or come down. In the coming months you will also be able to appeal to the Board about content you want Facebook to remove,” the Oversight Board announced.
The announcement came with a flowchart explaining how this self-regulating group of people would be reviewing cases, including content and account deletions by the third-party moderators that Facebook and Instagram use.
Paraphrasing the "Spider-Man" comic books, Facebook’s VP of Global Affairs and Communications (and former U.K. MP) Nick Clegg explained that “with our size comes a great deal of responsibility and while we have always taken advice from experts on how to best keep our platforms safe, until now, we have made the final decisions about what should be allowed on our platforms and what should be removed. And these decisions often are not easy to make — most judgments do not have obvious, or uncontroversial, outcomes and yet many of them have significant implications for free expression.”
Complaints about unfair deletions of content or accounts on Facebook or Instagram can be submitted through the Oversight Board's website.
Main Image: Alan Rusbridger (Photo: The Guardian)