Facebook’s external body of decision makers will begin reviewing cases today about what stays on the platform and what goes.
The new system will elevate some of the platform’s content moderation decisions to a new group called the Facebook Oversight Board, which will rule on individual cases and shape precedents for what kind of content should and shouldn’t be allowed.
According to Facebook, anyone who has appealed an eligible content moderation decision on Facebook or Instagram and has already gone through the normal appeals process will get a special ID that they can take to the Oversight Board website to submit their case.
Facebook says the board will decide which cases to consider, pulling from a combination of user-appealed cases and cases that Facebook will send its way. The full slate of board members, announced in May, grew out of an initial group of four co-chairs that Facebook itself named. The international group of 20 includes former journalists, U.S. appeals court judges, digital rights activists, a former prime minister of Denmark and one member from the Cato Institute, the libertarian think tank.
But as we’ve reported previously, the board’s decisions won’t just magically enact changes on the platform. Instead of setting policy independently, each recommended platform policy change from the oversight board will get kicked back to Facebook, which will “review that guidance” and decide what changes, if any, to make.
The oversight board’s decisions on specific cases will be binding, but that doesn’t mean they’ll necessarily be generalized out to the social network at large. Facebook says it is “committed to enforcing the Board’s decisions on individual pieces of content, and to carefully considering and transparently responding to any policy recommendations.”
The group’s focus on content taken down, rather than content already allowed on the social network, will also skew its purview. While a vocal subset of its conservative critics in Congress might disagree, Facebook’s real problems are about what stays online, not what gets taken down. Whether it’s violent militias connecting and organizing, political figures spreading lies about voting or misinformation from military personnel fueling targeted violence in Myanmar, content that spreads on Facebook has the power to reshape reality in extremely dangerous ways.
Noting the criticism, Facebook claims that decisions about content still up on Facebook are “very much in scope from Day 1” because the company can directly refer those cases to the Oversight Board. But with Facebook itself deciding which cases to elevate, that’s another major strike against the board’s independence from the outset.
Facebook says that the board will focus on reviewing content removals initially because of the way its existing systems are set up, but it aims “to bring all types of content outlined in the bylaws into scope as quickly as possible.”
“We expect them to make some decisions that we, at Facebook, will not always agree with – but that’s the point: they are truly autonomous in their exercise of independent judgment,” the company wrote in May.
Critics disagree. Facebook skeptics from every corner have seized on the oversight effort, calling it a charade and arguing that it doesn’t have as much power as Facebook would like people to think.
Facebook was not happy when a group of prominent critics calling itself the “Real Facebook Oversight Board” launched late last month. And earlier this year, a tech watchdog group called on the board’s five U.S.-based members to demand real power or resign.
Facebook also faced a backlash when it said the Oversight Board, which has been in the works for years, wouldn’t be up and running until “late fall.” But with just weeks to go before Election Day, Facebook has scrambled to put new policies and protections in place on issues it has dragged its feet on for years, the Oversight Board apparently among them.