In November, Facebook announced its plans to create an external content oversight board that would serve as something of a “supreme court” for Facebook’s more controversial content policy decisions. With the release of the draft charter in January, the company took the first steps toward describing how this content review board would operate. Today, Facebook is opening a public consultation process to help it answer more questions around the oversight board’s design.
Over the next six weeks, Facebook says it will accept submissions from the general public about its plans.
Public submissions will be divided into two parts: a questionnaire and open-ended questions.
The latter will focus on gathering input around membership, case decisions and governance. The questionnaire portion, meanwhile, is a much simpler survey where respondents are asked to vote on various aspects (many already detailed in the draft charter), such as how many total members the board should have, how long they should serve, how they should gain their positions and what their makeup should be in terms of background, professional experience, views and diversity, among other things.
It also asks the public to weigh in on how the oversight board will make decisions on cases, with questions about how its rulings will impact policy, what subject matter experts it can consult, whether it should review written statements from those a case affects, the precedent set by prior rulings and more.
Some of these questions are simple to answer, while others may give survey respondents pause.
For example, one asks whether it’s more important for board members to dedicate more time and research to each case or for the board to prioritize issuing more decisions per year.
While certainly all cases reviewed should be well-researched, if you believe Facebook’s board should resemble the U.S. court system, then there should be guidance around how long its board members have to make a decision. Otherwise, some of the most serious content policy decisions could get tied up in never-ending deliberations, with board members citing a need for “more study” before ruling. That wouldn’t be fair to those whose content is held hostage in the meantime. But the question doesn’t allow for this degree of nuance, so you’d have to turn to the essay portion to share this position.
However, anyone can take the survey section of the questionnaire and elect to skip the essay portion if they don’t have more to add.
Facebook says it has partnered with the firm Baker McKenzie, which will help it review the information submitted. The responses will be summarized in a report to be released in June.
For Facebook, the launch of an independent review board allows the company to further distance itself from controversial policy decisions.
As we noted in January, decision-making around content removals is an area where Facebook has repeatedly and publicly failed, with disastrous consequences. The company has been widely criticized for how it handled issues like the calls to violence that led to genocide in Myanmar and riots in Sri Lanka; election interference from state-backed actors from Russia, Iran and elsewhere; its failure to remove child abuse posts in India; the weaponization of Facebook by the government in the Philippines to silence its critics; Facebook’s approach to handling Holocaust denial or conspiracy theorists like Alex Jones; and more.
The board’s creation will not only take the pressure off Facebook to make these decisions, it’s also an admission of sorts that Facebook agrees it’s no longer able to handle this level of responsibility on its own.
Those interested in sharing their own thoughts around the review board can go here to take the survey.