(Reuters) - Facebook Inc on Wednesday announced the first 20 members of its independent oversight board, which can overrule the company’s own content moderation decisions. Here are some key facts about how the board will work:
The board, which some have dubbed Facebook’s “Supreme Court,” will rule on whether some individual pieces of content should be displayed on the site. It can also recommend changes to Facebook’s content policy, based on a case decision or at the company’s request.
At first, the board will review posts, videos, photos and comments that the company has decided to remove from Facebook or its photo-sharing site Instagram, but eventually it will handle cases where content was left up.
This could be content involving issues such as nudity, violence or hate speech. Facebook has said the board’s remit will in future include ads, groups, pages, profiles and events, but has not given a time frame.
It will not deal with Instagram direct messages, Facebook’s messaging platforms WhatsApp and Messenger, its dating service or its Oculus virtual reality products.
Facebook expects the board will initially take on only “dozens” of cases, a small percentage of the thousands it expects will eventually be brought to the board. In 2019, users appealed more than 10 million pieces of content that Facebook removed or took action on.
But Facebook’s head of global affairs, Nick Clegg, told Reuters he thought the cases chosen would have a wider relevance to patterns of content disputes.
The board will decide which cases it reviews, which can be referred either by a user who has exhausted Facebook’s normal appeals process or by Facebook itself for cases that might be “significant and difficult.”
Users who disagree with Facebook’s final decision on their content will have 15 days to submit a case to the board through the board’s website.
Each case will be reviewed by a panel of five members, at least one of whom will come from the geographic region where the case originated. The panel can consult subject matter experts to help make its decision, which must then be finalized by the whole board.
The board’s case decision - which is binding unless it could violate the law - must be made and implemented within 90 days, though Facebook can ask for a 30-day expedited review for exceptional cases, including those with “urgent real-world consequences.”
Users will be notified of the board’s ruling on their case and the board will publicly publish the decision.
When the board gives policy recommendations, Facebook will provide public updates and publish a response to the guidance, along with any follow-on action, within 30 days.
For more details on the board’s operations, see Facebook’s proposed bylaws.
The board will eventually have about 40 members.
Facebook chose the four co-chairs - former U.S. federal circuit judge Michael McConnell and constitutional law expert Jamal Greene from the United States, Colombian attorney Catalina Botero-Marino and former Danish Prime Minister Helle Thorning-Schmidt - who then jointly selected the other 16 members named so far.
Some were sourced from the global consultations conducted by Facebook to obtain feedback on the oversight board.
The members, who will be part-time, so far include constitutional law experts, civil rights advocates, academics, journalists, a Nobel Peace Prize laureate and a former judge of the European Court of Human Rights.
The members will be paid by a trust that Facebook has created and will serve three-year terms for a maximum of nine years.
The trustees can remove a member before the end of their term for violating the board’s code of conduct, but not for content decisions.
Thomas Hughes, former executive director of the freedom of expression rights group Article 19, has been appointed to oversee the board’s full-time administrative staff.
Reporting by Elizabeth Culliford in Birmingham, England; Editing by Matthew Lewis