The co-chairman of the independent board charged with refereeing Facebook’s censorship decisions says the company prevents it from scrutinizing restrictions on the distribution of posts and other actions that are central to censorship on the platform.
Facebook’s parent company, Meta, allows the board to review decisions involving posts but won’t let the board review the distribution of content, actions taken against individual accounts or fact-checking, said Michael McConnell, co-chair of Meta’s oversight board.
The board is tasked with reviewing Meta’s “content moderation,” which Mr. McConnell described as a synonym for censorship.
Mr. McConnell, a former federal judge, said the 20-member board’s jurisdiction has increased since it started work in 2020 and he wants the portfolio to expand further.
“I bet I speak for everyone: That is something we should be able to look at because a downgrading of reach may not be total silencing, but it is in the same family,” Mr. McConnell said in an interview. “There are the same reasons why it would make sense to have an independent oversight of that.”
Downgrading the reach of content is key to Facebook’s approach to restricting speech it dislikes.
In one of the most publicized examples, Twitter blocked linking to reports by the New York Post and others about emails and documents found on Hunter Biden’s laptop computer. The restrictions effectively cut off the distribution of news that was potentially damaging to President Biden ahead of the 2020 election.
Facebook throttled the distribution of the news and touted its fact-checking program.
Whether Meta will allow the oversight board to review the downgrading of content's reach on Facebook and Instagram is an open question. Meta declined to comment on the record for this article.
The board’s bylaws state that its activities “will grow and change,” but Meta largely calls the shots on what the board may access. Last year, Facebook expanded the board’s scope by permitting it to review content that remained on the company’s platforms, in addition to appeals over content that was taken down.
Much of the board’s understanding of Facebook’s operations comes from external sources. Mr. McConnell said he learned Facebook was restructuring its business as Meta from a newspaper, and he said he has not talked to Meta CEO Mark Zuckerberg since joining the board.
The two complaints he hears most often about Facebook concern accounts shut down for arbitrary reasons and the platform’s fact-checking, he said.
Mr. McConnell said the board negotiates with Meta to determine the boundaries of what it may review. He said those negotiations often include Meta Vice Presidents Nick Clegg, a former British deputy prime minister and leader of the United Kingdom’s Liberal Democrats, and Brent Harris, a California lawyer known for championing climate change causes. Mr. Harris, as Meta’s director of global affairs, helped create the oversight board.
Whether restrictions on digital content’s reach violate the spirit of freedom of speech in America is also a hot topic in congressional debates about internet regulation.
The board’s most prominent decision in the U.S. so far is its review of Facebook’s ban of former President Donald Trump. The board upheld the ban in May but directed the company to undertake an additional review because, the board said, an indefinite suspension was inappropriate.
Facebook responded by extending Mr. Trump’s ban until at least 2023.
Mr. McConnell said Facebook’s two-year timeline for potentially allowing Mr. Trump’s return is an important difference from platforms that have implemented permanent and indefinite bans. Twitter has permanently banned Mr. Trump, and Google-owned YouTube has not defined a specific timeline for allowing the former president’s return.
Alongside efforts to get Facebook and Instagram to implement its decisions, the board is working to win the support of users and stakeholders around the world. Oversight board spokesman Dex Hunter-Torricke said he visited the Embassy of Spain in Washington and had conversations with European Union representatives about speech and content moderation.
“We are a minnow swimming in a much larger stream where there are lots of other people trying to find solutions to problems of what ails Meta and content moderation,” Mr. Hunter-Torricke said. “And our value in a lot of cases is going to be lifting the hood on Facebook and Instagram in each case, finding out information about the company, diagnosing those problems and then also allowing other actors to get involved.”
• Ryan Lovelace can be reached at rlovelace@washingtontimes.com.