The Wall Street Journal's 'Facebook Files' report, a multi-part investigation based on leaked internal Facebook documents, has raised a whole new set of issues for the social media giant.
Among the issues raised in the investigation is the revelation that Facebook holds celebrities and other high-profile users to a different standard, with a separate moderation team double-checking their posts and updates and, in some cases, leaving up content that would have been removed had it come from regular users.
Facebook initially defended this double-checking procedure, claiming that it ensures the correct decision is made “so that [posts] are not erroneously removed or left up,” while also denying that the process gives these Pages any special treatment.
Still, Facebook acknowledges that the process isn't flawless, and this week it referred the procedure to its independent Oversight Board in an effort to find a better way to assess such content in the future.
According to Facebook:
“Facebook reviews billions of pieces of content every day, has 40,000 people working on safety and security, and has built some of the most sophisticated technology to help with content enforcement. Despite that, we know we are going to make mistakes. The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk for a mistake. This could include activists raising awareness of instances of violence, journalists reporting from conflict zones or other content from high-visibility Pages and profiles where correct enforcement is especially important given the number of people who could see it.”
So, according to Facebook, the procedure exists to prevent high-profile, high-impact errors via a secondary check — not to give celebrities more freedom to publish whatever they want.
Facebook claims that it is constantly working to improve this process, with input from the Oversight Board helping to refine it.
“Holding Facebook accountable for our content policies and processes is exactly why the Oversight Board was established. Over the coming weeks and months, we will continue to brief the board on our cross-check system and engage with them to answer their questions.”
In response to the WSJ investigation, the Oversight Board requested that Facebook disclose more information about the cross-check process:
“At the Oversight Board, we have been asking questions about cross-check for some time. In our decision concerning former US President Donald Trump’s accounts, we warned that a lack of clear public information on cross-check and Facebook’s ‘newsworthiness exception’ could contribute to perceptions that Facebook is unduly influenced by political and commercial considerations.”
That refers to Facebook's decision not to take action on comments made by former President Trump, on the grounds that they were newsworthy and relevant to the community.
In fact, Facebook CEO Mark Zuckerberg emphasized this approach in a 2019 speech at Georgetown University, which sparked the initial controversy on this front:
“We don’t fact-check political ads. We don’t do this to help politicians, but because we think people should be able to see for themselves what politicians are saying. And if content is newsworthy, we also won’t take it down even if it would otherwise conflict with many of our standards.”
Facebook typically errs on the side of free speech, but recent events have forced a rethink of that stance, as well as a broader reassessment of Facebook's role in communication and its obligations in this regard.
That's why the findings about Facebook's cross-check process are notable: they appear to align with the company's stated desire to allow more content to be shared in its apps while avoiding having to police it on its own.
In general, Facebook believes there should be some form of official regulation of the social media sector, and that platforms should not be required to set such guidelines on their own. That's probably the preferable path — but aside from Facebook's own efforts, which it uses to demonstrate the need for such regulation, there's been no movement toward establishing an independent oversight body.
In an ideal scenario, another regulatory body would take such judgments out of Facebook's hands, but for now, it falls to Facebook — and to each social network — to decide what is and isn't acceptable, along with the specific restrictions that go with it.
Given that these are independent businesses answerable to their shareholders, that doesn't seem like the best arrangement, especially as their influence grows by the day.
Independent oversight appears to be the inevitable answer, but regional differences and other complications present serious challenges in this regard.
That's why Facebook created its own Oversight Board, and why it's now referring such decisions to it — easing the burden on its own staff while also offloading responsibility.
While that may look like an easy way out in some respects, it's actually an example of where we should be heading.
Whether you agree with it or not, Facebook's approach may be the best way to address its various concerns.