Facebook will review content policies after employee outrage
Following a tumultuous week, Facebook CEO Mark Zuckerberg said the company plans to review its “products and policies” after employees very publicly revolted over the decision to leave up a post by President Donald Trump.
Zuckerberg said in a post on Facebook that the decision to examine its policies, which allowed Trump’s “when the looting starts, the shooting starts” post to stay up, came after “feedback from employees, civil rights experts, and subject matter experts internally.”
He said Facebook will review policies covering “threats of state use of force” as well as how to handle posts that may incite voter suppression — two topics for which the social media company has faced fierce criticism since 2016’s Cambridge Analytica scandal.
Zuckerberg’s post comes just one week after he defended Facebook’s decision to leave Trump’s inflammatory post up, even after Twitter labeled and restricted it for “glorifying violence.” In the days that followed, Facebook employees expressed outrage over Zuckerberg’s refusal to remove the post, and more than 5,000 staged a virtual walkout.
The Facebook founder’s Friday post included a list of seven areas of policy the company plans to review, including “initiatives to advance racial justice and voter engagement.”
“[…] We’re going to review whether we need to change anything structurally to make sure the right groups and voices are at the table,” he wrote. “Not only when decisions affecting a certain group are being made, but when other decisions that may set precedents are being made as well. I’m committed to elevating the representation of diversity, inclusion, and human rights in our processes and management team discussions, and I will follow up soon with specific thoughts on how we can structurally improve this.”
He also said Facebook will make its decision-making process more transparent — though he did not clearly outline how — and will reconsider the company’s limited options (leave a post up, or take it down) for content that violates, or “partially” violates, community guidelines, as with Trump’s post last week.
“I know many of you think we should have labeled the President’s posts in some way last week,” wrote Zuckerberg. “Our current policy is that if content is actually inciting violence, then the right mitigation is to take that content down — not let people continue seeing it behind a flag. There is no exception to this policy for politicians or newsworthiness.”
One notable topic seemingly left off Zuckerberg’s to-do list? Misinformation. Because of Facebook’s lax guidelines, false information has often spread across the platform faster than on other social media sites. Take, for example, the coronavirus conspiracy video “Plandemic,” which stayed up on the site for over a week before it was taken down.