Meta Introduces a New Fact-Checking Mentorship Program to Help Reduce the Negative Effects of Online Misinformation
Meta, Facebook's parent company, has announced a new fact-checking mentorship program, developed in collaboration with the Poynter Institute's International Fact-Checking Network (IFCN), to help fact-checking organizations scale their operations and increase their impact.
According to Meta:
“Reducing the spread of misinformation is a challenge that no single organization can tackle alone. Strong partnerships with subject matter experts and sharing information on best practices plays a big role in effectively addressing misinformation. That’s why, as part of this global mentorship program, the nonpartisan IFCN will select up to 6 experts from the fact-checking industry to serve as mentors for up to 30 organizations in Meta’s third-party fact-checking program.”
The initiative will receive $450,000 in funding from Meta, which will be used to strengthen fact-checking processes through shared education, with a particular focus on helping more organizations in more regions combat harmful misinformation trends.
Facebook, in particular, has come under fire for its role in spreading misinformation, with the recent Facebook Files leak highlighting the impact that its News Feed algorithm can have on content amplification, through a range of incentives and engagement-driving mechanisms that can surface more controversial content.
Meta has denied that its platform is to blame for the surge in divisiveness, with Nick Clegg, Vice President of Global Affairs and Communications at Meta Platforms, explaining:
“The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.”
Even still, it's hard to argue that Facebook hasn't exacerbated polarization, especially when you look at the top 10 most-shared link posts on the app each day.
Facebook's algorithms are built around engagement, and the content that gets the most engagement is the content that elicits an emotional response, with rage and joy being the most compelling.
As a result, it's possible that Facebook's own systems are fundamentally designed to encourage such engagement, for commercial benefit. That's why any effort to counteract misinformation is critical: we all have differing viewpoints, but reports that distort those viewpoints with lies and falsehoods only undermine democracy as a whole.
As such, this new project could be an important step, and as Meta expands the reach of Facebook and Instagram into new regions, it will need to pair that growth with expanded efforts to combat localized misinformation in order to mitigate negative impacts.