MaryGrace Lerin

Facebook Launches New Experiment to Curb the Spread of Divisive Political Content in User Feeds

In response to continued user feedback, Facebook has launched a new experiment that de-emphasizes political posts and current-events updates in user feeds. Facebook is also limiting the amount of political content people see in their News Feeds, which could have a substantial impact on overall engagement both on and off the platform.

According to Axios:


“Moving forward, Facebook will expand some of its current News Feed tests that put less emphasis on certain engagement signals, like the probability that a user will share or comment on a post, in its ranking algorithm. Instead, it will begin placing a higher emphasis on other types of user feedback, like responses to surveys.”


As Axios reports, the change extends a test Facebook began earlier this year, in which it reduced the amount of political content in select users' News Feeds.

Facebook has been running the experiment with a ‘small percentage of users’ in the US since February 17th, in response to concerns about the impact of heated political discussion on the platform.


Facebook says that it now has enough data to suggest that this could be a viable path forward.


“We’ve seen positive results from our tests to address the feedback we’ve received from people about wanting to see less political content in their News Feed. As a result, we plan to expand these tests to Costa Rica, Sweden, Spain and Ireland.”


The experiment also revealed that "certain engagement signals can better indicate what postings individuals find more valuable than others," according to Facebook.


“Based on that feedback, we’re gradually expanding some tests to put less emphasis on signals such as how likely someone is to comment on or share political content. At the same time, we’re putting more emphasis on new signals such as how likely people are to provide us with negative feedback on posts about political topics and current events when we rank those types of posts in their News Feed.”


As a result, the revised algorithm approach, which Facebook is now expanding to those four additional countries in the next test stage, will place less weight on content that elicits negative emotional responses and comments. This addresses a prominent, long-running criticism of Facebook: because not all ‘engagement’ is positive, its algorithm effectively incentivizes controversial argument by amplifying posts that are likely to provoke back-and-forth discussion, often furious and divisive in nature.
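Facebook hasn't published its actual ranking model, so as a purely illustrative sketch of the kind of re-weighting described above, here is a toy scoring function in Python. All of the signal names, weights, and the political-content penalty below are hypothetical assumptions, not Facebook's real formula.

```python
# Hypothetical sketch of the re-weighting described above; Facebook's
# actual News Feed model, signal names, and weights are not public.
from dataclasses import dataclass


@dataclass
class Post:
    p_comment: float    # predicted probability the viewer comments (assumed signal)
    p_share: float      # predicted probability the viewer shares (assumed signal)
    p_negative: float   # predicted probability of negative survey feedback (assumed signal)
    is_political: bool  # whether a classifier flags the post as political


def rank_score(post: Post) -> float:
    """Toy ranking score: for political posts, engagement signals count
    for less, and predicted negative feedback counts against the post."""
    engagement = post.p_comment + post.p_share
    if post.is_political:
        # De-emphasize comment/share likelihood for political content...
        engagement *= 0.5
        # ...and penalize posts likely to draw negative survey feedback.
        return engagement - 2.0 * post.p_negative
    return engagement


# A divisive political post that invites lots of comments can now rank
# below a quieter non-political post once negative feedback is factored in.
divisive = Post(p_comment=0.75, p_share=0.25, p_negative=0.5, is_political=True)
neutral = Post(p_comment=0.25, p_share=0.125, p_negative=0.0, is_political=False)
print(rank_score(divisive))  # -0.5
print(rank_score(neutral))   # 0.375
```

In this toy model, the divisive post's high comment and share probabilities no longer guarantee it top placement: the predicted negative feedback pulls its score below that of the quieter post, which is the dynamic Facebook says it is testing.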


That could be a valuable update, because not all engagement is beneficial. If Facebook uses ‘engagement’ as a general proxy for interest, and seeks to maximize it however it can, regardless of what that engagement actually represents, then it is frequently amplifying content that provokes dispute, simply because the system sees that people are commenting and interacting.


That makes sense in terms of pure interaction and keeping people engaged in the app. However, it poses a problem: it encourages creators and publishers to post more ‘hot takes’ to gain that dopamine rush from the subsequent reactions and alerts, and, from a publishing perspective, more reach and clicks.


People want to feel as if their voices are being heard, and social media gives them that opportunity. However, if you don't say something that attracts attention, such as a provocative statement, a hilarious remark, or an inspirational phrase, your chances of gaining traction and receiving the accompanying buzz of alerts are slim to none.

That's why everyone on social media is a comedian, a life coach, or a political pundit: that's what gets people's attention. And that attention, in a political sense, often gives rise to division, as the algorithms boost such content based on engagement, prompting more people to pick a stance in the debate.


And these are often issues in which many users have no genuine interest, but once you start engaging with a topic, the algorithm shows you more of it, and your Facebook feed soon becomes a dizzying swirl of political turmoil, driven largely by people's craving for attention and the thrill of responding in the app.


This update may help to resolve that by putting less weight on your likelihood of commenting, and by factoring in the growing direct feedback Facebook receives from users who want to see fewer political posts in their feeds.


Mark Zuckerberg, the CEO of Facebook, stated in February:


"One of the top pieces of feedback we're hearing from our community right now is that people don't want politics and fighting to take over their experience on our services. So one theme for this year is that we're going to continue to focus on helping millions more people participate in healthy communities and we're going to focus even more on being a force for bringing people closer together."


This new update is the next step in that effort, and it could have a substantial effect, given the strong response Facebook saw when it reduced political content in user feeds following the US election. That move resulted in what Facebook employees dubbed the "nicer" News Feed, which reduced the intensity of argument and conflict across the board.

Is it possible that this modification will cause a similar shift throughout all of Facebook?

Ultimately, it appears to be a favorable test in either case, but given the scale of the Facebook experiment, some people will lose out.


The changes, according to Facebook, will have a broader impact on public affairs content, and publishers will likely see a drop in traffic as a result.


“Knowing this, we’re planning a gradual and methodical rollout for these tests, but remain encouraged, and expect to announce further expansions in the coming months.”


To put it another way, don't expect a dramatic shift on your Facebook feed anytime soon, but it is the start of a major experiment that could have a substantial impact on how Facebook engagement works.


Will this influence Pages' overall reach? At this point, it appears the change will primarily be focused on political content, which could actually help Pages in other categories, since there may be more room in feeds to fill as a replacement. But it's hard to gauge: Facebook is only just getting started with the next level of testing, so it's difficult to say what the full implications will be.


As a whole, it appears that The Social Network is making a constructive effort, responding to long-standing complaints and moving in a more measured, positive direction.

Because, while Facebook has tried to refute claims that it has helped fuel divisive, negative movements through content amplification, whether knowingly or unwittingly, the overwhelming evidence suggests that it has, and that its News Feed algorithm, which, once again, incentivizes content that generates the most debate and discussion, has altered the incentive structure for publishers, pushing them toward more emotional headlines and reports.


That shift in incentives, with the majority of people having a Facebook account, can have, and arguably has had, a transformative impact on societal perspectives, exacerbating existing divides.

As a result, this might be a big step forward, with implications well beyond Facebook.
