Facebook plays a significant role in feeding political division
It's impossible to deny Facebook's influence on rising political division, and on societal division more broadly. But how much, exactly, does Facebook contribute to that division, and what are the subsequent impacts on voting behavior?
At a recent conference in Munich, Facebook CEO Mark Zuckerberg played down Facebook's responsibility in this respect, saying that:
"People are less likely to click on things and engage with them if they don't agree with them. So, I don't know how to solve that problem. That's not a technology problem as much as it is a human affirmation problem."
Zuckerberg also noted that users are exposed to a more diverse set of perspectives on Facebook than they previously were through traditional media, so Facebook, in his view, is not necessarily to blame for exacerbating division or fueling echo chambers, as some have suggested.
But still, even based on these statements, Facebook is contributing to division and political angst.
As Andrew Bosworth, who formerly led Facebook's ads division, noted earlier this year, while Facebook does expose users to more perspectives, the company's internal findings suggest that's not necessarily a good thing:
"The internet exposes them to far more content from other sources (26% more on Facebook, according to our research). This is one that everyone just gets wrong. The focus on filter bubbles causes people to miss the real disaster which is polarization. What happens when you see 26% more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more."
Essentially, both Bosworth and Zuckerberg acknowledge a key problem here: that Facebook does exacerbate political division through greater exposure to a broader spread of news content. Facebook might look to play this down by putting the onus back on users and noting that what they read is their choice. But the facts are pretty clear - Facebook is knowingly exposing users to more content that causes angst. And that, once again, has been reflected in new statements and reports this week.
In a recent interview with NBC, Instagram chief Adam Mosseri made an interesting observation about Instagram engagement in contrast to Facebook:
"We have been able to learn from some of Facebook's mistakes. People feel a little bit better about their time on Instagram, probably because it's a bit more focused on things that are less contentious."
So, again, Facebook's executives are aware of the negative impacts that Facebook usage can have, and that those impacts are less pronounced on other platforms. That being the case, why doesn't Facebook work to address it? If Facebook's team knows that increased exposure to polarizing news content leaves users feeling less happy, why doesn't it look to revise its algorithm accordingly?
Again, Facebook, as per Zuckerberg's statement above, might look to put the onus on users and say that this is a 'human affirmation problem', but surely Facebook could exert some influence here. Surely the algorithm could be adjusted in line with these findings to build a more positive user experience and reduce political friction.
Indeed, a new study published by OpenX has found that users are feeling increasingly less happy about the time they spend on Facebook.
Surely that's a concern, and an area that Facebook could improve upon, if it wanted to.
So why wouldn't it?
This finding, from a 2010 study into what makes content more shareable online, could be relevant:
"The results suggest a strong relationship between emotion and virality: affect-laden content - regardless of whether it is positive or negative - is more likely to make the most emailed list. Further, positive content is more viral than negative content; however, this link is complex. While more awe-inspiring and more surprising content are more likely to make the most emailed list, and sadness-inducing content is less viral, some negative emotions are positively associated with virality. More anxiety- and anger-inducing content are both more likely to make the most emailed list. In fact, the most powerful predictor of virality in their model is how much anger an article evokes."
Anger is the most powerful predictor of virality. Content which incites anger is the most likely to be shared online.
This is where Facebook's defense of its systems, and of its News Feed algorithm in particular, gets a little shaky - in another section of NBC's interview with Mosseri, he notes:
"[Facebook invests] more than anybody else does in these problems. You can disagree with specific policy decisions or enforcement decisions. But people who accuse us now of not having good intent, of not actually trying to take our responsibility seriously and not investing appropriately to fix those challenges, are just not looking at the actual facts."
And that's true to a large degree. But Facebook is also a business, and one that's seen continued growth, year on year. And as all media outlets know, driving emotional response is what maximizes engagement. Facebook knows this too, and with anger driving the most engagement, it seems logical that Facebook would, at the least, be happy to turn a blind eye to the potential negative impacts, given that such content helps the platform drive more interaction and time spent.
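To make that incentive concrete, here's a minimal, purely hypothetical sketch - not Facebook's actual News Feed code - of what happens when a feed is ranked solely by predicted engagement. The posts, scores and weights below are invented for illustration; the point is simply that if anger-inducing content reliably earns more reactions, shares and comments, an engagement-maximizing ranking will surface it by construction.

```python
# Toy illustration only: all posts, predicted numbers and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_reactions: float   # hypothetical model estimates
    predicted_shares: float
    predicted_comments: float

def engagement_score(post: Post) -> float:
    """A simplistic engagement objective: a weighted sum of predicted interactions."""
    return (1.0 * post.predicted_reactions
            + 2.0 * post.predicted_shares     # shares weighted higher, as they spread content further
            + 1.5 * post.predicted_comments)

feed = [
    Post("Cute photo of my dog", 120, 4, 10),
    Post("OUTRAGE: you won't believe what they just did", 300, 90, 150),
    Post("Local bake sale this weekend", 40, 2, 5),
]

# Ranking purely by predicted engagement puts the anger-bait post on top.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.text}")
```

Any fix - down-weighting anger-driven interactions, for example - would mean deliberately trading away some of that engagement, which is exactly the tension described here.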
Maybe that's why Facebook has been keen to push groups so hard in recent times - if more people take their divisive conversations into private groups, that limits exposure to those discussions in the News Feed, which would enable Facebook to benefit from the emotional, often angry, responses to such posts while also lessening the broader impact on less interested users.
You might not have a major stake in, say, the presidency of Donald Trump, but seeing a misguided post from your uncle supporting Trump could trigger an emotional response. Shift that uncle's posting into a private group and the exposure risk is lessened.
Maybe that's part of Facebook's groups strategy?
It's difficult to say exactly how much influence Facebook has in this respect, or how the problem can be resolved, but with so much of the internet being fueled by divisive, sensationalized content, it's little wonder that political division has become so extreme. And Facebook, no matter how its execs might look to spin it, does play a significant role in this.
Could this be resolved by changing the algorithm? By removing the algorithm altogether? What impacts would that then have on Facebook engagement?
The balance of responsibility versus benefit is a significant consideration in this respect.