Facebook removes more accounts due to manipulation efforts
With the US Presidential Election looming, Facebook has this week outlined its latest round of account removals due to 'coordinated inauthentic behavior' - or, in other words, groups that have sought to use Facebook's tools to manipulate Facebook's users and their subsequent activities.
This latest set of removals, which includes operations originating from Canada, Brazil, and Ukraine, also covers a group connected to the far-right Proud Boys in the US, which Facebook initially banned back in 2018.
As per Facebook:
"The people behind this activity used fake accounts - some of which had already been detected and disabled by our automated systems - to pose as residents of Florida, post and comment on their own content to make it appear more popular than it is, evade enforcement, and manage Pages."
The group also shared conspiracy theory content, spreading misinformation through its networks.
Facebook says that the Pages and profiles had seemingly purchased followers to inflate their presence. In total, the banned cluster consisted of 54 Facebook accounts, 50 Pages, and 4 Instagram accounts.
"Around 260,000 accounts followed one or more of these Pages, and around 61,500 people followed one or more of these Instagram accounts."
That's significant - but more notable still, Facebook says that this cluster of accounts spent "less than $308,000" on Facebook and Instagram ads.
"Less than $308,000" seems like an odd way to put it - these Pages spent a huge amount on promoting their posts and pushing their various agendas.
To get a sense of the impact of such activity, I entered $308k as my ad budget in Facebook Ads Manager just now, and it estimated that my ad would reach around 17.8 million people. And while there's more to it than that (relating to targeting, scheduling, etc.), spending that amount of money would have enabled this group to reach a lot of people, spreading misinformation and hate speech throughout the platform.
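As a rough sanity check on those figures - both numbers are the ones quoted above, and the per-person cost is just a simple division, not an official Facebook metric:

```python
# Back-of-envelope check on the quoted figures.
# Both inputs come from the article; the per-person cost is a
# plain division, not a Facebook-reported metric.
ad_spend = 308_000            # reported ad spend ceiling, in USD
estimated_reach = 17_800_000  # Ads Manager reach estimate for that budget

cost_per_person = ad_spend / estimated_reach
print(f"Implied cost per person reached: ${cost_per_person:.3f}")
# Implied cost per person reached: $0.017
```

At under two cents per person reached, even a modest fraction of that budget buys a very large audience.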
Facebook has removed the group, so this particular issue has been addressed. But those loose estimates once again underline the potential scope of such activities, and how Facebook can be weaponized for indoctrination through skewed views.
As noted, Facebook banned the Proud Boys back in 2018 after designating them a hate group. More recently, Facebook has taken further action against Proud Boys-linked groups, and other right-wing organizations, in response to content and activity around the #BlackLivesMatter protests. That, in some ways, represents a more proactive approach to hate speech by The Social Network - but Facebook has also come under fire for allowing controversial comments from US President Donald Trump to remain up on its platform, which many say is still facilitating hate speech.
Facebook has yet to shift its stance on this, but the pressure continues to mount. Currently, Facebook is in the midst of an advertiser boycott, which will cost it millions, or more, over the course of 2020, while this week, a Facebook-commissioned civil rights audit savaged the company's handling of Trump's comments.
Given the opposition to its approach, Facebook may still decide to take more action on hate speech, in all forms, and from all users - and as this latest finding of a comparatively small cluster of activist accounts shows, any action at all on this front is important.
The fact is that Facebook's distribution system favors extremist groups, as their messaging is more incendiary, more biased, and more likely to spark debate and argument. Facebook would term this 'engagement', and its entire ecosystem is built around fueling that activity.
Given this, and Facebook's massive scale and reach, every step the company can take to address the problem is significant in the battle against divisive groups. As the community responses to the COVID-19 pandemic, and the official directives around it, have shown, this is about more than facts and reason: the movements being formed are guided by political idealism beyond logical stances, and, in large part, by misinformation that twists the facts to support certain agendas.
Would such opposition gain so much traction if posts like this weren't getting millions of clicks on Facebook?
Now consider this - would the same information gain such traction if the President of the United States, for example, had opted to take a more definitive stance on mask use?
And then, what role is Facebook, which has 1.7 billion daily active users, playing in amplifying such commentary?
Even on a smaller scale, the impact can be massive, and Facebook, whether it likes it or not, needs to assess its part in the distribution chain that fuels such movements.