MaryGrace Lerin

Meta Shares New Information About Its Efforts to Detect Coordinated Manipulation Programs

Meta has released some new details about its unrelenting efforts to counter coordinated misinformation networks operating across its platforms, which became a significant concern for the company after the 2016 US election and disclosures that Russian-backed teams attempted to influence American voters' opinions.


According to Meta:


“Since 2017, we’ve reported on over 150 influence operations with details on each network takedown so that people know about the threats we see - whether they come from nation states, commercial firms or unattributed groups. Information sharing enabled our teams, investigative journalists, government officials and industry peers to better understand and expose internet-wide security risks, including ahead of critical elections.”


Meta provides a monthly round-up of the networks it has identified and removed, drawing on automated detection, user reports, and collaborative investigations, broadening its net in the process.


And, over time, some intriguing trends have emerged in Meta's enforcement data. For starters, Meta has released this summary of the countries of origin of the groups it has identified and taken action against.


As seen here, while various operations have been traced to within Russia's borders, there has also been a concentration of activity originating in Iran and neighboring territories, and Meta has recently taken action against groups operating in Mexico.

But Meta's data on the regions these groups have been targeting is even more intriguing, showing a clear move away from foreign interference and toward domestic misinformation campaigns.


As indicated in these graphs, there has been a major shift away from worldwide pushes, with localized operations becoming more common, at least according to what Meta's teams have detected.


This brings us to the other side of the data: those seeking to use Meta's platforms for such purposes are continually changing their tactics to avoid detection, and more groups may still be operating beyond Meta's reach, so this may not be a complete picture of misinformation campaign trends.


However, Meta has been stepping up its game, and that appears to be paying off, with more organized misinformation campaigns being exposed and greater action being taken to hold those responsible accountable, deterring similar schemes in the future.


But, in reality, it will continue to happen. Facebook has around 3 billion users, Instagram has over a billion (reportedly more than 2 billion, though Meta has not confirmed this), and that's before you even include WhatsApp, which has over 2 billion users. At such scale, each of these platforms offers significant potential for amplifying politically driven messaging, and as long as malicious actors can exploit that potential, they will keep looking for ways to do so.


That is a consequence of running such popular networks, and one that Meta either disregarded or refused to see for a long time. Most social networks were built on the premise of connecting the world and bringing people together, and that founding ethos is what drives their developments and processes, all in the name of improving society through greater global understanding.

That's a worthy endeavor, but social platforms also allow individuals with ulterior motives to connect and develop their networks, as well as spread their potentially harmful messaging across those same networks.


The conflict between idealism and reality has long vexed social platform CEOs, who, time and again, choose to prioritize the potential good over all else. Crypto networks today find themselves in a similar situation, with enormous potential to connect the world and bring people together, but also the potential to facilitate money laundering, large-scale scams, tax evasion, and worse.


It's tough to strike the appropriate balance, but as we've learned from past mistakes, the consequences of failing to notice these flaws can be devastating.

That's why these efforts are so crucial, and it's intriguing to watch both Meta's teams stepping up their push and bad actors evolving their techniques in response.


My opinion? Having seen how Russian groups attempted to influence the US election, domestic groups have tried to apply the same strategies at a local level, which suggests that prior enforcement has unwittingly exposed how Meta's platforms can be used for such purposes.

That's likely to remain the case in the future, and Meta's improving measures should help to effectively detect and remove these activities before they have a chance to take effect.


Meta's Coordinated Inauthentic Behavior Report for December 2021 can be found here.
