Roundabout team

YouTube restricts videos about COVID-19/5G conspiracy theory

YouTube has confirmed that it will reduce the recommendation and distribution of videos that promote conspiracy theories linking the spread of COVID-19 to 5G technology.


This comes after a spate of attacks on cell phone towers in some regions - according to The Guardian, signal towers in Birmingham, Merseyside and Belfast in the UK were set on fire in the last week, in attacks that have been linked to the spread of the theory. Mobile carrier workers have also reportedly been subjected to abuse due to concerns about 5G's role in the pandemic.


The circulating rumor suggests that 5G signals exacerbate the spread of the virus. The core of the theory relates to the use of 5G in Wuhan, where COVID-19 originated - Wuhan, so the theory goes, was also the first region in China to get full 5G coverage, which was significantly ramped up in October last year, ahead of the outbreak. Scientists have debunked the idea, noting that many regions of China beyond Wuhan have 5G coverage, while COVID-19 is also spreading fast in many regions that don't yet have 5G infrastructure. Yet the theory has been gaining momentum, with even celebrities like actor Woody Harrelson re-sharing the concept.


YouTube says that it will remove any content that violates its Community Guidelines, while it will also significantly reduce the reach of 'borderline' videos, which push conspiracy theories but don't cross the line.


The controversy is the latest in a string of content headaches of this type for the platform - if you're looking for conspiracy theories and internet rabbit holes to tumble down, YouTube is likely where you'll eventually end up.


The online video giant has become known for hosting fringe content, while its algorithmic recommendations can drag people further in, reinforcing such ideas by surfacing more of the same.


Indeed, last year, The New York Times profiled a 26-year-old man who had been 'radicalized' by YouTube content, highlighting concerns with the platform's 'Up Next' prompts, which, he says, lured him deeper and deeper into violent, extremist views.


YouTube has been working to address this. Last January, YouTube announced that it would limit recommendations of content which came close to violating its Community Guidelines, but didn't quite cross the line. The examples YouTube provided in that instance were videos relating to miracle cures and conspiracy theories, including 9/11 conspiracies and 'flat earth' claims.

YouTube further outlined improvements to its recommendations algorithm to reduce such impacts in June, while it also ran a test that hid all video comments on the platform by default, with users needing to tap a button to view any related discussion.


That test sought to address another element of concern, relating to predatory behavior on the platform, but the changes overall show that the problems of indoctrination and radicalization are a serious concern - and with more than 2 billion monthly active users, YouTube's influence in this regard can be significant.

