Just 12 People Are Behind Most Anti-Vax Disinformation On Social Media
A newly published report from the Center for Countering Digital Hate (CCDH) and Anti-Vax Watch has discovered that just 12 individuals are behind the majority of anti-vax disinformation spread on social media.
This report, which analysed content posted or shared via social media more than 812,000 times between February 1 and March 16, 2021, found that 65% of anti-vax content could be traced back to the so-called ‘Disinformation Dozen’.
Furthermore, 73% of all anti-vax content posted or shared on Facebook over that period could be traced back to these 12 individuals. Much of the disinformation shared by the Disinformation Dozen remains on mainstream social media platforms, even after repeated terms-of-service violations.
According to the report, alternative medicine promoter Joseph Mercola – who was recently issued a Food and Drug Administration (FDA) warning over his bogus coronavirus treatments – is the largest anti-vax influencer.
Robert F. Kennedy Jr., nephew of the late President John F. Kennedy, was also found to be one of the biggest spreaders of anti-vax content.
Kennedy was recently banned from Instagram for violating the platform’s coronavirus vaccine misinformation policy. However, despite calls to deplatform him from Twitter and Facebook, his accounts remain active on both sites.
The other anti-vaxxers listed in the Disinformation Dozen include Ty and Charlene Bollinger, Sherri Tenpenny, Rizza Islam, Rashid Buttar, Erin Elizabeth, Sayer Ji, Kelly Brogan, Christiane Northrup, Ben Tapper and Kevin Jenkins.
Research carried out last year by CCDH found that platforms fail to act on 95% of coronavirus- and vaccine-related misinformation that is reported to them, while CCDH’s recent report, Malgorithm, found evidence that ‘Instagram’s algorithm actively recommends similar misinformation’.
CCDH CEO Imran Ahmed said:
Facebook, Google and Twitter have put policies into place to prevent the spread of vaccine misinformation; yet to date, all have failed to satisfactorily enforce those policies.
All have been particularly ineffective at removing harmful and dangerous misinformation about coronavirus vaccines, though the scale of misinformation on Facebook, and thus the impact of their failure, is larger.
Going forward, the report advises that deplatforming repeat misinformation offenders is the most effective way of putting a stop to ‘the proliferation of dangerous misinformation’.
This deplatforming should also include organisations that are controlled or funded by these individuals, as well as any backup accounts established as a means of evading complete removal from sites.
Credits: Center for Countering Digital Hate (CCDH)