
Facebook is full of anti-vaxxers spreading fake information

What should Facebook’s role be in countering anti-vaccination conspiracy theories?

Image: BSIP/UIG via Getty Images

In the case of anti-vaxxer activity on Facebook, “going viral” can be deadly.

A new investigation from The Guardian has uncovered how Facebook is a hotbed of anti-vaxxer conspiracy theories. The Guardian went inside private Facebook groups with hundreds of thousands of members to see how anti-vaxxers peddle fake remedies, manipulate worried parents, and spread conspiracy theories. And so far, Facebook isn’t doing a damn thing about it.

Fake anti-vaccine science has resulted in the re-emergence of previously eliminated deadly diseases and the subsequent deaths of children. Messaging supporting those fraudulent claims, with no scientific counter-information, flourishes in private and secret groups with thousands of members.

Facebook has also reportedly accepted “thousands” in advertising money from groups such as Vax Truther, Anti-Vaxxer, Vaccines Revealed, and others. The company did not provide comment for the Guardian’s report.

Facebook has recently mounted a major operation to counter misinformation on its platform. However, the campaign has primarily targeted fake news around elections, politics, and the inflammatory actions of foreign operatives meant to sow discord on issues like immigration and race.

But the Guardian’s report shows that misinformation around health on the platform could be just as dire. And Facebook has not publicly focused its initiatives on this aspect of fake information.

The anti-vaccination conspiracy theory that vaccinating children can cause autism came to the fore in the early 2000s, after a discredited doctor published a fraudulent study. Championing of the issue by some celebrities, along with well-organized groups on social media, helped the conspiracy theory gain traction and drove an increase in non-medical exemptions from mandatory vaccinations. As a result, cases of measles, a disease previously eliminated in many countries, have risen by 30 percent. And the World Health Organization named “vaccine hesitancy” one of the top ten threats to global health in 2019.

What should Facebook’s role be in all of this? 

Doctors tell the Guardian that Facebook should have the same standards for health information that pharmaceutical companies and advertisers do. At the very least, Facebook needs to do more to shut down these harmful groups, or remove misinformation, doctors and advocates say. In this case, fake news is — once again — deadly.

But as Facebook’s fight against general misinformation has shown, policing harmful content is an issue with a thornier solution than simply banning accounts that spread fake news. Facebook has flagged, labeled, and down-ranked misinformation; it has banned fake accounts; it has partnered with fact checking organizations. But Facebook acknowledges that the fight against fake news is a game of cat and mouse — because it can’t stop individuals from sharing false information, a clear, proactive solution remains elusive.

With the lives of children at stake, should terms of service and free expression be damned? Or should Facebook maintain its stance that it is a “platform for all ideas” — even when they result in, well, death and disease? 

That slogan doesn’t sound so rosy when you put it that way.

