
Facebook to remove antivaxx posts

(Image credit: Shutterstock)

Facebook has gone from 0 to 100 when it comes to its content moderation. In addition to flagging false election and COVID-19 information, the platform will now be removing false claims about the upcoming coronavirus vaccines. 

Facebook released its statement as leading vaccine candidates approach their final trial stages and approval for release.

These vaccines are key in the fight against the virus: vague hopes of naturally developing herd immunity have proven ineffective, and multiple waves of infection continue to sweep across the globe.

The company has outlined that its new rules will cover false claims regarding the “safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list."

This is a bold move, as the antivaxx (anti-vaccination) community has a strong presence on the social media site, with many groups dedicated to it.

Facebook said it will update the rules as new issues around the vaccine arise. This is a smart move, as the antivaxx movement has impressive traction and PR, making it a shape-shifter when it comes to arguments for its false position.

How effective this enforcement will be remains to be seen; the site has had issues with content moderation in the past, letting false or harmful information slip through.

However, Facebook has teamed up with fact-checking organisations worldwide to help in this fight. In South Africa and Africa, the company has partnered with Africa Check, the leading fact-checking site in the region. 

Why false information needs to be blocked

The issue with misinformation regarding the COVID-19 vaccine is that widespread protection against the virus will only occur if everyone who is able receives an FDA-approved shot.

This is how herd immunity is created. Since some people can't take the vaccine for medical reasons, those who can need to be vaccinated in order to keep them safe and to eliminate the virus from the human population completely.

In the past, viruses have been eradicated entirely through vaccination. Smallpox was eliminated following a global vaccination campaign, with the last known naturally occurring case recorded in Somalia in 1977.

Moderation and Free Speech 

Those in the antivaxx community and others like it often call this kind of content moderation a stifling of free speech. In the US this argument is especially prominent, although constitutional free-speech protections there restrict government censorship rather than moderation by private platforms.

However, Facebook and other sites have argued that stopping the spread of false information is a matter of public safety, which they consider to outweigh free-speech concerns.

During the pandemic, there has been a marked increase in conspiracy theories and misinformation on these sites, ranging from antivaxx theories to the expansion of QAnon. This has pushed platforms to be more hands-on about the content being shared.

Effectiveness of content moderation 

A new concern is whether content moderation and warnings are even effective. 

According to internal data from Facebook, placing warnings over coronavirus misinformation posted by Donald Trump did little to stop its spread.

This raises a new question about how far sites have to go to stop these theories in their tracks. If a misinformation label is not enough to deter people, is removing the content entirely the answer?

Leila Stein

Leila Stein is an experienced multimedia journalist and content producer with a special interest in data journalism. She is skilled in news writing, editing, online writing and multimedia content production, and holds a Bachelor of Journalism from Rhodes University and an Honours in Historical Studies from the University of Cape Town.