Health misinformation viewed 3.8 billion times on Facebook in the past year


Health misinformation is still a major problem at Facebook, despite efforts to tackle it, according to a new report.

The study, conducted by global activist network Avaaz, found that health misinformation was viewed 3.8 billion times on Facebook in the past year, peaking at 460 million views in April.

The findings follow a congressional hearing in July which saw major tech bosses – including Facebook’s Mark Zuckerberg – accused of censoring political speech and allowing fake news to spread on their platforms.

Facebook said the results of the study did “not reflect the steps we've taken”. 

The company’s attempts to curb health misinformation have escalated in recent months and now include fact-checked articles in its COVID-19 Information Center plus signposts to the World Health Organisation (WHO) if you interact with fake news.

But Facebook’s algorithm is still under fire, with the report suggesting that only 16 per cent of health misinformation carried a warning label.

Avaaz campaign director Fadi Quran said: “Facebook's algorithm is a major threat to public health.

“Mark Zuckerberg promised to provide reliable information during the pandemic.

“But his algorithm is sabotaging those efforts by driving many of Facebook's 2.7 billion users to health-misinformation-spreading networks.”

The report established that the top ten websites spreading health misinformation had four times as many views on Facebook as information from official sites, such as the WHO.

Quran added: “This infodemic will make the pandemic worse unless Facebook detoxifies its algorithm and provides corrections to everyone exposed to these viral lies.”

Among the “bogus cures” revealed in the study was a claim that colloidal silver – once used to treat syphilis, tuberculosis and Ebola – is a safe alternative to antibiotics. The article in question was viewed 4.5 million times.

Researchers called on Facebook to take steps to provide all users who have seen misinformation with independently fact-checked corrections, and to “downgrade” misinformation posts on users’ News Feeds. 

They said Facebook has yet to “effectively apply these solutions” on a scale sophisticated enough to defeat the infodemic.

In a statement Facebook said: “We share Avaaz's goal of limiting misinformation. Thanks to our global network of fact-checkers, from April to June, we applied warning labels to 98 million pieces of COVID-19 misinformation and removed seven million pieces of content that could lead to imminent harm.”

Facebook isn’t the only tech company struggling to keep health misinformation under control. In May, NewsGuard, a firm that rates the credibility of news sites, found Twitter accounts with large followings posting coronavirus misinformation, despite a COVID-19 misinformation policy announced by Twitter only a month before.

Parent Zone has plenty of resources to help children learn how to spot fake news, including our guide to thinking critically and understanding digital resilience.

For reliable facts about COVID-19, visit Full Fact, the World Health Organisation or the Centers for Disease Control and Prevention.

Image: agcreativelab/
