Facebook's Plan to Stop Misleading Health Claims Makes a Crucial Omission

One key area isn't mentioned in the announcement. 


Facebook recently announced further actions to curb the spread of sensational health claims and misinformation on the social media platform.

Specifically, the company made two ranking updates to reduce the reach of “posts with exaggerated or sensational health claims” and “posts attempting to sell products or services based on health-related claims,” according to the company’s blog post. It’s a worthy change, and one that will hopefully help limit the spread of anti-vaccine and other propaganda that poses a real threat to public health.

Unfortunately, these changes, as announced, ignore some of the most fertile ground for misinformation: Facebook Groups, which don’t appear in the blog post at all. (We reached out to Facebook for comment about the role of groups in this crackdown and will update if and when we hear back.)

Facebook Groups are the face of the company’s pivot to privacy. On April 30, 2019, Facebook announced a bigger push into groups, ostensibly to foster more intimate and private connections. It even said it would create specialized feeds so people could see what was happening in each group they belong to. It’s not clear, however, what measures these revamped groups will take to stem the spread of misinformation.

Does Social Media Have a Misinformation Problem?

Facebook, along with other social media companies like YouTube, has long been a hotbed of activity for anti-vaxxers, extremists, and people who disseminate racist content.

Indeed, half of all parents with small children have been exposed to misinformation about vaccines on social media, according to a recent study from the Royal Society for Public Health.

Facebook's blog post addressing sensational health claims. 


Facebook has also been called a “key disseminator” of anti-vaccine information.

Facebook has incrementally introduced features to try to counter misinformation on the platform, including a change introduced on April 10 that penalizes pages sharing false news by making them appear less readily in news feeds. It does not ban these pages, but it does keep them out of recommendations and makes them generally harder to find.

Facebook’s New Health Announcement Leaves Out Their Groups Feature and That’s a Problem

A Wall Street Journal investigation published Tuesday found that Facebook and YouTube are flooded with posts promoting unproven, and sometimes dangerous, alternative cancer therapies. Facebook published its post on sensational health claims the same day.

But the post failed to mention any action being taken in groups, particularly closed groups, where less savory activity takes place. That’s a problem, because content shared in closed groups is less likely to be seen by someone who might report it.

A sampling of top group results when searching "health" and "vaccine" on Facebook. 


But it’s not just closed groups that spread misinformation. Searching “health” and “vaccination” in Facebook Groups brings up a list of groups, including some like “Vaccine Injury Stories” (which is public) and “Tongue Ties, Autism, MTHFR, Vaccines, Leaky Gut - What’s the Connection?” (which is private).

Together they have over 45,000 members and circulate posts claiming that a “pediatrician lies about vaccine side effects” and that “Vitamin K shots cause an ear infection,” urging parents to refuse the shot because “if a baby needed vitamin k they would have it.” The header of another post reads “DTaP Vaccine Causes Brain Damage in Ten Month Old After 12 Hours.”

Including those two, three of the top six groups shown questioned the legitimacy of vaccines.

It’s good to see Facebook take steps to counter the vast array of misinformation on the site. But when it comes to false or sensational health claims, failing to address groups, particularly public ones with thousands of members, is a key oversight.
