
Facebook won't promote health groups, but won't ban bad ones either

The social network is trying to limit the reach of bad health advice but is putting the onus on group administrators to do it.


Facebook has an unsettling misinformation problem. Analysts and fact-checkers have repeatedly condemned the platform for allowing hoaxes, conspiracy theories, pseudoscience, and more to thrive without consequence. In an effort to mitigate that issue, Facebook has announced it will get tougher on one specific element of its own network: promotions. The social network says it will no longer recommend or promote health groups on its platform.

The decision comes after multiple groups and ads peddled false claims about curing the coronavirus, cancer, and autism (which doesn't need "curing"), and after groups spread medically false statements about vaccination.

According to Facebook, it also took down 1.5 million "pieces of content" — presumably videos, photos, infographics, posts, and the like — from groups found to be promoting organized hate in 2019. The company also removed 12 million pieces of content from groups violating its hate speech policy, enforcement of which already drew criticism during the fatal Kenosha shootings this year. Removing millions of pieces of digital detritus is great, but how about instituting and enforcing policies that disallow it in the first place?

Health groups get the boot — According to the company, Facebook groups do have the potential to encourage community organizing, dialogue, and understanding... but they can also be sources of misinformation.

"It's crucial that people get their health information from authoritative sources. To prioritize connecting people with accurate health information, we are starting to no longer show health groups in recommendations," the company said. "People can still invite friends to health groups or search for them."

More rules for groups — If a group is found violating the network's policies around speech and content, Facebook will temporarily block its members and administrators from creating other groups, which could help curb the mushrooming of copycat pages. Additionally, groups left without administrators will ultimately be archived.

More moderation — On a more individual level, Facebook intends to strengthen its approval mechanisms by targeting users who have violated community standards. If a user has violated a community guideline within a group, their posts to that group will require administrator approval for the following 30 days. This is supposed to slow down repeat offenses and involve human moderation to a greater degree.

There's no guarantee that these new rules will magically cure Facebook's status as a contagious vector for misinformation and bogus theories. And it's a classic Facebook move to put much of the onus of moderation on group administrators rather than taking on the responsibility itself. But a little effort in the right direction, especially when it comes to matters of health, could help slow down the spread of false claims. And we guess that's something.