Among Facebook’s extensive lineup of public scandals over the last few years, its inability to control the spread of misinformation across the platform is by far its most consistent. That’s why it adds new “tools” for fighting falsehoods every few months; the latest is a set meant to help Group admins get a grip on members’ misinformation campaigns.
Some of these new tools are automated, allowing Group admins — who are, for the most part, volunteers — to take a more hands-off approach to monitoring their communities for misinformation. Others are meant to extend the control already given to admins.
While Meta’s content moderation does run through Groups, Facebook’s approach to safety relies heavily on admins. Facebook builds tools to help admins with daily moderation, but there’s always the chance misinformation will fall through the cracks, especially in Groups with very active communities. Adding a few more moderation tools should ostensibly help them out.
There’s a larger problem here, though, and it’s one Facebook can’t really change: Its Group tools will only ever be as effective as the admins using them.
Finally some help — Administrator tools have been enough to cover the majority of Group-moderation functions, but they haven’t really been updated in a while. Facebook, meanwhile, has continued to evolve; these new tools have been a long time coming.
First and foremost is a new feature allowing admins to let Admin Assist automatically decline new posts that have been flagged for misinformation. This could significantly reduce both the admin’s workload and the Group’s misinformation content.
Admins will also now have the option to suspend Group members, temporarily banning them from posting, commenting, and reacting. This should, if used well, help rein in repeat violators of Facebook’s policies.
The Admin Assist feature can now also automatically approve or decline new member requests based on specific criteria; this should lighten the admin’s workload somewhat. Similarly, the redesigned Admin Home includes sections that make it easier to see what requires attention.
Is this futile? — Expanded Group moderation tools are a very welcome addition to admins’ existing arsenal. The larger problem of moderating Facebook’s 600 million or so Groups is just as intractable as ever, though.
For these tools to be effective, admins will have to actually use them, and that’s the problem. Group admins are not vetted by Facebook; they could be anyone. What if, for example, an admin doesn’t believe Facebook’s fact-checking labels are accurate? They’re unlikely to enable the new auto-moderation options. Those who do turn the feature on are likely the same admins already keeping an eye out for misinformation.
Group moderation can only ever be as wise as its admins. These new tools won’t change that.
Correction: Facebook’s moderation tools do run through Groups; an earlier version of this article stated Groups are not touched by Facebook moderation at all. We regret this error.