Citing its Dangerous Individuals and Organizations policy, Mark Zuckerberg's company is giving the boot to multiple groups associated with the pro-civil-war Boogaloo movement. On Tuesday, the social network announced its decision to actively remove content related to the ideology. "This is the latest step in our commitment to ban people who proclaim a violent mission from using our platform," the company said.
The anti-government movement promoted by far-right extremists, which recently had its server removed from Discord, calls for violence against law enforcement authorities and civilians, and seeks to stoke division among various communities in the United States. "Members of this network seek to recruit others within the broader boogaloo movement," the company announced, "sharing the same content online and adopting the same offline appearance as others in the movement to do so."
Restriction and removal — Facebook described the network as spreading anti-Semitic, racist, and incendiary content not only on its main social platform but also on Instagram. To limit its poisonous spread, the company announced the following steps:
- The company removed 220 Facebook accounts affiliated with the Boogaloo movement.
- On Instagram, it removed 95 accounts associated with the ideology.
- On Facebook, 28 pages and 106 groups got the boot.
- On top of that, pages that appeared to cosign such ideas were also removed. Specifically, per Facebook, 400 additional groups and more than 100 such pages were taken off the platform.
- Facebook adds that over the past two months, it has removed more than 800 posts that violated its guidelines against violence and incendiary content.
Should Facebook expect backlash? — As with any content removal by a major social media network, you can expect those on the receiving end of these policy decisions to criticize the company and even claim victim status, as we've previously witnessed with those who believe tech has an anti-conservative bias.
Some will view Facebook's decision as unfair and uncalled-for, and the company appears to be aware of this potential backlash, noting that the removals targeted pages, accounts, and posts that posed, in its view, "the greatest risk of real harm." Whether this has a positive effect on mitigating violent rhetoric on the platform — where Zuckerberg has allowed incredibly violent and racist content to run largely unchecked, with the occasional removal after public outcry — remains to be seen.