Despite claiming that censoring dangerous or malicious content, political propaganda, bald-faced lies, conspiracy theories, and other misinformation would be antithetical to its role as an enabler of free speech, Facebook is willing to make some exceptions to its own rules. Anything that could expose the social media company to legal or regulatory consequences, regardless of whether or not that content is in any way illegal, could soon be removed from the platform, at its discretion.
Technically right, if morally wrong — This week, Facebook issued a statement to users explaining that it’s totally within its rights to restrict, take down, or otherwise block content it deems potentially harmful to its overall goals.
“We also can remove or restrict access to your content, services or information if we determine that doing so is reasonably necessary to avoid or mitigate adverse legal or regulatory impacts to Facebook,” reads a portion of the statement sent out to users.
Pity Facebook didn't grow a backbone sooner — And the company is totally right on that score, at least here in the States. Facebook is a private entity. It can, for the most part, choose at its own discretion what is displayed on its platform. It’s just that this also happens to be a complete reversal of the position Facebook CEO Mark Zuckerberg has held since practically the beginning of his rise to Bond villain status.
Despite repeated pleas from both users and government regulators, Facebook is still rife with COVID-19 misinformation, political bile, QAnon screeds, and direct, organized calls to violence, often invoking the umbrella term of “free speech” to justify delayed action and half-measures. For Facebook to limit content only when it could affect its business is as despicable as it is on-brand.
The impact for users is unclear — The changes in policy, set to go into effect on October 1, are seen as a direct response to Facebook’s ongoing spat with Australian authorities over a proposed law that would require the social media corporation to pay media entities for sharing their articles on the social network (Google is facing the same challenge). Facebook says its new regulation criteria will affect users globally, providing a thin veil of cover for it to pull content in response to its standoff with Australia.
It remains to be seen just how the company will enforce the new measures, and how strictly it will do so. The phrasing in Tuesday’s statement is so broad that it’s difficult to envision how much content could fall under the category of “legal or regulatory impacts to Facebook.”
In any case, it comes as no surprise that the company has given far more thought to threats to its business model and to “neutralizing” competition than to the possibility of Trump contesting the presidential election results if he loses. Because Facebook may be many things, but concerned with the greater ethical implications of its enormous reach isn't one of them.