Culture

Facebook is blocking ads that would help content moderators sue the company

Ads meant to raise awareness of the psychological toll of content moderation are, curiously, being banned.


Facebook has been banning advertisements from a law firm, Coleman Legal, that seeks to help moderators win compensation and representation for their often traumatic work, according to Vice.

The outlet reports that Facebook has been blocking ads that provide users with more information on how to receive legal help if they are content moderators. These ads would target moderators who sift through Facebook content and are employed by third-party companies, linking to a content moderator compensation claim form on the law firm's website.

The blocking isn't uniform, to be clear. As Vice notes, one ad ran while another was blocked. Both contained the same text:

Over the last few years, there has been an unprecedented increase in Content Moderators being diagnosed with PTSD. This significant rise has been imputed by daily exposure of disturbing content, bestiality, drug abuse, animal cruelty, violence, and more which is shockingly enough, a crucial part of their job requirement. Get all concerns addressed with an experienced solicitor by filling the Free Case Evaluation Form. Coleman Legal has a dedicated team of experienced solicitors who can offer help, advice, and tailored guidance. Compensation for this claim may be the first step on your road to recovery.

Inhumane conditions — Facebook has been scrutinized over the years for its treatment of moderators and its handling of the psychological toll they face from viewing exceptionally horrific content.

Warning: the following account from one of Facebook's content moderators, referred to only as "Mr. A," is graphic and may disturb some readers.

According to the law firm, Mr. A filed a lawsuit detailing his harrowing ordeal as a content moderator. He claimed that he was repeatedly required to look at videos depicting sexual and physical abuse against children as well as "a compilation of clips showing people dying by suicide, set to music; a collection of hundreds of photos depicting people self-harming; a video of a man being beaten to death with planks of wood; videos and images of beheadings; videos showing people being electrocuted and impaled and videos showing individuals being stabbed in the stomach."

Eventually, after the first lawsuit filed by the law firm, Mark Zuckerberg's company agreed to pay a total of $52 million to current and former moderators. But the effort to protect these workers clearly remains unfinished, as Facebook still reportedly overworks its human content moderation workforce. Blocking ads created to help moderators find relief only makes Facebook look worse.