Facebook Pulls Australian Child Abuse Video

It's hard to walk the line between 'careful' and 'censorious.' 

Facebook’s efforts to become the future of news are clashing with the rules that have allowed it to be the social network of choice since it supplanted MySpace.

Nowhere is this conflict more evident than in the recent decision to pull an ABC video showing child abuse in Australian youth detention facilities. The video is clearly in the public’s interest — people typically want to know when prisoners are being mistreated, especially when that abuse targets children. Yet it was removed from Facebook’s site because it depicted an 11-year-old boy’s naked butt, running afoul of the site’s rules about showing child nudity.

Few would argue that Facebook shouldn’t have such a rule. But there is a clear difference between child porn, which depicts a minor’s body in a sexual way, and a news video showing that an 11-year-old was stripped naked by prison workers in his cell, one of a series of incidents in which guards at the youth detention center attacked the boy.

Facebook told Mashable that it won’t restore the video. “The second video does contain child nudity and so we cannot restore it,” a spokesperson said. “Our Community Standards do not allow any nudity of minors to be shared on our services, even if they are shared with the purpose of condemning it. We review millions of reports each week and from time to time we make a mistake and work to rectify this where we become aware of this.” Yet the company doesn’t consider this a mistake.

Another recent example is Facebook blocking links to the WikiLeaks data dump of Democratic National Committee emails. The company said the issue was caused by a now-resolved error in its anti-spam filter. (WikiLeaks has not threatened to create its own Facebook in response to the problem — maybe it’s too busy making its Twitter clone at the moment.)

In both cases, Facebook promptly responded to the problems. It did the same when it temporarily removed a video showing the shooting of Philando Castile on July 7. That video was later restored, and Facebook Live quickly became the go-to news source for updates on Castile’s case and the Dallas police shooting.

Every time Facebook censors news stories like this — even if it’s in error and the problem is later fixed — it undermines its ambition to become the future of news. Media outlets telling important stories don’t need yet another obstacle in their path to letting the public know what’s happening in the world around them. Yet these rules are unlikely to change; Facebook depends too much on its community guidelines to keep what its users post from scaring people off the site.

So what are news organizations to do? Well, in ABC’s case, it edited the video to remove the nudity and published an updated — but still graphic — version.

Others might not have the luxury of changing their story (or at least how their story is told) to comply with Facebook’s rules. It’s time for the site to choose: Does it want to hold news organizations to the same standards as its normal users, or does it want to give the press the freedom to tell uncomfortable truths without fear of having their stories removed from the world’s most popular news service?