The Future of Porn

Pornhub changes core policies in response to child exploitation allegations

A ‘New York Times Opinion’ piece has shaken up how the company does business.


After several days of colossally bad press, Pornhub’s parent company MindGeek has changed the way its stable of pornography websites operates. Users must now be verified in order to upload videos, and downloads are banned. These changes, along with other safeguards, aim to curb the presence of illegal content on its platforms.

Though the update appears to be several months in the making, it comes on the heels of New York Times Opinion columnist Nicholas Kristof’s harrowing article detailing how Pornhub, in particular, monetizes child exploitation, rape, and trafficking.

What’s changing? — In addition to Pornhub, MindGeek operates several popular pornography websites including RedTube and YouPorn. Pornhub’s announcement outlines seven ways it will now (or has already started to) combat the spread of videos portraying illegal acts.

For now, only those in Pornhub’s Model Program can upload videos; the company says it will implement a verification process for other uploaders next year. Downloads are banned unless the video is part of the paid Model Program. These videos will be “fingerprinted” using the platform’s existing blend of moderation technologies:

CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online
Content Safety API, Google’s artificial intelligence tool that helps detect illegal imagery
PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation
Vobile, fingerprinting software that scans new uploads for potential matches to unauthorized material, protecting against banned videos being re-uploaded to the platform.

A Motherboard investigation from February found that fingerprinting can be circumvented through editing. To catch what the tech and Pornhub’s existing human moderators can’t, the company added another layer of moderation in recent months called the “Red Team.” The team focuses specifically on illegal content and, combined with the upload and download restrictions, could actually help clean up the site and provide some closure for victims.
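The proprietary systems above don’t publish their internals, but most video and image fingerprinting boils down to reducing a frame to a compact perceptual signature and comparing it against a database of known illegal material. The minimal Python sketch below, using a toy “average hash” and hypothetical file names, is not how PhotoDNA or Vobile actually work, but it illustrates the general idea and why simple re-encoding usually still matches while heavy cropping or editing can slip past the threshold:

```python
# Toy perceptual-hash fingerprinting ("average hash"), for illustration only.
# The real, proprietary systems are far more robust than this sketch.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the frame, grayscale it, and set one bit per pixel
    depending on whether that pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical usage: compare a frame from a new upload against a known fingerprint.
# A small distance (say, <= 10 of 64 bits) suggests a near-duplicate; heavy cropping,
# added borders, or mirroring can push the distance past the threshold, which is the
# kind of evasion the Motherboard investigation describes.
known = average_hash("banned_frame.jpg")        # hypothetical file names
upload = average_hash("new_upload_frame.jpg")
if hamming_distance(known, upload) <= 10:
    print("Likely re-upload of fingerprinted content")
```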

Pornhub also highlighted its Trusted Flagger Program with 40 international non-profit organizations that can report videos and its partnership with the National Center for Missing & Exploited Children (NCMEC). The NCMEC and Pornhub will release separate reports on the company’s moderation efficacy next year, and a law firm retained by the company in April will specifically review its compliance with legal standards.

Something had to give — Pornhub is essentially the YouTube of porn (or maybe it’s the other way around). Prior to today, anyone could upload videos, and moderators then played whack-a-mole with offending or illegal content. That free-for-all, combined with the ability to download any video, allowed videos of non-consensual acts to proliferate, leaving victims with little to no recourse.

Pornhub’s public face and actual practices have long been at odds, with racist content juxtaposed against promotional gimmicks and an unslain beast of disturbing, non-consensual videos forever lurking in the shadows. It would be nice to think that Canadian protests (MindGeek operates out of Canada) or the visceral crafting of victims' courageous stories in Kristof’s article forced the company’s hand on its upload/download policies, but only money will slay the beast.

Pornhub wants more sophisticated advertisers, and Nike definitely doesn’t want to run ads next to rape videos. Not to mention that Visa and Mastercard immediately launched investigations into the NYT article’s allegations, threatening to join PayPal in ceasing payment processing for the platforms if the allegations were substantiated.

However self-interested the motives, the policy change is a welcome step in the right direction. Now we just have to crack the top porn platform, XVideos.