
Pornhub's first transparency report shows its moderation is working


653,465: the number of videos removed in 2020 for violating Pornhub's policies.


Pornhub released its first-ever transparency report this weekend, with tons of information about the company’s moderation tactics throughout 2020. Those efforts have paid off in spades: 653,465 pieces of content were identified as potentially harmful and removed from the platform last year, including material depicting minors, animal harm, hate speech, and non-consensual acts.

The report is fairly exhaustive, complete with moderation diagrams for each kind of potential content violation. Pornhub uses a mix of human and AI moderators to speed up its content review process, with dedicated teams assessing reports around the clock.

Pornhub’s newfound openness stems directly from the company’s December 2020 privacy and safety overhaul, in which millions of unverified videos were nuked from the site overnight. The company says this is only the beginning of its commitment to user safety — there’s more to come, including cutting-edge technology and a fuller Trust and Safety Center.

Moderation from all angles — For the most part, Pornhub is removing videos that go against its content policies — more than 650K videos fell into this category last year. Videos can be flagged for removal in a variety of ways, including user reports and automated detection. Pornhub now relies on a suite of machine learning tools to flag harmful videos, including Microsoft’s PhotoDNA, YouTube’s CSAI Match, and Google’s Content Safety API.

Pornhub also reviews videos when they’re flagged by law enforcement. Last year the company received 1,081 legal requests, with nearly half of those requests coming from government entities in the United States. Pornhub says most of these requests deal with non-consensual and child exploitation material.

Outside of harmful content, the second-most-common reason for a video’s removal last year was copyright infringement. The company receives so many DMCA takedown requests that it has a dedicated team just to handle them. All told, Pornhub removed 544,021 videos for DMCA violations last year. Another 106,841 videos were blocked from publishing in the first place, thanks to AI moderation.

Better late than never? — Pornhub found itself at the center of an intense media scandal last December, after a scathing New York Times opinion piece by Nicholas Kristof detailed the ways in which MindGeek, Pornhub’s parent company, profited from hosting illegal videos. The fallout of that scandal left the site hanging on by the thinnest of crypto-backed threads.

Pornhub’s future is still very much uncertain. Major payment processors like Visa and Mastercard still won’t touch the platform, and the company’s stringent moderation scheme is making it much more difficult for amateurs to upload porn to the site. The internet writ large is also still struggling to find a middle ground between sex and safety. It’s sex workers who end up with the short end of the stick every time.

For years, Pornhub exploited the legal protections around internet pornography. That unchecked power allowed the site to grow immensely — but it did so at the expense of user safety. Now the company has to deal with the fallout of that strategy. Transparency reports are a good starting point for moving the site forward, and Pornhub plans to publish them annually from now on.