People sext with their smartphones. It’s just what they do. Ideally, these photos stay between all intended parties, but as we’ve seen, even celebrities are not immune from having their nude photos passed around to unwanted eyes.
For most problems, there’s a technology that aims to provide a quick, simple solution. App developer Jessica Chiu heard directly from actresses who wanted to know that their sensitive photos would be safe while she was working on a film project, given there’s no shortage of mouth-breathers who’d seek to obtain their photos and distribute them for edgelord cred. And so, Chiu was inspired to create an app that securely stashes private photos, and recruited Y.C. Chen and Edgar Khanzadian. Together, they created Nude, which launched in late October to much fanfare but has since been pulled from the App Store for unclear reasons.
There are plenty of safe-lock apps on the market, like Safe Lock and Private Photo Vault. But Nude is different for a few distinct reasons: its name alone proudly declares its intended use, while tamer apps couch their language in emphases on “privacy,” “security,” or “messaging.” Nude is absolutely for nude pics.
The app also uses iOS 11’s new CoreML technology, which lets developers use machine learning to identify image types and file them accordingly. With CoreML, images can be identified on the device itself, without ever being sent to an outside server (the cloud). This technology allows the app to scan your phone for, well, nude photos for in-app safekeeping. To teach the model how to recognize nude photos, the team fed it a database of over 30 million images, including many from porn sites. It’s the same kind of technology that led supermodel Chrissy Teigen to discover that “brassiere” as a search term brings up, well, photos of women wearing bras.
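To give a rough sense of how an app like this might act on a classifier’s output: an on-device CoreML model typically returns label/confidence pairs, and the app then decides which photos to move. The sketch below is purely illustrative — the labels, threshold, and function names are our assumptions, not Nude’s actual code.

```swift
// Hypothetical filing step for a photo-vault app. An on-device
// classifier (e.g., via Core ML) yields label/confidence pairs;
// the app flags photos that clear a confidence threshold.
// All names and values here are illustrative assumptions.

struct Classification {
    let label: String
    let confidence: Double  // 0.0 ... 1.0
}

// Labels the hypothetical model treats as sensitive.
let sensitiveLabels: Set<String> = ["nude", "explicit"]

func isSensitive(_ results: [Classification], threshold: Double = 0.8) -> Bool {
    // Flag the photo if any sensitive label clears the threshold.
    results.contains { sensitiveLabels.contains($0.label) && $0.confidence >= threshold }
}

let sample = [Classification(label: "nude", confidence: 0.93),
              Classification(label: "beach", confidence: 0.41)]
print(isSensitive(sample))  // true: this photo would be moved to the vault
```

A photo flagged this way would then be transferred into the app’s locked folder, as described next.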
Once the Nude app locates the sensitive images, it transfers them to a secured in-app folder and deletes them from the camera roll and, crucially, from iCloud. Photos are safe within Nude, and away from prying eyes. An app doing the Lord’s work, if we ever found one (other than Seamless).
So why did Apple remove Nude from the App Store? Chen, one of the co-founders, tells Inverse that Apple had given them mixed messages regarding their terms of service.
“Nude was taken down from the App Store for the very bizarre reason that ‘the App Store Review Guidelines are a living document, which can result in new rules at any time,’” he says. “It was peculiar because according to the messages we received from Apple, ‘Locking Photos’ is no longer a compliant use case for an app. It is extremely puzzling - apps like Snapchat or KeepSafe (which have millions of users) are all having similar feature/use case.”
Chen then said that Apple followed up with an altogether different request: “What gets even more interesting, [Apple reps] have since called me and we are now told that we will be ok if we were to take out the ‘Sexiest App Ever’ branding. Going forward, our app will simply appear as ‘Nude App’ on the app store.”
Inverse reached out to Apple for comment, and will update this story when we receive a response.
A Nude representative adds, “we are currently negotiating with Apple to get it back on [the iTunes store]” and that “we are expecting this to take no more than a week.”
This is because Apple’s App Store has long been a G-rated world: developers are not allowed to list apps that contain any adult content. It’s why female nipples are banned on Instagram — the app would be removed from the App Store if it allowed bare breasts to be shown. Given that Apple has apparently decided that female nipples are too hot for iOS, it’s not surprising that it pulled the Nude app over a branding violation.
Chen did say that if Nude were re-listed on the App Store, it would simply be called “Nude App.” Not quite as salacious as the original tagline, but much more amenable to the notoriously prim Apple.