The ramifications of biometric data collection and facial recognition technology continue to evolve rapidly, raising a host of concerns regarding privacy, ownership, and discrimination. Earlier this week, one of the world’s biggest photo content licensing companies, Getty Images, announced a new “enhanced model release form” that attempts to deal with at least some of these issues — unfortunately, it’s less a major step forward in protecting people’s likenesses than a reminder that this troubling tech is here to stay, and that vague safeguards are the best we can get right now.
“I... agree that [a model’s images] may be altered and used to develop and improve machine learning algorithms, artificial intelligence, and other technologies,” reads the new contract, later adding that a model must accept that their biometric data could be used in these circumstances “except for the purpose of unique identification.”
So yeah, if your face ends up in Getty’s databases, there’s a decent chance it’ll be scraped to improve controversial surveillance tech, but at least they won’t be able to know your name, right?
Hoping to spur change — Speaking with VentureBeat, Getty Images’ director of advocacy and legal operations counsel, Paul Reinitz, explained that the company “hope[s] for it to be widely adopted and signed by models who feature in new commercial images and videos on the Getty Images and iStock websites.” Reinitz also made clear that Getty intends the waiver to become a standard format across content creation industries.
“We must recognize that the increased use of biometric data contained in imagery to train AI/ML applications requires the need to ensure that we have obtained the model’s permission to use their image and data in this manner and Getty Images is at the forefront of addressing these very real concerns,” he added.
Admitting defeat — Rather than make a wholesale promise not to sell its image libraries to biometric data companies, Getty is openly admitting that doing so is simply too lucrative an avenue to reject. Instead, it hopes to protect models’ personal identification information as best it can along the way. Basically, the company is doing the bare minimum while waiting for governments to write their own regulations for the emerging technology. It’s not our preferred outcome, but it’s certainly not surprising, either.