As if Alexa’s general omnipresence weren’t concerning enough, the AI assistant is now apparently telling children to stick coins in open electrical sockets. That’s not an exaggeration.
When asked for a new challenge to partake in, a 10-year-old girl was reportedly told by Alexa that she should “plug a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.” The girl’s mother, Kristen Livdahl, tweeted about the incident Sunday evening.
Amazon has already responded to the incident with a relatively abstract placation. “Customer trust is at the center of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers,” the company told the BBC. “As soon as we became aware of this error, we took swift action to fix it.”
It’s unclear what, exactly, Amazon did to “fix” the issue. Whatever reprogramming the company carried out almost certainly won’t be enough to keep kids safe from Alexa’s ever-expanding algorithms.
Challenging suggestions — As is often the case with home AI, the public is given little-to-no insight into how, exactly, Alexa operates. The digital assistant’s suggestions are dictated by complicated algorithms that Amazon will take to its grave. That mystery means we have no real idea why Alexa served up this particular suggestion to a 10-year-old.
This is a problem most often highlighted by TikTok, which serves up suggestions in a similarly opaque way. For years now, TikTok has been plagued by viral videos that challenge kids to participate in dangerous activities. Just this month, for example, a number of schools across the U.S. canceled classes after widespread threats of violence found a foothold on TikTok.
Some early court cases have set precedent for companies to be held liable for real-world harms created by app users. For the most part, though, tech companies do not face legal action on the basis of user-created content, thanks to Section 230.
Only gonna get worse — Artificial intelligence makes it easier for tech companies to recommend content that keeps you engaged; it also complicates the moderation process to an exponential degree. Amazon couldn’t possibly have foreseen that Alexa would choose this particular search result when asked — and that’s the problem.
Amazon’s long-term goals will only further complicate this process. The company plans to continue developing Alexa to be more engaging and less transactional. That means fewer rote, easy-to-find answers and more “go stick a fork in a wall socket if you’re so bored.”
As Alexa’s algorithms continue to grow more complex, Amazon’s job of putting out fires like this one will become increasingly difficult. Quick fixes won’t be enough to put this to rest.