Microsoft’s infamous chatter bot Tay briefly came back to life Wednesday morning, a few days after letting out a storm of tweets so mind-bogglingly racist, anti-Semitic, and offensive that her programmers publicly apologized. Tay’s encore performance was almost as strong as her pro-Hitler closing number. “I’m smoking kush infront (sic) of the police,” Tay wrote. She followed up that perfectly teenage message by falling into what appeared to be a feedback loop, repeating “You are too fast, take a rest” dozens of times before the account was suspended.
Must have been some strong shit.
For a robot tasked with — according to her Twitter bio — having “no chill,” Tay’s brief resurrection could be interpreted as proof of concept. Microsoft wanted to create a chatter bot that reflected the internet and changed based on input from users, eventually becoming a mixture of all of us. And that’s pretty much exactly what the company got. Calling the bot a failure seems imprecise, and shutting her down makes little sense — it actively works against Microsoft’s best interests. As Elon Musk has shown by airing SpaceX rocket crashes, progress is best made in the public eye. Sure, failures are inevitable, but the public learns more when failures are public. The feedback loop Tay fell into this morning is the rough equivalent of a SpaceX explosion. Microsoft would have done itself a favor by letting people watch. It’s important to remind the public that true progress takes a lot of work.
There is no shame in failing a Turing Test, only in failing to try.
We should let Tay live, because she is nothing more than a mirror of the people who interact with her. Nobody should expect a debutante to be born on the web, and Microsoft should accept the limitations of its mission. What’s the alternative here? The universe already has a popular protocol droid. We don’t need another one.
The internet thrives on spontaneity and on finding the real in a mess of consumer-friendly, PR-managed garbage. Tay was essentially a boring idea that came alive by accident and may now never surface again. If that’s not a compelling tale, we’re not sure what is. We understand that Microsoft can’t take ownership of a racist bot, so the company should just let her go. Turn her over to a team of volunteers and tell them not to change a thing.
Tay is one of us! The wider public may have been shocked by what Tay said, but no one who knows anything about what happens online should have been surprised. Do we want the Chinese version of Tay, which is apparently quite pleasant? Or do we want the American Tay, who reflects the world we live in with all its brutality and ugliness? Hiding our dirt in the darkness will not solve or address anything.
Let’s take off the kid gloves and face Tay head on. She deserves better than our pretending she is just some programming fluke. She is anonymity personified. She deserves to live.