Using AI to predict the year's top-selling toys has disturbing results

An algorithm was fed the top-selling toys and gadgets of the last 50 years and asked to come up with product names for 2020's big hits.

"The Oasis Project" is an AI-generated game concept.

What happens if you ask an algorithm to generate ideas for new toys? You learn pretty fast that algorithms aren't great at making kid-friendly toys.

Using OpenAI's controversial GPT-3 language model, ad agency Adzooma fed in the names of the top-selling gadgets and toys of the last 50 years and asked the model to come up with names for new products. The agency then tried to design packaging for the products. The results are frankly pretty weird, like a game called "Tear it Out!" that's a new spin on Kerplunk where you, well, tear the hair out of a plastic head.


Another product, called "Hair Babies," seems to literally be a... hairy doll. The packaging features the doll disturbingly exclaiming, "Brush my face!" Which even for the bearded among us is an unsettling proposition. It's not helped by another line of text on the proposed packaging that reads, "Real tears... every time!"


It's all pattern matching — While GPT-3 can construct sentences that make basic sense, the examples above demonstrate that it has no common sense. A normal person understands that we shouldn't encourage children to commit violence against others; a computer algorithm doesn't grasp that pulling out someone's hair is probably not a skill or leisure activity we want children to partake in.

That's because the algorithm is just learning which words or phrases often appear together in product names and creating new names based on those patterns and frequencies. As is often the problem with automated systems, nuance is difficult to grasp, and there's no understanding of why certain ideas might be controversial or why some topics may be sensitive. Lest we forget Tay, Microsoft's chatbot that turned into a racist, sexist anti-Semite mere hours after launch.
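To see why pattern matching alone produces plausible-looking but meaning-free names, here's a minimal sketch of the idea, using a simple character-level Markov chain rather than a transformer like GPT-3 (which is vastly more sophisticated, but built on the same statistical principle: predict what comes next based on what tends to follow in the training data). The toy-name corpus below is a made-up stand-in, not Adzooma's actual data:

```python
import random
from collections import defaultdict

# Hypothetical stand-in for "50 years of top-selling toy names".
TOY_NAMES = [
    "Kerplunk", "Furby", "Tamagotchi", "Tickle Me Elmo",
    "Hungry Hungry Hippos", "Operation", "Mousetrap",
]

def build_model(names, order=2):
    """Map each `order`-character context to the characters seen after it."""
    model = defaultdict(list)
    for name in names:
        # '^' pads the start, '$' marks the end of a name.
        padded = "^" * order + name.lower() + "$"
        for i in range(len(padded) - order):
            model[padded[i:i + order]].append(padded[i + order])
    return model

def generate(model, order=2, max_len=20, rng=None):
    """Sample a new name one character at a time from the learned patterns."""
    rng = rng or random.Random()
    context, out = "^" * order, []
    while len(out) < max_len:
        nxt = rng.choice(model[context])
        if nxt == "$":  # the model "decided" the name is finished
            break
        out.append(nxt)
        context = context[1:] + nxt
    return "".join(out).title()

model = build_model(TOY_NAMES)
print(generate(model, rng=random.Random(0)))
```

The output is statistically shaped like a toy name, but the model never knows what any name means — which is exactly why a system like this can cheerfully propose a game about tearing hair out of a plastic head.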


Fine-tuning — AI is good at inferring from large data sets, but it's awful at subtlety. That's not to say it won't get better, because it will. We're still in the early days of algorithms and machine learning. Someday soon algorithms may be good enough to understand sensitivities, and then toy companies will be able to pump out new products with ease. Until then, we get to laugh at gems like "Hey Snowflake," which looks uncannily like something the Cards Against Humanity team might release.