Roboticists are beginning to realize that the search for the one perfect robot design for human interaction has been for naught, since there are almost certainly going to be multiple perfect robot designs for human interaction. New research from Penn State University shows that people want robots to act differently in different contexts, and their preferences are not at all in line with expectations.

It turns out that robots with a cheery disposition make great helpers, as you’d expect — but that same cheeriness becomes creepy or just plain annoying in a personal companion robot. Nobody, it seems, wants to feel guilty for ordering a dour servant-bot to do some chores, but something more nuanced than pure happiness is needed for building an emotional connection to a pet, or even a friend.

The finding will be important to industrial designers all over the world as they try to figure out how to subtly influence people’s reactions to technology. One very common approach has been to exploit the human preference for faces, but this study suggests there are limits to how widely effective that strategy can be. At times people will want a cartoonish smiley face, but at others they might want a calmer presence, or even a totally inhuman machine with little personality at all.

Much research in robotics focuses on the ability of robots to read and respond appropriately to changing human behavior, but it could turn out that robots should actually be simpler, focused on a narrower set of social and emotional attributes in line with the particular functions they serve. In this study, the researchers changed the voice of the robot to give it different “personalities,” but the effect could be much more pronounced if expressed through a wholly distinct physical design.

These and similar findings are likely shaped by the fact that most of the people tested on these robot emotional scales are elderly, since they will be the primary early market for robotic assistance. People will probably want at least some of their basic mood and mentality reflected back at them when forming emotional connections, so it’s hardly surprising that we’d see a preference for emotional complexity among elderly people who are motivated to use a robot as a friend.

And so, as is so often the case with technology, the process of interacting with an aspect of humanity becomes the process of understanding that aspect of humanity — something we’ve been trying to do for some time. Under the guise of industrial design and consumer electronics, there is some truly intriguing work going into friendly, emotive robots. Soon enough, there may be research going into neutral robots as well.

Photos via Getty Images / Ethan Miller