
How the A.R. Dog Throat-Ripped the Robot Butler

The future of the household interface isn't a bot, it's a hologram that's happy to see you.


The first earnest attempts to create the sort of robotic butlers that emerged as a pop cultural meme in the 1950s were made toward the end of the 20th century and failed almost immediately. The robotics technology of the time fell well short of consumers' very lofty expectations, expectations heightened by the Jetsons' Rosie and Ray Bradbury's "Electric Grandmother." Instead of anthropomorphic robot servants, people got autonomous hockey-puck-looking vacuums. It was a bit disappointing. What has become clear over the last decade is that even remotely naturalistic general-purpose robots will remain technologically out of reach until well after augmented reality, artificial intelligence, and voice recognition make naturalistic interfaces possible, and possibly very common.

Cue the Hololens puppy.

To understand why electronic barking is inevitable, it's important to understand the history. The stalled development of anthropomorphic robots led to a schism in research and development. Task-oriented robots — most of them highly specialized — were created on an industrial scale by private companies. Emotive robots were created largely in an academic context as "Affective Computing" became a discipline. The result, so far, is a vast assortment of experimental, artificially intelligent machines built to give and take emotional cues. But these robots don't do physical labor. Their work is the simulation of intelligence. That is where their value lies.

Presumably because pets represent the kind of non-human intelligence consumers are most comfortable with, the Affective Computing movement manifested itself in people's homes in the form of robot pets capable of emoting with sounds and lights. These dogbots proved easier to interact with than previous personal robots, but hardware bottlenecks remained. Companies struggled to create affordable, realistic-looking robot animals with much, if any, mobility because animalistic locomotion was too complicated and expensive. Products like Zoomer and CHiP kept their wheels and toy-like designs, making them less appealing to adults. Whether that lack of appeal is the reason those products never found practical uses (Amazon's Alexa, for instance, could easily be embedded in a canid body) is impossible to know. But the lack of agile robotic technology — think of all those Boston Dynamics robots tipping over — made it impossible for robot pets, with their successful if unsophisticated user interfaces, to do useful tasks.

The appeal of robotic pets is that they exist in the real world, but their inability to interact fluidly with that world means they are, in essence, little different from the virtual pets that exist in the shadowbox habitats behind an iPhone screen. The dog in the Dog Sweete app differs only slightly from the Genibo Robot Dog. The absence of any common display device devoted to hosting virtual pets also suggests that there aren't currently virtual characters engaging enough that people want them in their homes.

Augmented reality headsets like Microsoft's Hololens and setups like Magic Leap will provide a means of bringing virtual pets into real space, albeit in a somewhat ghostly manner. The question is whether people will do that, and on what scale. If past is prologue, we can reasonably expect that many people will embrace and talk to projected German Shepherds.

A.R. pets and companion programs will be able to communicate and interact with us in a variety of ways, serving as both digital assistants and entertainers. What makes the advent of this technology interesting is that it is arriving so far in advance of naturalistic robotics, and will offer such a naturalistic interface, that it makes perfect sense these A.R. daemons would come to serve as a shared UX platform for personal computing applications, including communications and home control systems. They will exist visually, but their chief medium of practical interaction will likely be sound and voice because we already have that technology: Google's demonstration of a powerful new platform for highly realistic voice synthesis opens the door to new conversational computing capabilities. Verbal interfaces are doubly significant because they provide a means of detecting emotion. With the benefit of a procedural kinematics system, the crafting of complex and elaborate reactions to human behaviors could become a hobby or an industry in and of itself.
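To make the "shared UX platform" idea concrete, here is a minimal Python sketch of how such a companion might work, assuming a purely hypothetical setup in which applications register with the A.R. pet, the pet relays their updates by voice, and a detected emotional tone is mapped to a procedural animation cue. The class names, emotion labels, and reaction mappings are illustrative assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of an A.R. companion acting as a shared UX platform.
# Names and interfaces here are illustrative assumptions, not a real SDK.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Companion:
    """A projected pet that relays app notifications via voice and gesture."""
    name: str
    # Maps a detected user emotion to a procedural animation cue.
    reactions: Dict[str, str] = field(default_factory=lambda: {
        "happy": "wag_tail",
        "stressed": "sit_quietly",
        "neutral": "tilt_head",
    })
    _apps: List[Callable[[], str]] = field(default_factory=list)

    def register_app(self, fetch_update: Callable[[], str]) -> None:
        """Any application (news, messages, home control) can plug in."""
        self._apps.append(fetch_update)

    def speak(self, text: str) -> None:
        # Stand-in for a speech-synthesis call (e.g. a WaveNet-style voice).
        print(f"{self.name} says: {text}")

    def animate(self, cue: str) -> None:
        # Stand-in for a procedural kinematics system driving the hologram.
        print(f"[{self.name} performs: {cue}]")

    def deliver_updates(self, detected_emotion: str = "neutral") -> None:
        """Relay each registered app's update, reacting to the user's mood."""
        self.animate(self.reactions.get(detected_emotion, "tilt_head"))
        for fetch_update in self._apps:
            self.speak(fetch_update())


if __name__ == "__main__":
    rex = Companion(name="Rex")
    rex.register_app(lambda: "The front door is locked.")
    rex.register_app(lambda: "You have two new messages from Sam.")
    rex.deliver_updates(detected_emotion="happy")
```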

Custom pets or companions will cost extra and may be subject to market trends. One year, we might be hanging out with dragons. The next year it might be mermaids, centaurs, or goldfish. There's an old Yiddish saying that suggests God created man because he liked to listen to stories. There's reason to believe that mankind will have a similar motivation for propagating artificial intelligence. The success or failure of A.R. companions will therefore have a lot to do with whether or not we enjoy getting information — reported news, personal communications — from them.

And this is why the A.R. canine feels like such a logical UX. Dog owners often invent attitudes or stories for their animals, talking dogs are a cultural meme, and a good percentage of people are familiar enough with dogs to interpret their body language. A dog, in short, can be reassuring in a way that a fish cannot — and also in a way that a human cannot. That will remain true until artificial intelligence starts to feel as smart as human intelligence, which could be a game changer.

As digital companions advance in emotional intelligence and in their ability to interact with the spectrum of digital information and applications in our personal domains, they will gain an increasingly deep awareness of our practical and emotional needs, intentions, and situations. They will begin to present us with surprising, sometimes alarming, insights about ourselves and may even assume a psychotherapeutic role. It's unclear at this point whether consumers would rather be analyzed by a dog or by a person. That might end up being more of a case-by-case choice.

Ultimately, humans might win out, because projected dogs, or cats, or horses for that matter, would prove frustrating almost by definition. We keep animals as pets — other social animals in particular — largely because they offer us a safe means of satisfying our deep need to physically express affection. That won't work in augmented reality. Given that, there's a chance that we do eventually end up with the humanoid butlers we always wanted, but they won't be robots. The timing of augmented reality and all those helpful, insubstantial terriers will see to that.
