Can a Robot Get Goosebumps? The Health Reason Science is Making It Happen
Want to know if your future household robot is upset at the mess you left in the kitchen or happy you bought it a new toilet scrubber? Instead of staring at a screen, researchers at Cornell University want you to touch the machine’s synthetic goosebumps so you can really feel how it’s “feeling.”
A team of roboticists decked out a prototype robot with artificial skin that changes shape according to the mood their adorable companion wants to convey.
Yuhan Hu, lead author of an academic paper on the research, tells Inverse that the faux dermis can be used to enhance the auditory and visual cues machines already give. Visually or hearing-impaired people could read the robot skin much like braille, and the added tactile channel may help users remember alerts better.
In the paper, first presented at the International Conference on Soft Robotics in Italy in April, the authors explain how they used spikes and goosebumps to represent happiness, sleepiness, anger, and sadness. If the textures enable a deeper level of communication between humans and machines, the team will have achieved its goal.
“We think the interesting thing of the skin expression is that it operates on two channels at once: They can be perceived visually and also haptically,” says Hu. “This offers new kinds of interactions between robot and humans, and may cause more psychological impact and perhaps generate subconscious or unconscious interactions.”
Hu and her colleagues took inspiration from the animal kingdom. A bird ruffling its feathers, a dog’s hair standing on end, and the spiking of a blowfish are all weird, wonderful ways wildlife tells others to kindly get out of the way. The team took this concept and set out to let robots alert us in the same manner.
The robot’s fake flesh is made out of elastomer — a very stretchy synthetic polymer — with tiny cavities that fill with air depending on what emotion the robot wants to express. These so-called “Texture Units” come in two varieties, goosebumps and spikes. The team matched various combinations of each to four distinct moods, and they’re working on adding more.
Much like hearing and seeing, touch can give people further context that helps them remember an alert or message. One study published in the journal PLoS One asserts that people recall things they see and touch better than things they simply hear, while further research published in PLoS One found that touch fills in the informational gaps left by sight and hearing.
“I’m most excited about the new interaction form between human and robot [by] combining both [tactile] and visual effects,” explains Hu. “I wish this can enhance the expressive spectrum of robots for social interaction.”
For now, the team is looking into how they can represent an even wider range of emotions with their Texture Units. They’re conducting further experiments to see which textures people associate with different moods to decide how to further the skin’s shapeshifting qualities.
While many robots are imagined to be cold as steel in the future, this research suggests that they might be a lot more touchy-feely.