Facebook's Thought Translator Could Invent a New Form of Language


Imagine you’re at Coachella 2037, and you’re separated from your friends. You reach out and think to them to ask where they are standing. A second later, you feel a buzz on the skin of your arm, telling you they’re near the bathrooms. That might sound crazy, but it’s a scenario Facebook’s completely out-there brain devices could someday make into a reality.

On Wednesday, Regina Dugan, the head of Facebook’s secretive Building 8 group, announced that her team is working to create a system to turn thoughts into messages that a recipient can hear through their skin. Dugan readily admits that the system she outlined at the F8 Facebook Developer Conference in San Jose is still, technologically, a long way off, but the concept could open the door to a new form of physical language.

In the experiment Dugan showed at the F8 conference, Francis, an electrical engineer acting as a test subject, feels words on her skin through an actuator with 16 different frequency bands. “Francis currently has a tactile vocabulary of about nine words; she learned them in less than an hour,” Dugan said. And while you probably learned more than nine words in a single period of high school Spanish, the fact that Francis is able to feel words based on electrical frequency is pretty cool.

Francis is wearing the actuator on her left arm, and in the experiment is shown getting messages about the different shapes in front of her.


Dugan explains that the electrode arm-sock effectively turns the skin into a cochlear implant, with a few small distinctions. A cochlear implant bypasses the ear, turning sound into electrical signals that the brain can interpret as sound; you might have seen the dramatic YouTube videos of deaf people “hearing” for the first time with one. Facebook’s device instead takes a text input and turns it into electrical signals on the skin, which the brain can learn to interpret as sound.

“If you ask Francis what she feels, she’ll tell you that she has learned to feel the acoustic shape of a word on her arm. The word black moves from low frequency regions to high frequency regions, the word blue is more localized. She processes these shapes in her brain as words,” Dugan said.
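Facebook hasn’t published how its system maps words to the actuator’s 16 bands, but the idea Dugan describes can be sketched in a few lines. The scheme below is purely illustrative: the band count comes from the demo, while the letter-to-band mapping and the function name `encode_word` are invented for this example.

```python
NUM_BANDS = 16  # the actuator in Dugan's demo had 16 frequency bands

def encode_word(word):
    """Map a word to a sequence of band indices played over time.

    Toy scheme (not Facebook's real encoding): each letter activates one
    band, so a word becomes a pattern that can sweep from low to high
    bands or stay localized, loosely echoing the 'acoustic shape' Dugan
    describes for words like 'black' and 'blue'.
    """
    return [(ord(ch) - ord("a")) % NUM_BANDS
            for ch in word.lower() if ch.isalpha()]

print(encode_word("black"))  # a pattern spread across several bands
print(encode_word("blue"))   # a smaller, more clustered pattern
```

In a real system the wearer would learn to recognize each pattern by feel, the same way a cochlear implant user learns to resolve electrical stimulation into speech.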

While using the skin of the arm to turn electricity into sound is new technology, using nerves elsewhere in the body to access different regions of the brain is pretty solid science. In 2008, researchers were able to use electric impulses on the tongue to compensate for damage to a woman’s inner ear that had destroyed her sense of balance. So although Francis only knows nine words through her skin now, the proof of concept is a strong indication that this is something that could actually work.

Going from thoughts to text that can zip across your skin is a bigger hurdle. Elon Musk is also working on getting thoughts to interface with artificial intelligence, but he plans to do it through a neural lace device implanted inside a person’s brain. Facebook, for now, wants to build that interface non-invasively (i.e., without sticking anything into your brain). The technology to translate thoughts doesn’t exist yet, but it’s an ambitious goal that just might be crazy enough to work.

There’s an intimacy in getting a message that no one around you can see or hear. Even without the technology to translate thoughts, receiving messages that never show up on a screen, and that don’t have to be physically heard, changes language. The full brain-device moonshot would create a way of communicating that leaves no trace of a message, something people have never had in long-distance communication. Even if you have to text the message that someone then hears, it’s an idea that’s tingle-worthy, and maybe not that far away.