Why Your Brain Thinks Virtual Reality Is Nauseating

Some scientists think they've found a fix.

Flickr user Sergey Galyonkin

The anticipation around Oculus Rift, Microsoft’s HoloLens, and other virtual reality devices is reaching a fever pitch. But companies still have to contend with a reaction called virtual reality sickness, cousin to motion sickness, which can turn a user’s experience sour in an instant. The CEO of Oculus VR, Brendan Iribe, has even warned rival companies about “putting out product that isn’t quite ready.” That’s code for: We’d all appreciate you not making people associate VR with projectile vomiting.

They might be fightin’ words, but Iribe raises a good point. Cybersickness might keep a consumer from playing a VR game, and that’s not exactly a big loss. But the risk of making someone barf their guts out could keep people from harnessing the potential of VR headsets for more important applications, like treating PTSD, training soldiers, or providing medical therapy to stroke victims or amputees.

You’d recognize the effects of cybersickness if you’ve ever been tossed around on a small boat: headache, nausea, disorientation, and pretty much every other bad feeling you can recall from the teacups at Disney World. Whereas motion sickness is caused by a mismatch between the body’s actual motion and its perceived motion, cybersickness is induced almost entirely by the disparity between what your eyes see and what your head feels. Virtual reality might trick your sight, but if it’s off even slightly, your inner ear knows better.

The reasons behind this are still unclear, but the most prevalent hypothesis is “sensory conflict theory.” A VR device might replicate some sort of experience involving a strong degree of self-motion — say, a rollercoaster ride. Through your eyes, you’re meant to feel like you’re moving up and down, side-to-side through a three-dimensional space, at high speeds. In reality, you’re sitting completely still. And because you know what it’s like to actually be on a rollercoaster, your body has trouble reconciling what it sees and what it expects based on prior experience. This manifests as a physically disorienting experience, and before you know it, you’re scrambling to find a barf bag.

It’s a problem that Oculus VR and other companies think they’ve been able to minimize by having users sit rather than stand, or by getting users to ease into the experience slowly, acclimating first to less intense games or experiences.

Researchers at Stanford are trying to come up with a better solution. Gordon Wetzstein and his colleagues at the Computational Imaging Lab are developing headsets built around what they call a light field stereoscope: stacked LCD panels that produce a light field with focal cues, making virtual images look more natural. Scenes appear more genuinely three-dimensional, a realism that helps minimize disorientation.

It’s an inexpensive trick that could greatly improve VR experiences. The first consumer VR headsets, led by the Oculus Rift, hit the market early next year; if companies adopt the Stanford team’s technology, it won’t be until the next wave of models. Hold onto your guts.