
Augmented Reality Is Getting Much Better at Recognizing People's Bodies

The end of the green screen.

by James Dennin

Even the most sophisticated augmented reality applications still require a green screen. Take the viral Weather Channel video in which the anchor described Hurricane Florence’s storm surge as animated flood waters appeared to rise up around her. That segment was made with graphics rendered in real time and composited over the studio’s green-screen walls.

But the fact that you need a green screen at all points to one of augmented and mixed reality’s main limitations: computers still can’t really tell the difference between people, objects, and planes, Justin Fuisz, co-founder and CEO of the augmented reality video company Octi, tells Inverse.

“Historically, your Apples, your Snapchats — a dancing hotdog or a Pokémon — they use open spaces and they use fixed objects, like this table, things that don’t really move… to anchor the space,” Fuisz said. “What Octi has built up is a series of neural networks that work together to give progressively more information about what someone’s doing in a video. We track a skeleton, and reconstruct that skeleton into three-dimensional space.”

The ability to read bodies lets Octi build effects into its app that you can’t quite find elsewhere. You can project bodies onto a different background, for example, or make people look invisible. You can “make it rain” money because the app can spot your hands and tell when they’re moving.
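Fuisz doesn’t detail Octi’s models, but the skeleton tracking he describes maps closely onto off-the-shelf pose-estimation tools. As a rough illustration only (not Octi’s actual pipeline), here is a minimal sketch that uses Google’s open-source MediaPipe Pose library to pull a per-joint skeleton, with image coordinates plus a relative depth estimate, out of a single video frame; the file name “frame.jpg” is a placeholder.

# Illustrative sketch only -- not Octi's pipeline. Assumes the open-source
# mediapipe and opencv-python packages are installed; "frame.jpg" is a placeholder.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Read one video frame; MediaPipe expects RGB, while OpenCV loads BGR.
frame = cv2.imread("frame.jpg")
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

with mp_pose.Pose(static_image_mode=True) as pose:
    results = pose.process(rgb)

if results.pose_landmarks:
    # Each landmark is one joint of the detected "skeleton": normalized x/y
    # in the image plane plus a relative depth estimate (z).
    for joint, lm in zip(mp_pose.PoseLandmark, results.pose_landmarks.landmark):
        print(f"{joint.name}: x={lm.x:.2f} y={lm.y:.2f} z={lm.z:.2f} "
              f"visibility={lm.visibility:.2f}")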

How People-Spotting A.I. Works

But a consumer app is obviously not the end game here. Octi envisions its technology being used in applications ranging from sports and health tech to workplace safety.

Apple is also interested in how augmented reality can be applied to sports tech. At its most recent product launch, it demoed HomeCourt, an app tailored to basketball players that tracks which jump shots you make and which you miss. Octi would have us imagine apps that could track everything an athlete does on the court without needing a fixed focal point like the hoop. Such an app could perhaps anticipate when you’re setting yourself up for an injury and suggest corrections, or make the stat aggregation that powers things like fantasy football more immersive and closer to real time.

Apple demoed HomeCourt, an app that uses AR to track shot performance in basketball, at its most recent keynote. (Image: Apple)

There are also a number of potential industrial applications, particularly at the intersection of health tech and worker safety.

“You imagine people in a factory lifting things or getting hurt, and you could apply this technology to existing cameras to identify if someone has been injured, or if they’re lifting correctly,” Fuisz said. “It’s an efficient solution without being intrusive on people.”
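Fuisz doesn’t spell out how such a lifting-safety check would work, but one plausible approach, sketched here purely for illustration with made-up joint names and an arbitrary threshold, is to estimate the torso’s forward lean from two skeleton keypoints and flag a stooped lift.

# Hypothetical heuristic sketch, not Octi's method: flag a "stooped" lift
# when the torso leans too far forward relative to vertical.
# Keypoints are (x, y) image coordinates with y increasing downward;
# the 60-degree threshold is an arbitrary, illustrative value.
import math

def torso_lean_degrees(shoulder, hip):
    """Angle between the hip->shoulder segment and the vertical axis."""
    dx = shoulder[0] - hip[0]
    dy = hip[1] - shoulder[1]  # positive when the shoulder sits above the hip
    return math.degrees(math.atan2(abs(dx), dy))

def looks_like_stooped_lift(shoulder, hip, threshold_deg=60.0):
    return torso_lean_degrees(shoulder, hip) > threshold_deg

# Example: shoulder almost level with the hip -> heavily bent torso -> True.
print(looks_like_stooped_lift(shoulder=(0.9, 0.52), hip=(0.5, 0.55)))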

Augmented reality hasn’t always kept up with outsize expectations. But that, again, might have more to do with the cost and bulk of the hardware required than with what the hardware can actually do. People wearing Google Glass were quickly dubbed ‘glassholes,’ and immersive VR requires expensive headsets. The Magic Leap One, geared toward creators, starts at $2,295.

Those are pretty big barriers to entry, particularly relative to a smartphone. But getting more people to not just witness AR in a fancy demo but to actually apply and experiment with these techniques might be what the sector needs for the hype to become something closer to reality.
