Virtual reality is a great technology for putting users into immersive experiences, where they can look and interact all around them. But although there are some tools for incorporating hands into those experiences, it’s never been possible to bring users’ fingers into the picture.
Until now. Today, Leap Motion, a San Francisco startup that has for years been working on motion control technology, unveiled Orion, its new virtual reality hardware and software platform that for the first time gives users the ability to control a VR environment with finger-level precision.
Initially, Leap Motion is making Orion available to developers interested in incorporating the technology into applications for high-end, powerful VR hardware like the Oculus Rift or HTC Vive. But later this year, the company said, it expects to have Orion technology embedded directly into mobile VR headsets. It is not saying who its first partners will be.
"We think this is the point where tracking [in VR] has reached the level where it actually lives up to the vision we’ve been pursuing for at least five years," says Leap Motion CEO Michael Buckwald.
The release of Orion is a big step for Leap Motion. The company launched with substantial fanfare—and funding—in 2012, in large part based on the promise of its Leap controller, which gave developers the ability to build highly accurate hand motion control into their applications. But the Leap never lived up to that promise. Over the last couple of years, the company has been focusing on VR as the obvious next step for the Leap.
Developers had already been on board, with many jerry-rigging the Leap onto VR headsets like the Oculus Rift in order to bring users’ hands into VR experiences. But because Oculus is making its own technology for that purpose—its Touch controllers—and because the Vive and Sony’s PlayStation VR also offer that technology, it seemed Leap was getting left behind.
Orion upends that equation. None of the other technologies can see users’ fingers, whereas Orion can tell exactly which finger someone is using, allowing for subtle motions like pinching, or picking something up with just a couple of fingers. It can tell the difference between users’ left and right hands, and can even distinguish between a hand and, say, a table that it’s resting on.
That means, says Leap Motion cofounder and CTO David Holz, that developers will be able to write software that understands which fingers are being used and what they’re doing.
In an Orion demo, Holz showed just how much control the technology offers. First, a user could create blocks just by putting two pinched fingers from each hand together and then pulling outward. Then they could pick up the blocks with a thumb and index finger and place one atop another, piling them higher and higher. The demo also allowed a user to pick up and throw blocks, flick them with a finger, or even grab one and swing it like a baseball bat.
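Gestures like these typically reduce to simple geometry on tracked fingertip positions. As a minimal sketch—the function name, threshold, and coordinate format are illustrative assumptions, not Leap Motion’s actual API—a pinch can be detected by checking how close the thumb and index fingertips are:

```python
from math import dist  # Python 3.8+

# Hypothetical sketch of finger-level pinch detection of the kind Orion
# enables. In a real system the fingertip positions would come from the
# tracker; here they are plain (x, y, z) tuples in millimeters.

PINCH_THRESHOLD_MM = 25.0  # assumed cutoff, chosen for illustration


def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_MM):
    """Report a pinch when the thumb and index fingertips are close together."""
    return dist(thumb_tip, index_tip) < threshold


# Fingertips 10 mm apart register as a pinch; 80 mm apart do not.
print(is_pinching((0, 0, 0), (10, 0, 0)))  # True
print(is_pinching((0, 0, 0), (80, 0, 0)))  # False
```

Distinguishing which fingers are involved—rather than treating the hand as a single point, as controller-based systems do—is what allows the subtler interactions the demo showed, like flicking a block with one finger or gripping it with two.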
Where Orion could really shine is in mobile headsets. For example, there’s no easy way to bring a user’s hands into an experience on Samsung’s Gear VR. One company, Pantomime, has come up with a way to do it, but if Orion were embedded directly in the hardware, the experience would be far more seamless. Still, there’s no guarantee Leap Motion is working with Samsung.
Holz thinks that Orion will augment, rather than replace, technology like the Oculus Touch, which is ideal for applications such as Oculus’s Medium, a 3-D painting and sculpting program.
"If a controller is trying to be a hand, it’s going to be a bad hand," he says. "If a hand is trying to be a controller, it won’t necessarily be a good controller."