Augmented reality—superimposing virtual objects on a smartphone screen or a headset’s view of the world—originated long before Pokémon Go and could take off in the very near future. Research firm IDC just released a report predicting that the AR/VR market will at least double every year through 2021. Qualcomm, the largest maker of processors for Android smartphones, is betting big on AR as well as VR. Today, it’s introducing new camera technology called active depth sensing to enable AR/VR and other features in smartphones and many other devices.
Qualcomm’s active depth sensing projects infrared light to map distance and 3D contours of an object or location, with precision to a fraction of a millimeter, the company claims. That allows an AR app to convincingly position images of imaginary objects in a view of the real world. “To truly fool your senses, it has to be perfect right on,” says Keith Kressin, Qualcomm’s senior VP of product management. The tech could be equally valuable for VR headsets, warning people in fake worlds when they are about to hit something real.
Another application: Letting you unlock your smartphone just by looking at the camera, eliminating the need for a fingerprint sensor. Such “face unlock” tech is already in some smartphones with garden-variety selfie cameras, but isn’t as secure as other methods. With precise depth mapping of a face’s contours and the ability to spot tiny, fleeting muscle movements, however, the camera won’t be fooled by a photo held up to it or (if anyone goes to the trouble) a mask or 3D print of a face, says Qualcomm. (There are some hints that Apple will bring similar tech to the expected iPhone 8.) Beyond consumer gadgets, the tech will also find its way into self-driving cars and other robots, Qualcomm hopes.
With dominance in Android smartphones, deals with big VR headset makers like HTC and Lenovo, and patents going back a decade—it once owned AR pioneer Vuforia, and retains some of its intellectual property—Qualcomm has a good shot at leadership in AR and VR. Ironically, though, the company might get a major boost in these markets from Apple, which powers its mobile devices with its own A-series processors rather than Qualcomm chips and which is currently engaged in a legal tussle with the company over patents and licensing. When the next iPhones debut, probably in September, they’ll run iOS 11, with advanced support for augmented reality apps. Long before that, at its Worldwide Developers Conference in June, Apple introduced ARKit, software that lets developers start building AR apps in time for iOS 11.
“Tim Cook has said, and it’s true, when iOS 11 ships, it’ll instantly become the largest AR platform out there,” says Tom Mainelli, an analyst at IDC who covers mobile devices, AR, and VR. The predicted AR explosion in Apple products could spark enthusiasm among Android buyers, too, says Mainelli.
While Apple may have the lead in publicity and software, Qualcomm might have an edge in hardware, especially with the new camera system it’s introducing. There are several ways to map a 3D space, starting with one or two basic cameras and algorithms to analyze clues in the appearance of objects—basically what we do with our eyes and brain. Qualcomm supports that method. But its new technology goes beyond what humans can do by projecting infrared light, allowing it to calculate contours in its field of view to within less than a millimeter of accuracy, according to the chipmaker. (Since infrared light is invisible, humans don’t see a thing, and the tech works in pitch-black settings.)
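The passive, two-camera approach reduces to simple triangulation: an object shifts between the two views by an amount (the disparity) that shrinks with distance. Here's a toy sketch of that relationship; the focal length and baseline values are illustrative, not specifications of any Qualcomm system.

```python
# Toy illustration of passive depth sensing: with two cameras a known
# distance apart (the baseline), depth follows from triangulation.
# All numeric values here are illustrative assumptions.

def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth in meters from the pixel disparity between two camera views.

    Closer objects shift more between the two views (larger disparity),
    so depth is inversely proportional to disparity.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# An object producing a 70-pixel disparity, with a 700-pixel focal
# length and a 10 cm baseline, sits about 1 meter away.
print(round(stereo_depth(700.0, 0.10, 70.0), 6))  # -> 1.0
```

The weakness of this passive approach is visible in the formula: disparity is measured in whole or fractional pixels from matching image features, so accuracy degrades in low light or on textureless surfaces, which is exactly where projected infrared light helps.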
The resulting point cloud, as it’s called, isn’t a new invention. Standard cameras can produce point clouds (I recently wrote about such a system for self-driving cars), as can infrared cameras like the one used in the Xbox’s Kinect. The infrared tech in devices like the Kinect uses time of flight, calculating how long it takes an infrared beam sent from an emitter to bounce off an object and return to a camera next to the emitter. It’s essentially radar with light. (Sources tell Fast Company that Apple may have chosen time-of-flight tech for future iPhone depth sensors.)
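The time-of-flight arithmetic is straightforward: distance is half the round trip of a light pulse at the speed of light. A quick sketch (with an illustrative pulse timing, not a real sensor reading) shows why the timing has to be so precise.

```python
# Toy time-of-flight calculation: distance is half the round-trip
# travel of a light pulse. The example timing value is illustrative.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance in meters from the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2

# A pulse that returns after roughly 6.67 nanoseconds corresponds to
# about 1 meter of distance; resolving millimeter differences would
# require timing on the order of picoseconds.
print(tof_distance(6.671e-9))  # -> roughly 1.0 (meters)
```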
Qualcomm’s bragging rights are in the level of accuracy. Instead of time of flight, it uses a method called structured light—also not a new invention, but one the chipmaker claims to have refined for high accuracy, low price, and low power consumption.
The infrared light passes through a filter, creating a detailed pattern projected in front of the camera. As it hits objects (or people) with different shapes and contours, at different distances, the pattern gets distorted. “The infrared camera is looking at the distortion and warping of these dots,” says Philip Jacobowitz, Qualcomm’s senior product marketing manager for camera and computer vision. “Let’s say an object is closer to you. These dots are going to show up slightly larger.” An infrared camera photographs the pattern, and, 30 times per second, Qualcomm’s processor analyzes the distortions to generate a three-dimensional cloud with over 10,000 points, accurate to within a tenth of a millimeter, claims the company.
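Jacobowitz’s dot-size intuition can be sketched numerically: under a simple pinhole model, a projected dot’s apparent size scales inversely with distance, so comparing the observed size against a calibrated reference implies depth. The calibration values below are illustrative assumptions, not Qualcomm’s actual pattern or numbers.

```python
# Toy version of the structured-light intuition described above: a
# projected dot appears larger on closer surfaces, so its observed size
# relative to a calibrated reference implies depth. The reference
# calibration values are illustrative assumptions.

REF_DEPTH_M = 1.0      # calibration distance for the reference pattern
REF_DOT_SIZE_PX = 4.0  # dot size observed at the calibration distance

def depth_from_dot_size(observed_size_px: float) -> float:
    """Depth in meters, assuming dot size is inversely proportional to depth."""
    if observed_size_px <= 0:
        raise ValueError("dot size must be positive")
    return REF_DEPTH_M * REF_DOT_SIZE_PX / observed_size_px

# A dot that shows up at 8 pixels (twice the reference size) sits at
# half the calibration distance.
print(depth_from_dot_size(8.0))  # -> 0.5
```

A real system triangulates the displacement and warping of thousands of such dots at once, which is where the claimed sub-millimeter precision comes from.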
In samples from Qualcomm, the point clouds look as detailed as a grainy photo or video. They resemble point clouds from very high-end lidar systems, the hefty laser scanners used in self-driving cars. Even the smallest lidars are about the size of a toaster, and they cost thousands of dollars. Qualcomm’s new Spectra system won’t be a cheap replacement for lidar, though, as the latter can map objects 200 meters away or more. Qualcomm’s system is accurate within a range of roughly three meters (about 10 feet), which might still be helpful for backup sensors or for judging distances between future autonomous cars and trucks driving in tight formation. It could be even handier in consumer and industrial robots that move in closer quarters.
The Waiting Game
Don’t expect to see the new Spectra camera system tomorrow, or even this year. Qualcomm typically announces components before they go into mass production, and it takes even longer before they make their way into final products like smartphones or VR headsets. In this case, the delay will be even longer. The depth sensing tech requires not only Qualcomm’s camera system but the latest version of its image signal processor (ISP), called Spectra, in a new chip that probably won’t even be announced until the end of the year. “What’s happened that we can do this now?” asks Jacobowitz. “Much of it has to do with this new image signal processor…that can handle all this depth information.” And since a Qualcomm processor is required, there’s virtually no chance the IR projector-camera combo will make its way into future iPhones.
Qualcomm’s top chip right now, the Snapdragon 835, powers high-end smartphones like Samsung’s Galaxy S8. It will also be in VR headsets from HTC and Lenovo using the Google Daydream platform as well as an upcoming standalone version of the HTC Vive that doesn’t need to tether to a PC. (Qualcomm will not comment on rumors that Facebook has selected it to power an untethered version of the Oculus Rift headset coming in 2018.)
Qualcomm’s new ISP won’t be available until it’s built into the next-generation Snapdragon, which is yet to be announced and may not make its way into phones until the first half of 2018 at the earliest. While the system works best with infrared-projecting active depth sensing, the new Spectra image processing will also improve the accuracy of “passive” systems that use regular cameras, says Qualcomm.
No matter how good its camera tech, Qualcomm will have to compete with several heavyweights on various levels. If apps built using ARKit truly capture the public imagination, even on phones with less advanced imaging hardware, Apple’s platform could become the de facto standard for AR. In headsets, Qualcomm is also going up against Intel, although Qualcomm has a good head start, says Mainelli. The only thing that is certain about augmented reality is that it’s destined to get very big, very fast.