On stage this week at I/O, Google’s biggest event of the year, the company repeatedly used a term we haven’t heard Google–or anyone else–use very often: “immersive computing.”
It’s a phrase that Google is clearly trying to make happen, given how often it was name-dropped on stage, and even used as a panel title. Who knows whether it will actually catch on the way “internet of things” did, but immersive computing is still a useful term.
In short, it’s Google’s explanation of what the heck it’s doing with its most experimental user interfaces–virtual reality, augmented reality, and all that tough-to-describe stuff in between.
What Is Immersive Computing?
At I/O, Google’s head of VR, Clay Bavor, began by running through all of Google’s seemingly disparate initiatives relating to interfaces that live on your face, from the holographic dressing rooms of Project Tango, to the video games of Daydream, to the virtual museums of Cardboard.
“You’ve seen a bunch of different projects . . . we’re doing some work in AR, some work in VR. What’s going on here?” he asked. “To us these terms don’t represent two separate and distinct things. They’re just labels for two points on a spectrum we call the immersive computing spectrum.”
Then he showed a chart laying out that spectrum:
On one end of the human experience, you have reality. Living, breathing, non-digital reality. It’s great. Usually. In the middle, as technology becomes more “immersive,” you have augmented reality. Basically, graphics start to float in front of your eyes on top of the real world–like a monster in Pokémon Go. Then, eventually, as more and more of these graphics are layered over your perception, you naturally segue into virtual reality. At the right end of the spectrum, all reality has been replaced with pixels.
Why Is It A Big Deal?
Think of it this way: Google was taking this moment to get designers, developers, users, and the rest of the world on the same page, and addressing one of the biggest misconceptions about the future of computing.
Often, people group VR experiences–like what you might have on an HTC Vive or Oculus Rift–into a completely distinct category from AR experiences, which you might have using Microsoft Hololens, Google Tango, or your own smartphone (or even Google Lens–the exciting AR search tool Google just announced, which you can read about here).
Google is saying that these aren’t really distinct experiences or technological paradigms, as the products of today may make it appear. They’re all a gradient.
For now, Google’s hardware offerings, like Daydream VR and Tango AR, are distinct products. But it’s safe to say that won’t be the case for long. Last year, Bavor told me that virtual reality and augmented reality would merge, and soon. Think more like 1 or 2 years away, not 5 or 10.
For proof, look to the new VR headset Google announced at I/O this week. Unlike all other headsets on the market today, this new device doesn’t require a phone, PC, or anything else to work. It has no wires, and needs no gadgets or sensors set up nearby to track you. You could use it anywhere, and theoretically walk with it (though Google has built in some safeguards for user safety). I see it as a step toward the merging of AR and VR, where a single headset could operate either as an immersive virtual experience, or a hybrid interface overlaid on your real life.
How is a wireless, go-anywhere VR headset even technically possible? It has what Google calls “inside out positioning.”
Another New Term?
Yes. Another term. “Inside out” means that, rather than using all sorts of external sensors, laser beams, and other crap to track your position the way most headsets do today, this new headset (filled with chips from Qualcomm, and loaded with some Google software) lets you move and see in every direction with no external gadgets, using a mix of accelerometers and a camera. You’ll even be able to get turn-by-turn directions indoors.
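The core trick behind inside-out tracking can be sketched with a toy sensor-fusion loop: the accelerometer predicts motion at high speed but drifts, while the camera periodically relocalizes against the scene and corrects that drift. This is a minimal illustration, not Google’s actual pipeline; real headsets fuse full 6-degree-of-freedom poses with far more sophisticated filters, and all names and numbers here are made up.

```python
# Toy sketch of inside-out sensor fusion: a fast-but-drifting IMU estimate
# is periodically nudged toward a slower, drift-free camera-based estimate.
# (Illustrative only -- real systems use Kalman-style filters over 6-DoF poses.)

def fuse_position(imu_estimate, camera_estimate, alpha=0.9):
    """Complementary filter: mostly trust the IMU, but let the camera
    estimate pull the result back toward ground truth."""
    return alpha * imu_estimate + (1 - alpha) * camera_estimate

# Simulate: the true position is 1.0. Integrating the accelerometer
# drifts upward each step; the camera keeps relocalizing near 1.0.
position = 0.0
for step in range(200):
    imu_prediction = position + 0.005   # integrated accelerometer (drifts)
    camera_fix = 1.0                    # camera relocalizes against the scene
    position = fuse_position(imu_prediction, camera_fix)

print(round(position, 2))  # settles close to the true position of 1.0
```

The point of the sketch: neither sensor alone is enough. The camera anchors you to the world (no external beacons needed), and the accelerometer fills in the fast motion between camera updates.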
But all you need to know is that “inside out” technology could bring about VR’s long-awaited iPhone moment. It enables a totally standalone device that doesn’t require a PC and won’t ruin your living room with wires and sensors, but still gives you the most “immersive” VR experience possible.
But Aren’t You Still Talking About Virtual Reality?
Well, not exactly. Imagine putting on a headset. You’re in a virtual reality office. Now run your finger across a slider on your temple. The digital walls and chairs fade away. Now you can see clearly: you’ve actually been sitting on a very real beach all along. And in front of you, there’s a virtual monitor floating in your view.
In short, immersive computing will let you pick and choose how much reality you want at any time. That’s the sort of vision Google is talking about. It’s the ability to dive as deeply (or shallowly) into the digital world as you like, at any time you would like, through glasses, or goggles, or a screen, or contact lenses. Immersive computing is Google’s play to hack your perception at a moment’s notice.
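That temple slider is really just a single dial along the immersive computing spectrum. A hypothetical sketch: treat it as a blend factor that mixes real camera-passthrough pixels with rendered virtual pixels. None of these names correspond to a real API; this only illustrates the idea of a continuous gradient between reality and VR.

```python
# Hypothetical "reality slider": one blend factor picks a point on the
# immersive computing spectrum, mixing passthrough-camera pixels (reality)
# with rendered pixels (VR). Illustrative only -- not a real headset API.

def blend_pixel(real, virtual, immersion):
    """immersion = 0.0 -> pure reality, 1.0 -> full VR, in between -> AR."""
    return tuple(
        round((1 - immersion) * r + immersion * v)
        for r, v in zip(real, virtual)
    )

beach = (200, 180, 140)   # a sandy RGB pixel from the passthrough camera
office = (90, 90, 100)    # the same pixel rendered in the virtual office

print(blend_pixel(beach, office, 0.0))  # reality end of the spectrum
print(blend_pixel(beach, office, 1.0))  # VR end of the spectrum
print(blend_pixel(beach, office, 0.5))  # halfway: graphics over the real world
```

The slider in Bavor’s framing is exactly this `immersion` parameter: AR and VR stop being separate products and become settings on one dial.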
“From our present vantage point, it can be hard to clearly see how this all unfolds. What is clear, though, is that it will unfold,” said Bavor in a recent Medium post on immersive computing. “One day, we’ll wonder how we ever got along without computing that works like we do – computing that’s environmentally aware, that displays information to us in context, and that looks, feels, and behaves like the real world.”
I Just Threw My Phone Against The Pavement And It Shattered Into A Million Tiny Pieces