Facebook-owned Oculus wants very much to be the most important player in the still-nascent consumer virtual reality industry. It also wants to be the leader in both VR and wearable augmented reality five or even 10 years from now—and pursuing that goal will likely mean, to paraphrase Arthur C. Clarke, a pursuit of magic.
That’s why the speech by Michael Abrash, the company’s chief scientist, is always one of the most popular talks at Oculus Connect, its annual developers’ conference in San Jose that kicks off today.
Abrash’s speeches tend to be long, complex musings on the state of VR and AR today and the technologies that will be required to get the industry to the pervasive, always-on, all-powerful devices many expect us to be using in a few years. Today’s talk, actually a discussion with journalist Steven Levy, was no different: over more than half an hour of comments, it echoed much of what Abrash had written in a 5,000-word blog post published today.
At the core of the post were his thoughts on the devices that he–and clearly Facebook–hopes will one day make their way into users’ hands. The most consumer-ready was a VR headset built around an approach that could one day let people wear such a device comfortably all day, rather than for an hour or two at a time.
That approach is what Oculus researcher Doug Lanman calls varifocal, “where the lens deforms or moves relative to the screen in order to alter the focal distance.” Because today’s headset lenses focus at a single fixed distance, Abrash says, extended use causes visual fatigue and discomfort; a varifocal design would in effect solve those depth-of-focus problems.
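A rough back-of-the-envelope illustration (not from Abrash’s post, and with purely illustrative numbers): the standard thin-lens relation shows why moving the lens relative to the screen changes where the virtual image, the plane the eye must focus on, appears.

```latex
% Thin-lens relation, with the screen a distance d_o from the lens
% (d_o < f, so the eye sees a magnified virtual image at distance |d_i|):
\[
  \frac{1}{f} \;=\; \frac{1}{d_o} + \frac{1}{d_i}
  \qquad\Longrightarrow\qquad
  |d_i| \;=\; \frac{f\,d_o}{f - d_o}
\]
```

With an illustrative f = 40 mm lens, a screen at d_o = 38 mm puts the virtual image about 0.76 m away (roughly arm’s length), while shifting the screen just 1 mm, to d_o = 39 mm, pushes it out to about 1.56 m; as d_o approaches f, the image recedes toward infinity. That sensitivity is what lets a small mechanical deformation or translation “alter the focal distance,” as Lanman describes.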
In some companies, research divisions work on long-term problems; in others, their work goes directly into product. Oculus Research, Abrash wrote, “sits somewhere in the middle, looking for breakthroughs on high-impact, genuinely unsolved problems, but always with an eye to getting the results out into the world.”
That’s why, though an actual product version is likely quite some ways off, Lanman’s team of 40 researchers in Redmond, Washington, has developed a working prototype of his varifocal headset. Abrash characterizes its potential in rather grandiose terms: helping “millions of people to comfortably work with virtual objects within arm’s length and read in VR for however long they want.”
Whether that comes to fruition is impossible to know today, but when you read Abrash’s words or hear him speaking them on stage, it’s hard not to be convinced by his confidence, his whimsy, and the sense he projects that all technical problems can be solved if only the right group of extremely smart people works on them.
Consumer VR’s commercial emergence in the last couple of years was at first met with the kind of hype and enthusiasm you’d expect from a futuristic technology that takes people to all-new worlds.
But over the last year or so, slow sales of most VR devices have tempered that hype, and it’s fair to say we’ve now entered a typical backlash stage–complete with widespread doubts about whether the technology can ever become mainstream, or at the very least a conviction that it’s going to take many years to get there.
Those realities may be troubling to shareholders of companies like Facebook, Google, Microsoft, Samsung, and others, which have invested billions of dollars in consumer VR. But at Oculus Research, grand ambitions seem to trump immediate bottom-line concerns.
“VR and AR will together change our lives as fundamentally as personal computers and smartphones have, and quite possibly even more,” Abrash wrote in his post. “Over the last 40 years, personal computers, smartphones, and tablets have given us constant, near-instantaneous access to the digital world through 2D screens, in the process touching almost every aspect of our lives. Over the next 40 years, AR and VR will allow us to actually live in a mix of the real and virtual worlds, and this will once again radically change the way we work, play, and communicate.”
It’s hard to imagine waiting around 40 years for the future Abrash is picturing, and in all likelihood, things will move a lot faster than that. Still, it’s worth hearing about some of the world-changing devices that he and his team are imagining.
The future of technology, as Abrash describes it, promises small, lightweight, comfortable, powerful devices that let us do just about anything–replacing TVs, phones, computers, tablets, game consoles, and more with instantly upgradable smart glasses that do what we want no matter where we are.
They’ll let us see in low light, hear perfectly no matter how noisy it is outside, quickly recall people’s names, find the fastest way to where we’re going, and much more. In short, they’ll give us all the information we could ever want, making us better than the best Jeopardy player in the world.
You read words like that from Abrash and you want to rush to the nearest big-box store to buy a pair of those AR glasses. Alas, he continues, “that’s not possible today, because the technology isn’t there yet; we’re working on it, but right now bottling the magic of AR is just an aspiration.”
That’s why Oculus is putting its actual product-development energies into VR today. AR, though, is on the horizon, and Oculus Research is hard at work trying to be the one that cracks the code.
That’s why Oculus Research has assembled what Abrash says is “one of the best optics teams in the world, equipped with facilities that enable them to push the state of art across a wide variety of technologies,” including “waveguides–flat pieces of glass or plastic that light can be injected into so that it bounces along lengthwise and eventually deflects out and into the pupil.”
There are countless computing problems to solve before that potential can be realized, but Abrash’s team is on the job, he writes, with people like perceptual scientist Marina Zannoli and optical scientist Yusufu Sulai. The two spent a year working together and designed what Abrash says is the first tool for “probing the limits of the human visual system.” It’s now built, operating as planned, and ready for experiments.
The UI Of The Future
Even if AR’s optical issues are solved, you still need a world-class user interface before anyone would want to actually, you know, wear a pair of smart glasses.
That’s where Oculus researcher Sean Keller’s work comes into play, Abrash says. “You’re going to have to be able to interact with your AR glasses in all the contexts you encounter in your day, so the interface will have to be multimodal,” he writes–controlled with our hands and our voices. The trick is developing systems that adapt to our environments and give us the right control at the right time. “No one mode can meet all the needs,” he writes, “and the challenge is to design an interface that can switch seamlessly between them and decide which to use at any given moment.”
Abrash has been working at the edges of technology and game design for decades. He’s worked at Microsoft, id Software, Valve, and elsewhere. It’s easy to imagine that in many cases, he’s envisioned technology’s direction and seen its potential long before most others.
Yet he says it took urging from Facebook founder and CEO Mark Zuckerberg for him to see AR as something worthy of his attention–a possibility he had more or less dismissed before. “That earned me a look of disbelief that was useful incentive to think a lot harder about AR’s potential,” he writes. “Three years later, I’m fully convinced that we’ll all be wearing AR glasses one of these years.”
But it’s not going to happen overnight, and it’s not going to be easy to get there. That’s the challenge Abrash says he and his colleagues at Oculus Research have eagerly accepted. He’s talked in the past about the social constraints as well as the technological ones, and solving the problems means overcoming limitations that Moore’s Law alone can’t fix. “They must be completely socially acceptable–in fact, they need to be stylish,” he argues. “They need an entirely new user interface. Finally, all the rendering, display, audio, computer vision, communication, and interaction functionality needed to support virtual objects, telepresence, and perceptual/mental superpowers must come together in a system that operates within the above constraints.”
Abrash is not the kind of researcher who takes on challenges he doesn’t think can be solved–or at least that’s his argument. So he firmly believes Oculus, or someone else, can get us there. In fact, he thinks we’re within 10 years of true consumer-ready, fully functional AR glasses.
The question all of us must ask is: Are we willing to go along for this ride with them?