Meta’s New AR Glasses Let You Touch The Virtual World

The $949 headset, which is shipping now, lets you interact with digital objects without the need for physical controllers.

It takes automakers many years to go from design concept to showroom-ready cars, and by the time designers see their new vehicles in the flesh, it’s often too late to make any major changes. What if there were a way to upend that dynamic, making it possible for designers to see and interact with full-scale digital versions of their works in progress?


That’s essentially the premise behind the new Meta 2 development kit, an augmented-reality headset. The system, which has finally begun shipping after many months of prototyping, offers a 90-degree field of view on a high-resolution display and works by plugging into a laptop. That means it’s mobile, technically speaking, as long as you’re willing to move your laptop to wherever you want to use the headset.

Meta 2 development kit [Photo: courtesy of Meta]

Aimed for the moment at developers (in the hope that they will quickly create many new AR apps for the platform), the Meta 2 is easy to use and offers compelling 3D holographic imagery that you can manipulate with your hands, walk around in physical space, and even see inside of. That last feature could definitely come in handy for our aforementioned car designers.

Pre-order pricing of the Meta 2 is $949 until the end of the year, after which it will likely rise above $1,000.

San Mateo, California-based Meta, which has about 130 employees and has raised $75 million in funding, has been showing the Meta 2 off for many months. But the system, which would seem to compete with both Microsoft’s HoloLens and Magic Leap’s as-yet-unseen AR headset, is now a full-blown product that customers are finally getting their hands on.

Is it fully polished? Not at all. In the demo of the shipping version of the headset I saw yesterday, imagery was still somewhat janky—moving around when I didn’t want it to—and not quite as interactive as it should have been.


At the same time, that’s nit-picking, because I was able to move digital objects around—a cube made of smaller cubes, a globe, even a 3D model of a human brain—all with my actual hands and all without physical controllers. Virtual reality systems like the Oculus Rift (with Touch) or the HTC Vive offer wonderful 3D simulations with the ability to move things around with your hands, but those systems require handheld controllers.

Ryan Pamplin, Meta’s vice president of sales and partnerships, insisted to me that, for now, his company considers the HoloLens and Magic Leap to be effectively partners in “jointly convincing the world that the next paradigm of computing is augmented reality.” But there’s no doubt Meta is positioning its headset against the offerings of its rivals.

For example, Pamplin argued that the Meta 2 offers twice the field of view of the $3,000 HoloLens at a third of the price. That’s essential to the developer community, he said, which is “price-sensitive.”

Moreover, he argued that Microsoft’s motivation for the HoloLens is to “maintain market share,” while Magic Leap wants to “turn your world into Harry Potter.” But Meta, he says, aims to put a digital layer on top of the real world that is contextually relevant.

Achieving that goal has led to a “massive amount of pre-orders” for the Meta 2, he said, and an ecosystem of AR developers that numbers in the “tens of thousands.”


Look Inside The Brain

During a demo that I saw—and that Meta is planning to show at CES next month—one of the highlights was being able to literally look inside a 3D representation of a human brain and see all the neural activity going on inside. This kind of thing would be ideal for scientists, especially given that it’s possible to use your hands to grab and expand an object.

Similarly, I was able to look inside a three-dimensional car hologram, making it possible to inspect everything on its interior, down to the Google Maps directions on a screen the “driver” would see.

Pamplin said there isn’t a car company, or a 3D company, that hasn’t ordered a Meta 2, and that if you compared the list of Fortune 500 companies to Meta’s customer list, you’d find almost universal overlap. Other eager adopters include Hollywood, the virtual reality industry, space exploration, and more. Companies that perhaps bought a single Meta 1 headset are buying 10 or more this time around, he said.

And why not? With the right apps, the headset does seem like it could make it more economical for car designers, for example, to see a full-size rendering of a new vehicle far earlier in the process than before—and therefore be able to make changes before it’s too late.

In fact, Pamplin said, he expects the headset’s initial popularity will lead to temporary shortages in 2017. By 2018, he added, Meta will be able to scale up its production into the hundreds of thousands of units.


“If we do it right, it’ll give us the ability to connect with the digital world and our environments,” he said, “in far more profound ways than anything has ever allowed us to do before.”

It’s no surprise that an executive of a company like Meta would say such a thing—especially one who expressed his enthusiasm for the product during our meeting by saying that his bosses “don’t know this, but they don’t have to pay me to work, I would just do it anyway.”

And there’s no doubt that the Meta 2 is a terrific device capable of doing things that few have seen before, especially at its price point. Still, it’s not perfect, and without great apps, it isn’t all that useful. Of course, that’s why Meta is aiming the device at developers for now, and why one would expect demos in the future to be packed with exciting experiences.

Ultimately, Meta is competing not with other AR devices, Pamplin argued, but with traditional 2D screens. “Every screen in the world is ultimately [our] competitor,” he said. The challenge now is making the Meta 2 useful enough that users will never be able to imagine going back.

About the author

Daniel Terdiman is a San Francisco-based technology journalist with nearly 20 years of experience. A veteran of CNET and VentureBeat, Daniel has also written for Wired, The New York Times, Time, and many other publications.