zSpace: A Real Holographic Display Worthy Of Iron Man

And you can buy it today!


We’ve all been envious of Iron Man. Not just the suit–that’s fine and all–but what really blows my mind are his amazing drafting stations, complete with interactive 3-D wireframe projections.


Last week, I spent about an hour playing with a system called zSpace, a new display by Infinite Z that makes these sorts of Iron Man interactions a reality. And while I walked into the meeting a skeptic, I walked out realizing that these holographic interfaces aren’t just the future; if you have $6,000 to purchase your own, they’re today.

zSpace looks like a supersized tablet, a Wacom that mated with an ergonomic breakfast tray that’s eaten a few too many helpings of biscuits and gravy. Inside the frame sits a 24-inch 3-D display. When you wear passive polarized lenses (the same glasses you wear at the movie theater), each eye sees 60 slightly different frames of content per second, creating the illusion of three dimensions.

So far, that’s just like most 3-D displays on the market today. The innovation is a set of tracking dots embedded in the glasses frames, which let the system follow the orientation of your face relative to the screen. zSpace uses this head tracking to dynamically tweak the image you see in relation to your view. If you’ve never seen a 3-D display with head tracking before, take a look at this video. It’s not just a hokey gee-whiz trick; head tracking lets you explore a flat LCD from whole other perspectives, as if you’re looking around virtual objects just as you would physical ones.
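Under the hood, head-tracked 3-D comes down to recomputing the camera’s projection every frame from the viewer’s tracked eye position. Here’s a minimal sketch of that off-axis perspective math, assuming a screen-centered coordinate system; the names are illustrative, not zSpace’s actual API:

```python
# Off-axis frustum from a tracked eye position -- a minimal sketch.
# Coordinates are screen-centered: x right, y up, z = distance from the panel.

def off_axis_frustum(eye, screen_w, screen_h, near):
    """Frustum bounds (left, right, bottom, top) at the near plane
    for a viewer whose tracked eye sits at eye = (x, y, z)."""
    ex, ey, ez = eye
    s = near / ez                       # project the screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * s
    right  = ( screen_w / 2 - ex) * s
    bottom = (-screen_h / 2 - ey) * s
    top    = ( screen_h / 2 - ey) * s
    return left, right, bottom, top

# Eye centered in front of a 24-inch (~0.52 x 0.29 m) panel: a symmetric frustum.
l, r, b, t = off_axis_frustum((0.0, 0.0, 0.6), 0.52, 0.29, 0.1)

# Eye shifted 10 cm to the right: the frustum skews, so the rendered view
# "looks around" the virtual objects, just as your head motion would.
l2, r2, b2, t2 = off_axis_frustum((0.1, 0.0, 0.6), 0.52, 0.29, 0.1)
```

In a stereo system like this, each eye gets its own frustum from its own tracked position, which is why the glasses report head orientation rather than just head presence.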

The second innovation is the main interaction tool: a laser pen–literally, the same solution as the Iron Man movie concepts. (The pen is tracked by the tablet via more IR cameras.) It emits a beam that can land on these 3-D objects, and you drag them around. In reality, though, it’s just a three-button stylus, essentially a plastic pen with no laser inside. The LCD actually creates the illusion of the beam for UI purposes.

“The pen is tracked by the system so that its position and orientation are reported to the application before each frame is constructed by the rendering system on the PC,” explains Dave Chavez, VP of R&D at InfiniteZ. “Given the stylus information along with the user’s eye positions and orientation, there is sufficient information to allow, from the user’s perspective, the stylus to directly interact with the scene.”
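In other words, before each frame the renderer gets the stylus pose, casts a ray from the pen tip along its orientation, and draws the “laser” where that ray meets the scene. A toy version of that per-frame pick test, using a sphere as a stand-in for a scene object (the function name and setup are hypothetical, not Infinite Z’s SDK):

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the first hit on a sphere,
    or None on a miss -- the core of the stylus 'laser' pick test."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(o * d for o, d in zip(oc, direction))    # half the quadratic's b term
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None                                  # beam misses the object
    t = -b - math.sqrt(disc)
    return t if t >= 0 else None

# Stylus at the origin pointing into the screen (-z); the ogre stands in as a
# unit sphere five units deep. The hit distance tells the UI where to end the beam.
t = ray_hits_sphere((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), (0.0, 0.0, -5.0), 1.0)
# t == 4.0: the fake laser is drawn from the pen tip out to that depth.
```

Because the same eye positions drive both the scene render and the drawn beam, the beam appears to emerge from the physical pen from the user’s perspective.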


So far, so good. It all made sense on paper: IR sensors (like Kinect) and tracking dots in 3-D space (PlayStation Move). But to do it at 120 frames per second? To do it naturally? I wasn’t so sure zSpace could pull off all these incredible feats. After all, the tablet has been on the market since April, and pretty much no one has written about the thing.

That changed the moment I sat down to play with it.

The zSpace glasses are weightless on your face. And upon putting them on, a fuzzy display transforms into a collection of 3-D models protruding from the screen in front of me. I move my head to the left, and my view shifts ever so slightly; I can see a bit more of the ogre sitting on the table. I duck my head down, trying to look under the display’s frame–to content beyond the screen–and it’s there. It’s like a whole little world lurking beneath the surface. There’s no discernible end to the content.

“Grab the pen,” a voice tells me.

I pick up the stylus, which is light and comfortable, and fire it into the screen. The responsiveness is perfect. I can pinpoint exactly what I want to touch through the slightest shift of my wrist. And the laser–or I should say the illusion of a laser created by the screen and head tracking–is remarkably believable. (Truth be told, there’s the slightest break between the stylus tip and the beam itself at times, but for some reason, your brain just writes this off into the periphery.) The pointer always goes where you aim with surgical precision.


A wristwatch sits on the table. Clicking a button, just like a mouse, lets me drag and drop the items in 3-D space. Wherever I let them go, they freeze. So I highlight the watch–a slick wireframe lattice envelops its body for a moment–then I drag the watch about a foot from my nose just to get a better look. It works perfectly. I rotate the watch, just checking it out. It’s a comfortable feeling akin to wielding chopsticks, and I can even read an insignia under the face.

Another part of the demo shows me a disassembled pocket watch. I quickly begin to grab its tiny parts one by one and stack them like a house of cards. The precision is unreal. Whereas my ham-hock hands would fumble with such minuscule pieces, my pointer, which just moments ago felt like chopsticks, now feels like tweezers. Within a matter of moments, I make a mini tower of clock parts that would be impossible in the real world. Then I have a realization: the stylus is genius, not just in its control but in that it separates my fingers from the holographic illusion. The fact that I can’t touch these parts actually enhances the lie that they’re real.

And the demos go on. InfiniteZ has several proof-of-concept interaction models built. I play a 3-D maze. I disassemble a model of lungs. I explore the layers of a model house, exposing its wooden frame, through an elevator-like slider. (An amazing implementation for CAD.) I can grab a tiny camera and place it in the house, exploring it first-hand (a subwindow shows me this view). How many hours had I wasted in After Effects screwing with virtual lights and cameras? This was … instantaneous. It worked perfectly.

Of course, the stylus is hiding more tricks than dragging and dropping. I click an icon to put it into write mode, which lets me draw in parabolic arcs. Except these arcs aren’t in 2-D. So what I build is a complex, winding sculpture that resembles a roller coaster. And again, it sticks out from the screen. I can look around it. This feels remarkably tangible. Now I drag that little camera onto my makeshift coaster, and it becomes a vehicle on a track, guiding me through a first-person animation of what I just drew. For virtual direction, this sort of tool will become a must-have.

All I can think, sitting there, playing with zSpace is, I want one of these. But my precise, second thought is, But what would I do with it? Most certainly, it’s a tool designed for pros. Anyone modeling or editing in 3-D would, without a doubt, benefit from this holographic perspective. The problem is, zSpace’s supported software list is pretty short at the moment. Maya is on the list–a big-name program for sure–but you’ll find little else and nothing from Adobe. InfiniteZ has built a pretty remarkable piece of technology, but it’s all for naught unless it supports the software that professionals use.


It also needs a very powerful computer to run. My MacBook Pro wouldn’t do (the “laptop” for the demo was really a “mobile workstation” I’d bet weighed at least 10 pounds). The top graphics cards by Nvidia and ATI are a necessity, as the computer crunches all of the rendering while zSpace handles all of the head and stylus tracking.

I wonder if InfiniteZ is making an earnest run at the market, though, or whether they’re expecting to be gobbled up by a larger company that wants the technology–or even just the patents. InfiniteZ has a patent on the interaction between the screen, glasses, and stylus–which, depending on just how encompassing it is and how popular such technology becomes, could be a pretty serious revenue generator.

But that’s boring corporate stuff. I didn’t take an appointment late on a Friday afternoon for boring corporate stuff. I took that appointment to test if zSpace worked, to see whether its pre-rendered marketing materials were just Photoshop hyperbole. Like much of the press, I’ve seen a lot of unreleased technology, but zSpace worked, and I wanted to play with it all afternoon.

So no, I don’t know if InfiniteZ will sell more than a hundred zSpaces before they’re acquired or edged out of the market. But I do know, in a not-so-distant future, we’ll all be using tools that look a lot like its tablet. That 3-D stylus? It’s here to stay, and it’ll make the mouse of today feel like some stoney caveman implement. Now if only we could get MIT’s levitating system to play nicely with this.

Buy it here.

About the author

Mark Wilson is a senior writer at Fast Company who has written about design, technology, and culture for almost 15 years. His work has appeared at Gizmodo, Kotaku, PopMech, PopSci, Esquire, American Photo, and Lucky Peach.