How the iPhone’s next 3D camera signals phase two in the AR race

An upcoming rear 3D depth camera will open up lots of feature possibilities from AR to social media to gaming.


Some of the hottest, most upgrade-inducing features of this year’s new iPhones might come from the cameras, especially the 3D depth cameras.


Apple already put a 3D depth camera on the front of the iPhone starting with the iPhone X, but so far its applications have been relatively limited (facial recognition phone unlock, Animoji, and selfie portrait mode). Apple will likely try to get more uses out of the front-facing depth camera, but it’s the rear-facing depth camera, likely to show up on an iPhone this year, that could be a game-changer.

Apple has been working on a rear- or “world-facing” 3D depth camera since at least the iPhone X development cycle in 2017, and now Bloomberg’s Mark Gurman and Debby Wu cite sources saying Apple will ship the component in an iPhone this year. My own conversations with supplier sources left me with that strong impression, too. (Apple’s chief smartphone rival, Samsung, announced a phone this week–the S10 5G–that also has a rear-facing depth camera.)

A recent presentation given at the Photonics West conference by Lumentum (the supplier of the laser in the iPhone’s True Depth camera system) showed some of the groovy 3D-sensing features that might become possible in future iPhones.

Better “Portrait” mode. First of all, a rear-facing depth camera would assist the wide angle and telephoto lenses in making the bokeh effect in Portrait mode look a lot better. The results might look more like the background blurring effect created by expensive 35mm cameras.
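To make the idea concrete, here’s a minimal sketch of how a per-pixel depth map can drive a portrait-style blur. This is an illustrative toy, not Apple’s actual pipeline: the function name, the crude box blur, and the depth-tolerance parameter are all assumptions for demonstration.

```python
import numpy as np

def portrait_blur(image, depth_m, subject_depth_m, tolerance_m=0.3):
    """Toy depth-assisted "Portrait" blur (illustrative, not Apple's method).

    image:    H x W x 3 array of pixel values
    depth_m:  H x W array of per-pixel distances from a depth camera
    Pixels whose measured depth is within tolerance_m of the subject's
    depth stay sharp; all other pixels are replaced with a blurred copy.
    """
    # Crude box blur via wrap-around rolling averages (stand-in for real bokeh)
    blurred = image.astype(float)
    for axis in (0, 1):
        blurred = (np.roll(blurred, 1, axis) + blurred + np.roll(blurred, -1, axis)) / 3
    in_focus = np.abs(depth_m - subject_depth_m) <= tolerance_m  # True = keep sharp
    return np.where(in_focus[..., None], image, blurred)
```

The point of the sketch: with a real depth map, the camera no longer has to guess from 2D cues which pixels belong to the subject, so the focus/blur boundary can follow actual distances.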

AR will start to matter. Apple is excited about augmented reality (AR) and got an early start on it by releasing its ARKit development framework to developers in 2017. So far ARKit apps have had to rely on the iPhone’s 2D cameras to place digital objects within real spaces (as seen through the iPhone camera). Software is used to estimate the relative distances of objects from the cameras. The results have been good but not great.


[Below, an example of AR using Magic Leap’s system]

“Apple put a lot of focus on ARKit at launch, but, after an initial excitement, we have not seen blockbuster apps coming out of developers,” said Creative Strategies analyst Carolina Milanesi. “I think adding a rear-sensing camera will make the whole AR proposition much more interesting.”

An iPhone with a rear-facing 3D depth camera could more precisely “map” the 3D space in front of it, so that it’s possible to place digital graphics and information within that environment in ways that are convincing to the eye. Objects could be placed on more types of surfaces within an environment, for example.

How it works, and why

The depth camera system does this by using a laser to send out thousands of tiny dots of light, then measures the time each takes to bounce off objects in the room and return to a sensor. Light that returns from objects near the phone will have a shorter “time of flight,” while light returning from objects farther away will have a greater time of flight. The light emitted by the iPhone’s front-facing depth camera can travel only a few feet, but the laser used in the rear-facing system will have a much longer range.
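The underlying arithmetic of time-of-flight ranging is simple: the emitted light travels to the object and back, so the one-way distance is half the round trip multiplied by the speed of light. A minimal sketch (the function name is illustrative, not from any Apple API):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to an object from the round-trip ("time of flight") of a light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path covered during the round trip.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# An object about 1 metre away returns light in roughly 6.7 nanoseconds:
print(f"{tof_distance_m(6.67e-9):.2f} m")
```

The nanosecond scale of these round trips is why the system needs specialized sensors rather than an ordinary camera shutter.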

The end result could be surprisingly realistic AR experiences. Through the smartphone screen you might see overlaid upon the real world a detailed 3D map of a city block or a mountain range, complete with labels and information that can be clicked and expanded.


ARKit developers will like this a lot, although it may take a while for them to fully realize the possibilities, as IDC analyst Tom Mainelli points out.

“The challenge right now is that we haven’t seen many mainstream AR app success stories yet,” Mainelli says in an email. “We may not see developers truly grasp what they’re capable of creating in AR until they get their hands on the new hardware (and the SDKs that will let them leverage the new capabilities).”

“So while I’m hopeful that we see the Mobile AR needle begin to move in 2019,” he adds, “it may take another year before these hardware advances manifest in apps and services that might cause mainstream buyers to consider it a reason to upgrade early.”

What could change

Gaming. It’s possible that the first killer app using world-facing 3D mapping will be a gaming app. Pokemon Go is the best-known example of mobile AR gaming. The game placed digital characters within the real-world environment of the player. But since it relied on 2D phone cameras, the placement of the objects was simplistic.

3D mapping will let the game developer more fully incorporate the real world into the setting of the game. Characters might interact with the real-world environment in more complex ways. You might see game characters crawling around on, leaping from, or hiding behind real-world surfaces. If it’s really good, you might start to forget they are just digital.


User-generated content. One of the biggest challenges in augmented reality today is creating the content–the 3D digital objects–to place within environments. In commercial applications, like the Ikea app where you place products within real-world rooms, the 3D product shots come from the app maker. They’re expensive to produce–often shot by numerous cameras from a variety of angles. Depth cameras will let users start creating their own 3D content by using their phones to scan objects or people, which could then be placed in virtual spaces. You might receive a 3D video from a friend that you watch like a hologram in your living room.

Behavioral analysis. The depth cameras are already surprisingly detailed and accurate, and getting more so. The resolution of front-facing depth cameras on existing smartphones is already good enough to detect facial expressions, eye movement, and even the heart rate of the user. When these signals are combined with data on what the user is seeing on the screen at the time, you get a form of feedback that would be invaluable to content publishers.

It’s not hard to imagine using behavioral feedback to inform advertisers or e-commerce sites of users’ emotional reactions to their brand messaging or products. It might help companies like Google and Amazon show users content and products–and of course ads–that are more relevant and less annoying. (Ideally, the user would be allowed to opt in to that kind of behavioral targeting, as opposed to finding out about it later.)

2019 is supposed to be the year smartphones stop being boring, and the innovations will likely be more than just $2,000 foldable phones. This may be the year augmented reality starts to matter on the iPhone. With the addition of a rear-facing depth camera, Apple may add 3D camera features that are engaging and useful–enough so, the company’s investors hope, to induce people to buy new iPhones and upgrade from their old ones.

The rear-facing camera might also be what it takes to get ARKit developers really fired up about creating cool new apps. Many people I’ve talked to within AR circles agree with Mainelli and Milanesi that it’ll take a killer AR app of some kind to get consumers interested. To get there–imagine a Pokemon Go 2.0–Apple’s rear-facing 3D camera may be just what we need.

About the author

Fast Company Senior Writer Mark Sullivan covers emerging technology, politics, artificial intelligence, large tech companies, and misinformation. An award-winning San Francisco-based journalist, Sullivan's work has appeared in Wired, Al Jazeera, CNN, ABC News, CNET, and many others.