
Unity CEO John Riccitiello describes the realities—and distractions—of the metaverse

John Riccitiello, chief executive officer of Unity Technologies [Photos: David Paul Morris/Bloomberg via Getty Images; Richard Horvath/Unsplash]

By Mark Sullivan

John Riccitiello sits atop a company, Unity Technologies, that provides perhaps the most widely used game development engine in the world. Unity is used to create games for mobile, consoles, PCs, and AR/VR—the latter of which, you may have heard, is a key component of the so-called metaverse. All of which is to say: Already Unity is a go-to tool for building metaverse experiences.

As someone who is working very closely on some of the behind-the-scenes elements of the metaverse, Riccitiello is in a unique position to see through the (considerable) hype to what’s really possible within the next few years. In early October, he spoke with Fast Company about separating signal from noise. This interview has been edited for length and clarity.

What do you think about all of the current excitement about the term “metaverse”?

I read Neal’s book [Snow Crash, in which author Neal Stephenson coined the term “metaverse”] back in 1992. So I’m a bit of a science fiction nut, and I can remember when all the new game startups were named things like Black Swan, which is the name of the bar [in the book]. Most people can’t remember people like Hiro Protagonist [the book’s protagonist] being a pizza delivery guy living in a steel container, and what a burbclave is. But to me, that’s all fingertip understanding. 

The term means a lot to me, but I think it’s one of the most misused and abused, hyperinflated terms I’ve seen in a long time. I find it more than mildly irritating. The meaning’s gotten lost in the baubles around it.  

I’ve read so many things about the metaverse being about AR or VR, which I don’t think it is. Or that it’s about avatars, which I certainly don’t think it is, although those things can and will be involved in certain circumstances. In our fixation on the baubles, we’ve lost an understanding of what it’s about.

I gave a public definition, which is it’s the next version of the internet. Mostly the internet today is not real time—this will be real time. It’s mostly going to be 3D, not 2D. It’s mostly going to be interactive versus not. It’s mostly going to be persistent [i.e. digital objects will be anchored to specific places], which it’s not [currently]. It’s mostly going to be social, which it’s not. 

An internet destination that achieves all that. Roblox is one. But you could imagine thousands of them, and in fact you could imagine most content sites, most entertainment being delivered that way.  

Now you’re leading toward this concept where the metaverse starts to become something like an immersive 3D presentation of the internet—that is, it adopts a set of standards and protocols that allow you to take your identity and your digital goods with you as you move around. 

In my opinion, you know they don’t know what they’re talking about when they talk about taking your stuff from one place to another but don’t say things like USD [Universal Scene Description, described by some as the “HTML of the metaverse”].

If you listen to [Meta president of global affairs] Nick Clegg, he literally gave 8,000 words on the metaverse and managed to come up with an example of “I want to wear the same T-shirt in two locations.”

It’s really kind of hard to figure out what I want to bring from World of Warcraft into my Gucci shopping experience. It’s not so obvious there’s that many things they’re going to want to go from one place to another. On the other hand, there is no standard for data and rendering in this world. One standard for data and rendering that is really powerful in 2D is USD. That’s a format built by Pixar, but it’s not presently very good for rendering animation in 3D, which is what a lot of this is going to be. We and NVIDIA and Pixar and a few other companies are working on making that standard great. But it won’t give you interoperability exactly.  

And then the question is, for what purpose? There will be some [interoperability] done for sure, but to me, that’s not an underpinning glory to the metaverse. There are so many other incredible things you can do, like digital twins, or like the kinds of simulation you’re never going to get [in the real world]. For example, a Prius being T-boned by a semi trailer on the Golden Gate Bridge on a rainy day. No autonomous driving software can incorporate that using real data at all. Not possible. We’re never going to record it. But I can simulate that 150,000 times on the roads of America in a real-time simulation environment inside of Unity, so they can train against it.

That’s a metaverse experience. Now, you may not be experiencing it. Software is experiencing it, but it’s training against that. It’s not as sexy as showing Mark Zuckerberg as an avatar, but what I’m describing is going to happen. I don’t think the other stuff frankly matters much, and that’s why I get frustrated by the terminology of the metaverse. 
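(For readers wondering what a USD scene description actually looks like, here is a minimal sketch using Pixar’s OpenUSD Python bindings. The file name and scene contents below are hypothetical, chosen only to illustrate the format; this is not Unity’s or NVIDIA’s code, and it assumes the open-source usd-core package is installed.)

```python
# Minimal sketch: author a tiny USD scene with Pixar's OpenUSD Python bindings.
# Assumes `pip install usd-core`; the file name and prim paths are made up.
from pxr import Usd, UsdGeom

# A stage is the top-level container for a USD scene description.
stage = Usd.Stage.CreateNew("example_scene.usda")

# Define a transform prim with a sphere beneath it.
UsdGeom.Xform.Define(stage, "/World")
ball = UsdGeom.Sphere.Define(stage, "/World/Ball")
ball.GetRadiusAttr().Set(0.5)

# Write the human-readable .usda text file to disk.
stage.GetRootLayer().Save()
```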

You said earlier that you think avatars have become somewhat of a distraction in our understanding of the metaverse. Can you explain that? 

The metaverse has gotten littered with avatars. For example, if what you’re doing is trying on clothes, you absolutely want an avatar, and for sure it needs to be millimeter-accurate. But if you want to check out a hotel room and look out the window and see if that’s the exact room you want, an avatar would just slow things down. And trying to check into a hotel room or whatever, there’s my avatar pulling out my credit card. No, I just want to hit return and it’s done.

Or take a real-time 3D experience, perhaps with an XR device, of, say, visiting the pyramids in Egypt. Having 400 people with their avatars in the space might be cute. But I kind of just want to follow the narrator that’s taking me and showing me this, and my body disappears in that world.

What are some of the other more immediate use cases we’ll see for immersive digital spaces? 

We have customers right now like the Vancouver Airport. They control the airport by looking at a full 3D model of the airport that shows surges: the traffic related to planes landing, security issues, how many people bought something at the tax-free store. But they could also run simulations off to the side, saying, “Well, this plane is 20 minutes late and that plane’s 20 minutes early. What does that do to my security lines?” That’s real-time 3D, and you can consume it on a tablet. You can consume it on a PC. You can consume it with an AR/VR device. They’re all great, and they’re great for different things.

You can also have a better shopping site. We’re working with a lot of high-end retailers and high-end brands (such as Gucci) right now, so you can do virtual try-on and the room that you’re in looks like their boutique.

Do you think Wall Street has a proper appreciation for the non-gaming side of the business, meaning digital twins, AI training data, things like that? Given that, do you think the potential growth of that side of the business is fairly represented in the stock?

First off, it’s a fool’s errand to sort of comment on the market. The second point I would make to you is you can look at companies that went public when we did–from Snowflake to DocuSign and others–and we’re trading about the same way they are, relative to their IPO prices. So we’re not actually a standout. 


The third thing is, two weekends ago I was hanging out with a guy I had never met before, who runs the wealth practice for one of the big three banks. All he does is look at macro stuff on stocks; he advises billionaires only. He was telling me, you don’t realize this, but we’ve actually had a harder market impact than in 1929. It’s the worst we’ve seen in over 100 years. I hadn’t really fully ingested that. What’s also different about this one, he said, is that the bond market is down, too, which didn’t happen in 1929. So everything’s cratered. That happens only 4% of the time in a recession, when they both go down.

Obviously the Fed has been raising interest rates like crazy, which in and of itself is wiping out the European economy because interest rates are so high here. The dollar is going through the roof. All of this doesn’t seem very sustainable.

I was able to spend some time with Epic Games last spring, and I was told back then that they’re your competition with Unreal Engine. I was told that Unity’s gaming engine is used to create most mobile games, while Unreal Engine is most often used to create high-end console or PC games. Is that how you would characterize it?  

On mobile, we’re 72%. They’re less than one [percent]. So the mobile part’s right. On console, we were nowhere and they’ve always had a position there. We’re bigger than they are on Nintendo. We’re about the same size on console, but they do get some of the higher-end games. But they don’t get many of those; the vast majority of truly high-end games are built on custom engines.

And, so, if you take a snapshot, these things are true: proportionally they’re more console and PC than we are, but that’s because we’re giant on mobile, and we’re as big or bigger than they are on console and PC now.

I don’t take anything away from them. They make Fortnite. I mean, for god’s sake, it’s huge and successful. And I like [Epic Games founder] Tim [Sweeney]; Epic makes some really great games. And, so, I would say people make too much of the competition between Unity and Unreal. 

I look at the technical challenges of building a pair of mixed reality glasses that people want to wear. Do you worry about how quickly we’re going to have the hardware to enable some of these metaverse experiences? 

I was a near founding investor in Oculus. The presentation to raise money for it was partly written in my kitchen with Brendan [Iribe, the cofounder and former CEO of Oculus] there. So it’s an area I’ve been tracking for a while. Remember, 70% of things built in AR/VR are built in Unity. The companies that are making platforms need to talk to us, because when they ship hardware, they need to be Unity-compliant long before [launch], or there won’t be any software on their platform.  

We’re in a great position to know what’s coming, and I would tell you that I am a massive believer that by the end of this decade, AR/VR will be a normal part of our lives. Will it be as big as mobile phones? I don’t think so. Will it be as big or bigger than consoles? Probably. 

So think about consoles as being a big niche, whereas mobile devices are universal mass market. So [AR/VR] in that time frame will be a very big niche in my opinion. I wish I could give you all the reasons I believe that, but I can’t give them all to you.

Just thinking about next year, are there going to be mixed reality products that you think are going to be attractive to mainstream customers? Or is it still too early to be thinking about that kind of thing? 

To be a mainstream mass-market item, it needs to achieve a certain critical mass in terms of hardware, meaning maybe a 50 million installed base would be the low end for a console.  

It also needs a thriving content ecosystem, because nobody can really economically make content for it before that critical mass exists.

And it needs to just work. How many times have we tried these things? How do I set this up? You know, tiling the room, and then it loses something in RAM and you’re starting over again, that kind of thing. And there’s a handful of other criteria that I put out there.

We are within a handful of years of all these things happening. In the near term, at a very high price, we can get a very cool thing. But a very high price for a very cool thing doesn’t sell enough of them to allow the creation of an ecosystem.

I would say, without me sort of traipsing out stuff I shouldn’t talk about, it’s a really good time to train your attention on this space. All you have to do is look at the big companies’ patents that are being filed. Look at their job postings. They tell you a lot. There’s a lot going on in this space, and I think it’s an exciting time, and it would shock me to get to the end of the decade without the technology being pervasive, meaning hundreds of millions of people using these.  

Here’s what you’ve got to remember: 5G is broadband; 6G will be here by the end of the decade, and 6G is ridiculous, or will be–no one’s defined it exactly. So we can move the data around now in ways we couldn’t move data around before. These chips from NVIDIA, the M chips from Apple, the AMD advancements—I mean, it’s crazy what we’ll be able to do from a technology implementation perspective. So it’s a really good time to be focused.


ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

