
How Facebook Used AI To Make The Trippy Effects In This VR Film

The idea behind Jérôme Blanquet’s innovative film Alteration was for viewers to experience what it’s like to be an AI.

By Daniel Terdiman · 5 minute read

When Jérôme Blanquet set out to make his new virtual reality film, Alteration, which among other things portrays an artificial intelligence diving into a man’s dreams to steal them, he felt strongly that AI should play a big part in the project.

“As a filmmaker, I would like to represent…using AI in the dreams of a human,” Blanquet says. “For me, only AI can represent what AI can [represent] in the human brain.”

Alteration, which was launched yesterday for the Oculus Rift and Samsung’s Oculus-powered Gear VR, centers on Alexandro, who volunteers to take part in a dream experiment, unaware that the researchers running it are going to digitize his subconscious and inject into it an AI, in the guise of a woman named Elsa, that will take it over.

The story progresses through scenes representing Alexandro’s memories, the AI always hovering nearby. His wife, Nadia, who plays a central role in some of those memories, objects to Elsa’s presence, leading Alexandro to pull the plug on the experiment, with disastrous results.

Nadia is a painter, and Blanquet wanted to show the crumbling of Alexandro’s subconscious by blending it with the very style of the canvases on the wall of her studio, even as Nadia is slowly absorbed in his dreams and replaced by Elsa.

[Images: courtesy of Facebook]

Representing The Alteration

The filmmakers’ goal was always to represent that alteration visually, and originally they thought they would do so using traditional post-production animation tools. But Blanquet knew that AI researchers at Facebook had developed an algorithm for style transfer, the visual technique for recomposing one image in the style of another, which was exactly the effect he wanted in the film. He wondered whether that algorithm could be incorporated into his VR film: “to make real style transfer, with real AI,” says the film’s producer, Antoine Cayrol.

“Elsa wanted to take the place of [Alexandro’s] wife, through the style of her paintings,” Blanquet said. “Style transfer is really what we thought makes sense when you look at what’s happening in the story.”
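
Neither Blanquet nor FAIR spells out the implementation here, but the underlying technique is well documented. The sketch below is a minimal, illustrative version of classic neural style transfer (the Gatys et al. optimization approach) in PyTorch: it adjusts a frame’s pixels until its deep-feature statistics match a painting’s. The file paths, layer indices, image size, iteration count, and loss weight are all assumptions for illustration, not FAIR’s production values.

```python
# A minimal, illustrative sketch of classic neural style transfer
# (the Gatys et al. optimization approach) in PyTorch. Not FAIR's
# production code: paths, layer indices, and weights are assumptions.
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

def load_image(path, size=512):
    tf = transforms.Compose([
        transforms.Resize(size),
        transforms.CenterCrop(size),
        transforms.ToTensor(),
    ])
    return tf(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

# A pretrained VGG-19 serves as a fixed feature extractor.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval().to(device)
for p in vgg.parameters():
    p.requires_grad_(False)

STYLE_LAYERS = {0, 5, 10, 19, 28}  # conv layers whose statistics define "style"
CONTENT_LAYER = 21                 # a deeper layer that preserves scene layout

def features(x):
    style, content = [], None
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in STYLE_LAYERS:
            style.append(x)
        if i == CONTENT_LAYER:
            content = x
    return style, content

def gram(x):
    # Gram matrix: channel-to-channel feature correlations, the classic
    # summary of an image's "style" independent of spatial layout.
    _, c, h, w = x.shape
    f = x.view(c, h * w)
    return f @ f.t() / (c * h * w)

content_img = load_image("frame.jpg")    # hypothetical film frame
style_img = load_image("painting.jpg")   # hypothetical painting

style_targets = [gram(s).detach() for s in features(style_img)[0]]
content_target = features(content_img)[1].detach()

# Start from the content frame and optimize its pixels directly.
result = content_img.clone().requires_grad_(True)
opt = torch.optim.Adam([result], lr=0.02)

for _ in range(300):
    opt.zero_grad()
    style_feats, content_feat = features(result)
    style_loss = sum(F.mse_loss(gram(s), t)
                     for s, t in zip(style_feats, style_targets))
    content_loss = F.mse_loss(content_feat, content_target)
    (1e6 * style_loss + content_loss).backward()
    opt.step()
    result.data.clamp_(0, 1)  # keep pixels in a displayable range
```

Optimizing every frame this way would be far too slow for a whole film; production systems of the kind FAIR describes instead typically train a feed-forward network once per style and then apply it to each frame in a single pass, which is presumably where the training hours described later in this piece went.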

The film’s executive producer, Yelena Rachitsky, who works at Facebook-owned Oculus, which helped fund Alteration, explained that VR was a natural medium for applying style transfer, given that in virtual reality, there are no limits imposed by reality. And by using the effect, the filmmakers were able to impart a visceral feeling of what it’s like to be an AI, Rachitsky says.

“It’s an interesting approach to go through that,” she says, “and it wouldn’t have the same impact if it was a traditional film, because you’re not immersed in it. Things shift and change, but then it kind of captures you all around. It just changes your feeling of what you experience in the space.”

Like A Hackathon

Meanwhile, the folks in the Paris office of Facebook’s AI Research (FAIR) lab were eager to help out, especially since they came aboard before shooting started. Applying style transfer to a VR film was likely something no one had done before, so they wanted to find out if AI-generated imagery could help establish the story in an immersive VR film.

One of the best parts of working on Alteration was the brainstorming sessions before filming began, recalls FAIR research scientist Antoine Bordes, since those conversations influenced the direction of the production. “From the beginning of the project, we were involved,” he adds, “showing them what AI was capable of doing.”


Befitting Facebook’s involvement, those brainstorming sessions were akin to a hackathon, recalls research scientist Piotr Bojanowski, since there was no guarantee that the filmmakers would be able to incorporate the still-experimental cutting-edge algorithms that came out of the research lab.

One particular challenge, recalls Bordes, is that the style-transfer effects FAIR had developed were known to work on content for mobile phones and on 4K video, but no one knew whether they could be scaled to fully immersive scenes like those required for a VR film. Even if they could, Bordes says, “would it look good enough for [Blanquet] to put in his movie?”

The Challenge Of The Unknown

For the folks at FAIR, it was important that the answer was unknown. After all, what’s the fun in going into a project already knowing exactly how it’ll turn out? “If we knew at first” it would work, says Bordes, “we wouldn’t have done it….Usually, when we try to pick these projects, [we ask] are we going to learn something” about AI.

In this case, the unknown was whether style transfer could be applied to a VR project, in which the visual effect has to wrap the full 360 degrees around the viewer, at high quality.

For those who watch Alteration, the answer is a pretty clear yes. The technique didn’t end up working for every one of the 14 paintings by French artist Julien Drevell (plus three corrupted noise images) that Blanquet and Cayrol brought to FAIR to experiment with, but where it did work, the effect is striking: a seamless blending of the pictures on the wall with the entire visual field, gradually subsuming everything in front of the camera in all 360 degrees, including the characters. When a stylized Elsa appears above Alexandro’s bed, you realize the AI has taken over, in every sense of the word.

[Images: courtesy of Facebook]

Having succeeded in applying style transfer to a VR film, FAIR’s scientists will move on to other projects. But they get the satisfaction of knowing they’ve proved the method can work for VR production. In fact, Bordes says, FAIR may well open-source the algorithm it created, in line with Facebook’s common practice of publishing its research.

To achieve the desired effect, the FAIR team trained a series of neural networks to alter the film’s frames. That meant scaling the style-transfer process to professional-quality VR: 4K resolution, 360-degree video, and stereo, the team wrote in a blog post about its work on the film. They also had to train their networks on larger images than originally planned because of the immersive nature of VR. Training the models took over 120,000 hours on powerful computers, not to mention hundreds more hours on even more powerful machines to apply the models to the actual movie frames.
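
The blog post describes that pipeline at a high level rather than in code, but once a network has been trained for a given painting, the application stage plausibly reduces to a frame-by-frame loop like this hedged sketch. The saved model file, directory names, and one-equirectangular-image-per-frame layout are hypothetical, not FAIR’s actual tooling.

```python
# A hedged sketch of the per-frame application stage: once a network has
# been trained for a given painting, stylizing the movie reduces to one
# forward pass per frame. Model file and directory layout are hypothetical.
import glob
import os

import torch
from torchvision.io import read_image, write_png

device = "cuda" if torch.cuda.is_available() else "cpu"

# Hypothetical: a TorchScript-serialized feed-forward style network,
# trained once for one of the paintings.
model = torch.jit.load("stylenet_painting07.pt").eval().to(device)

os.makedirs("frames_styled", exist_ok=True)

with torch.no_grad():
    for path in sorted(glob.glob("frames_4k_equirect/*.png")):
        frame = read_image(path).float().div(255).unsqueeze(0).to(device)
        styled = model(frame).clamp(0, 1)  # single forward pass
        out = (styled.squeeze(0) * 255).byte().cpu()
        write_png(out, path.replace("frames_4k_equirect", "frames_styled"))
```

Even at one forward pass per frame, a 4K, stereo, 360-degree film is an enormous number of pixels, which squares with the hundreds of machine-hours the team reports spending on this stage.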

“It was a big bet for the team,” says Cayrol, “because the process was really original. We were sending the files to FAIR, and they processed them. Then they would send us examples of what came back. It was like opening your Christmas gift.”



ABOUT THE AUTHOR

Daniel Terdiman is a San Francisco-based technology journalist with nearly 20 years of experience. A veteran of CNET and VentureBeat, Daniel has also written for Wired, The New York Times, Time, and many other publications.

