
How Aphex Twin’s “T69 Collapse” video used a neural network for hallucinatory visuals

For the last few years, musician Aphex Twin’s visuals have been created by an equally elusive and hermetic artist: Nicky Smith, aka Weirdcore. Last summer, for instance, Weirdcore created dark yet hilarious visuals for Aphex Twin’s shows at Primavera Sound and Field Day Festival that looked like the hallucinations of a corrupted artificial intelligence.

This week, Aphex Twin and Weirdcore dropped another dose of cyberdelic visuals for the new single “T69 Collapse,” off the Collapse EP, due out September 14 on Warp Records. As with the live visuals, Weirdcore’s video explores shapes, colors, and textures through morphing, glitchy effects. But this time, the artist introduces various cityscapes and terrains into the mix, as if the viewer were experiencing a virtual reality collapsing into a black hole.

Weirdcore tells Fast Company that pulling off the video’s more refined visuals required more advanced render engines, which forced him to switch over to a Windows workstation. His original brief was to create something that, aesthetically, sat between Aphex Twin’s iconic “On” video, with its stop-motion animation shot on the beaches of Cornwall, England, and Autechre’s “Gantz Graf” video, with its morphing 3D-animated machines. But he wanted to do so in a collaged, 3D-scan kind of way.

The video’s intro features lines of text being overtaken by errors; some of it almost reads like email messages between Aphex Twin and Weirdcore discussing the music and video concept. Other bits of text range from code to foreign languages, all of which Weirdcore textures and overlays onto photogrammetry scans of Cornish streets and buildings using UV and position passes in After Effects.
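For the technically curious: a UV pass stores, for every pixel of a render, the texture coordinate of the surface visible there, which lets a compositor swap new textures onto already-rendered geometry without re-rendering. A minimal Python sketch of the idea (the function name and toy arrays are illustrative, not Weirdcore’s actual pipeline):

```python
import numpy as np

def retexture_with_uv_pass(uv_pass, texture):
    """Re-texture a render using its UV pass.

    uv_pass: (H, W, 2) float array; each pixel holds the (u, v)
             texture coordinate of the surface seen there, in [0, 1].
    texture: (th, tw, 3) array, e.g. a rendered sheet of text.
    Returns an (H, W, 3) image with the texture wrapped onto the geometry.
    """
    th, tw, _ = texture.shape
    # Convert normalized (u, v) into integer texel indices (nearest neighbor).
    u = np.clip((uv_pass[..., 0] * (tw - 1)).astype(int), 0, tw - 1)
    v = np.clip((uv_pass[..., 1] * (th - 1)).astype(int), 0, th - 1)
    return texture[v, u]

# Toy example: a 2x2 "render" whose UV pass maps to the texture's corners.
uv = np.array([[[0.0, 0.0], [1.0, 0.0]],
               [[0.0, 1.0], [1.0, 1.0]]])
tex = np.arange(4 * 4 * 3).reshape(4, 4, 3)
out = retexture_with_uv_pass(uv, tex)
```

Because the lookup happens per pixel, any new texture dropped in inherits the perspective and distortion of the original geometry, which is what makes text appear painted onto the scanned Cornish buildings.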

“The first third part is all video processed in a videogrammetry way, followed by a 3D-scan collage of various places in Cornwall, scanned in a photogrammetry way, and then it’s using loads of 3D-scanned textures,” explains Weirdcore. “I was planning to use loads of LIDAR scans, too, but didn’t get around to it.”

At the 1:04 mark, the buildings and streets take on an unreal, virtual appearance: moving, oscillating, and flickering with various textures, shapes, and colors. Weirdcore pulled this off using style transfer, a machine-learning technique that blends two images or videos to create a unique third (see: Google’s Deep Dream Generator).

“It’s Style Transfer techniques using Transfusion.AI over the Cornish photogrammetry collage,” says Weirdcore. “The original animation actually looked . . . low-end, but using several Style Transfer composites/layers in various ways really made a difference.”
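Transfusion.AI’s internals aren’t public, but most style-transfer tools build on the approach introduced by Gatys et al., in which the “style” of an image is summarized by Gram matrices of CNN feature maps, and the content image is iteratively nudged until its Gram matrices match the style image’s. A minimal sketch of that style representation, using synthetic activations in place of real CNN features:

```python
import numpy as np

def gram_matrix(features):
    """Style representation used in neural style transfer.

    features: (C, H, W) feature maps from one CNN layer.
    Returns the (C, C) Gram matrix of channel-wise correlations,
    normalized by the number of spatial positions.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten the spatial dimensions
    return f @ f.T / (h * w)         # correlate every channel with every other

# Stand-in for CNN activations of a style image (8 channels, 16x16 grid).
rng = np.random.default_rng(0)
style_features = rng.standard_normal((8, 16, 16))
G = gram_matrix(style_features)

# An optimizer would then adjust the content frame so the Gram matrices
# of its own activations approach G, while a separate content loss keeps
# its raw features close to the original frame.
```

Because the Gram matrix discards spatial layout and keeps only texture statistics, the transferred style “paints over” the photogrammetry collage without destroying its underlying shapes, which matches Weirdcore’s description of stacking several style-transfer composites as layers.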

Weirdcore also took a 3D scan of Aphex Twin’s face, warped it in familiar visual fashion, and composited it into the landscape collapsing into a virtual black hole. Using Aphex Twin’s audio stems, MIDI files, and BPM-change data, Weirdcore synced the visuals to the music, creating a genuinely mind-warping audiovisual experience.
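Weirdcore doesn’t detail his sync setup, but the core arithmetic of locking visuals to MIDI and tempo data is simple: convert a note’s beat position into seconds via the BPM, then into a frame index via the video’s frame rate. A minimal sketch (the helper name and numbers are illustrative; a track with BPM changes would apply this piecewise, summing the duration of each tempo segment):

```python
def beat_to_frame(beat, bpm, fps):
    """Map a musical beat position to a video frame index.

    beat: position in beats from the start of the track.
    bpm:  tempo in beats per minute (assumed constant here).
    fps:  video frame rate.
    """
    seconds = beat * 60.0 / bpm   # one beat lasts 60/bpm seconds
    return round(seconds * fps)   # nearest video frame to that moment

# Example: at 126 BPM and 25 fps, a note landing on beat 8
# occurs about 3.81 seconds in.
frame = beat_to_frame(8, 126, 25)
```

Driving cuts, warps, and texture flickers from these frame indices is what makes the video feel played by the track rather than merely set to it.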

“These render engines opened the door to different looks and techniques,” says Weirdcore. “But I feel I’m still merely scratching the surface and just getting started with this new look. Definitely more and better of this look to follow.”
