Face painting is a simple pleasure most of us gave up when we turned 7 — unless we ran off to join the circus. But singer Olga Bell is bringing it back 21st-century style in her new music video, which uses face-tracking algorithms to follow her facial movements while she sings and project interactive patterns back onto her face in response. The effect feels simple and timeless, yet unavoidably futuristic.
The visualization system was created by Zach Lieberman, Francisco Zamorano, Andy Wallace, and Michelle Calabro. Lieberman is no stranger to doing weird-ass futurey stuff with faces — he's collaborated on similar face-tracking generative art projects with Daito Manabe, who became semi-Internet famous for electrically shocking his face muscles in time to an electronic beat. Bell saw some research Lieberman and Manabe were doing for a live performance and got in touch about making a music video.
Luckily, there's no electroshock involved. "The software is written in openFrameworks and uses a hacked Kinect to track the face," Lieberman tells Co.Design. "The software analyzes the movement of the face and the sounds that the singer makes to adjust the visuals. We created about eight different looks that the singer can manipulate. Everything is happening in real time — it's like a new form of reactive face-painting." The best part? No messy cleanup required.
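To get a feel for the idea, here's a minimal sketch of the core loop Lieberman describes — tracked face motion and vocal loudness blended into drawing parameters, with a "look" selector standing in for the roughly eight preset styles. This is a hypothetical illustration in Python, not the team's actual openFrameworks/Kinect code; all function names and weightings here are invented for the example.

```python
# Hypothetical sketch of a reactive face-painting pipeline (not the
# actual openFrameworks system): face landmarks + audio level in,
# pattern parameters out, once per frame.

def motion_magnitude(prev_landmarks, landmarks):
    """Average per-landmark displacement (in pixels) between two frames."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(prev_landmarks, landmarks)
    ) / len(landmarks)

def visual_params(motion, loudness, look=0):
    """Blend face motion and vocal loudness into pattern settings.
    `look` selects one of several preset styles (the video used ~8).
    The 0.6/0.4 weights are arbitrary choices for this sketch."""
    intensity = min(1.0, 0.6 * motion + 0.4 * loudness)
    return {"look": look, "scale": 1.0 + motion, "brightness": intensity}

# One simulated frame: the jaw landmark drops as the singer opens her mouth.
prev = [(100, 100), (120, 100), (110, 140)]
curr = [(100, 100), (120, 100), (110, 150)]
params = visual_params(motion_magnitude(prev, curr), loudness=0.9, look=3)
```

In the real rig, the landmark positions would come from the Kinect-based face tracker and the loudness from the microphone, with the parameters driving a projected generative pattern each frame.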