Tom “Squarepusher” Jenkinson talks more like an engineer than most engineers do, and Pitchfork has described his sound as “angry-jazz-droids-run-amok.” So when Jenkinson was asked to compose for a group of robot musicians called Z-Machines, he wasn’t about to say no.
“Part of what interests me is when we listen to a robot, do we listen to it as if we’re listening to a human?” says Jenkinson. “I wasn’t trying to make it emulate a human being, but I was trying to make it do something which I wanted to hear. Now the question remains, is the thing which I want to hear a human being?”
Z-Machines were created at the University of Tokyo by CGI artist Yoichiro Kawaguchi, robotics engineer Naofumi Yonetsuka, and media artist Kenjiro Matsuo. Their robots have musical superpowers. The guitarist, Mach, plays two guitars with the aid of 78 fingers and 12 picks. Cosmos triggers notes on his keyboard with lasers and drummer Ashura uses his six arms to wield 21 drumsticks.
“There were certain points within this set of compositions where there can be no argument that it’s trying to emulate a human performance,” observes Jenkinson, “because there is simply no human performer that could do it.”
Jenkinson’s first goal was to see if he could create emotionally engaging music played by robots. Others have tried. “The initial recordings I had back didn’t have me brimming with confidence,” he says. “Figuratively, it sounded mechanical. It sounded plodding, uninteresting.”
The robots were in Japan while Jenkinson was in the U.K., so he would compose new material, transfer the data, the robots would play it, and he would receive the recordings back. Like any good engineer, Jenkinson first investigated the capabilities of his distant virtuosos. “How fast can they play?” he says. “Are there limits? Does it start to struggle beyond playing a certain number of notes at the same time? Are there timing idiosyncrasies when you start to push towards the limit of what it can do?”
The robot musicians had some serious limitations compared with human performers. The guitarist can play 125 notes a second, extremely fast by human standards, but each of its guitars has only half the frets of a standard instrument. Every note was played at exactly the same volume, so there was no capacity for dynamics or individual string vibrato.
When a human guitarist plays a fast melodic run some notes will run together, particularly when crossing strings, and other notes may be very short or hardly sound at all. The robots, in contrast, played each note exactly on the beat and for the same amount of time.
Having established their engineering tolerances, Jenkinson started to play with the robots’ limitations by overloading them with data. For example, once the drummer receives a command to play a note, the operation of swinging the stick towards the drum, hitting it, and swinging back to a rest position takes a fixed amount of time. If you deliberately send a new note before the drummer’s arm has returned to rest, it takes less time to hit the new note, introducing a timing discrepancy. “That was something that I initially found problematic,” explains Jenkinson, “and then tried to turn it to my advantage to try and sort of generate a kind of robot funk, playing with its idiosyncrasies to generate a sort of swing, a robot swing.”
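The timing quirk Jenkinson describes can be sketched in a few lines of Python. This is a hypothetical model, not the actual Z-Machines control code: the stroke durations are invented, and the assumption is simply that a hit commanded while the arm is still returning starts from a partially raised position, so it lands earlier than a hit from rest.

```python
# Toy model of the "robot swing" effect: a drum arm takes a fixed
# time to swing down and hit, then a fixed time to return to rest.
# Notes sent before the arm is back at rest get a shortened
# downstroke, so their hits drift off the commanded grid.
# All timings here are invented for illustration.

STROKE_DOWN = 0.04   # seconds for the stick to reach the drum head
STROKE_UP = 0.06     # seconds for the stick to return to rest

def actual_hit_times(note_times):
    """Map commanded note times to the times the hits actually land."""
    hits = []
    arm_ready = 0.0  # time at which the arm is fully back at rest
    for t in note_times:
        if t >= arm_ready:
            # Arm at rest: the full downstroke delay applies.
            hit = t + STROKE_DOWN
        else:
            # Arm still returning: it is part-way up, so the
            # downstroke is shorter and the hit lands earlier.
            fraction_returned = 1 - (arm_ready - t) / STROKE_UP
            hit = t + STROKE_DOWN * max(fraction_returned, 0.0)
        hits.append(hit)
        arm_ready = hit + STROKE_UP
    return hits

# Widely spaced notes all land a constant STROKE_DOWN late, so the
# rhythm stays rigid; densely packed notes get uneven offsets.
print(actual_hit_times([0.0, 0.25, 0.50]))        # constant offset
print(actual_hit_times([0.0, 0.08, 0.16, 0.24]))  # uneven offsets
```

Sending notes faster than the arm's full stroke cycle, as in the second call, is what produces the irregular, swing-like displacement rather than a uniform delay.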
He also harnessed his mechanical guitarist’s strengths. “I found its strongest capacity was in polyphony (multiple simultaneous lines of independent melody),” he says. “In respect of human performance, one of the strongest limitations is the interdependence of your fingers, so whatever you are doing with one finger determines what you can do with the others. So if it’s on one string at a certain part of the neck, there’s only so far that you can reach to other frets and so far across the neck, and the further you reach the harder it is to make it accurate. All of those considerations are completely by-the-by when it comes to this robot guitar player.”
Jenkinson never met his robot musicians in the flesh, only in videos. I asked him how he rates them as live performers. “It’s probably a strange halfway house between staring at a drum machine and staring at a human performer,” he says. “You can see the music being made. In this case you can probably see more than you can see with a human being. You can actually see all these pistons moving, you can see the plectrums moving, you can see all the mechanisms which drive the performance. However, my take on it would be that actually it’s probably not quite as compelling as watching a human being because there’s no sense of struggle. It’s like watching an automatic typewriter or a washing machine.”
Then there’s the knotty question of how to assess the musical performance of a robot. “Certainly when I have listened to people’s appraisals of a particular musician’s performance, they’ll talk about ego or they’ll talk about feeling or they’ll talk about expression,” says Jenkinson. “All these things are human attributes so can we usefully employ those terms when we appraise robot performance? For example if we had some kind of guitar heroics, well, if it is robots doing it you can’t criticize it on the basis of it being egotistical because there’s no sweating, egomaniac playing it. However, they might say then that the criticism just falls back on me, the composer. There’s probably no way out in terms of that.”