Haig Beylerian, a musician and UX consultant, is wearing a helmet as he approaches the drum machine, because there’s no telling what will happen when you tap a drone to make a beat.
The touch-sensitive drone, built in a lab at the University of Toronto, was set up to send data to a computer running ROS (Robot Operating System), which translated it to MIDI and passed it along to a MacBook running the music software Ableton Live 9 and the visual programming language Max. The result is a hovering drum machine capable of boggling the mind, and of giving new meaning to the genre of “drone music.”
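To make the data path concrete, here is a minimal sketch of the three stages described above, with each one stubbed out. Every name, message shape, and scaling constant here is an assumption for illustration, not the project's actual code; the real system ran ROS on one machine and Ableton Live and Max on a MacBook.

```python
# Hypothetical sketch of the pipeline: contact event -> MIDI -> music software.

def read_touch_event(raw):
    """Stage 1 (stand-in for the ROS side): publish an estimated contact event."""
    return {"force": raw["force"], "location": raw["location"]}

def event_to_midi(event):
    """Stage 2: translate a contact event into a (note, velocity) pair."""
    # 38 = acoustic snare, 36 = bass drum in the General MIDI percussion map.
    note = 38 if event["location"] == "top" else 36
    # Scale force into the 0-127 MIDI velocity range (scaling factor assumed).
    velocity = min(127, int(event["force"] * 12))
    return (note, velocity)

def send_to_daw(midi):
    """Stage 3: hand the MIDI pair to the music software (stubbed here)."""
    return f"note={midi[0]} vel={midi[1]}"

raw = {"force": 8.0, "location": "top"}
print(send_to_daw(event_to_midi(read_touch_event(raw))))  # prints "note=38 vel=96"
```

In the real setup, stage 3 would be a virtual MIDI port feeding Ableton Live rather than a print statement.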
The drum experiment wasn’t just about building a newfangled instrument. It was designed to test how humans can interact with robots, specifically flying ones.
“Interaction based on physical contact has many unique benefits, but the implementation is not straightforward for flying robots,” writes the lead student on the project, Xingbo (Isaac) Wang, as part of his flying drum research paper.
Wang, a student at the University of Toronto Institute for Aerospace Studies, had been working on a separate project (a quadcopter’s ability to recover its flight pattern after interaction), but this summer he came up with a way to combine his love of music with drone interaction.
The drone drum carries sensors, including an accelerometer and a gyroscope, to measure applied force and acceleration. Physical contact is estimated from the force and torque created when the user taps or bumps it, with detection restricted to a certain range of frequencies.
To reject vibrations unrelated to human interaction, the signals were filtered. And to simplify the experiment, the quadcopter held a constant height and position, minimizing other variables.
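The filtering idea can be illustrated with a toy example: a first-order high-pass filter strips out slow changes in an accelerometer stream (hover corrections, drift), and a threshold on what remains flags likely taps. The coefficients and threshold below are invented for the demonstration; the paper's actual filter design is more involved.

```python
# Toy tap detector: high-pass filter the signal, then threshold it.

def detect_taps(samples, alpha=0.8, threshold=2.0):
    """Return indices of samples where a sharp tap-like transient is likely."""
    taps, prev_raw, prev_hp = [], samples[0], 0.0
    for i, x in enumerate(samples[1:], start=1):
        # First-order high-pass: keeps fast changes, attenuates slow drift.
        hp = alpha * (prev_hp + x - prev_raw)
        if abs(hp) > threshold:
            taps.append(i)
        prev_raw, prev_hp = x, hp
    return taps

# Slow drift stays below the threshold; the sudden spike at index 5 registers.
signal = [0.0, 0.1, 0.2, 0.3, 0.4, 5.0, 0.5, 0.6]
print(detect_taps(signal))  # prints [5]
```

The same principle scales up: by tuning the passband to the frequencies a human tap produces, motor vibration and wind gusts can be ignored while deliberate contact gets through.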
Still, playing a drone, at least with good rhythm, isn’t simple.
“The timing for the performance was tricky. Unlike a keyboard or drum pad where there is a definitive point of touch and release, it’s harder to decide when a quadcopter sends the MIDI data,” says Beylerian, the musician in the video, and a consultant at Toronto-based software developer WaveDNA. (The company’s beat creation software, Liquid Rhythm, was used in the final demo.)
“Is it exactly when you touch it, when it moves the maximum distance it’s going to move based on that touch (to inform velocity), or something else?” he wonders.
For each tap, the software generates three pieces of data: a MIDI note indicating which drum instrument to play, the duration of the note, and its velocity.
Velocity, or loudness, was calculated from the magnitude of a user’s interaction. The duration of each note was set at a constant 0.25s, equal to a sixteenth note, for the sake of simplicity. For those interested in getting into the weeds, Wang’s paper fully breaks down all the math and calculations he used.
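Put together, one tap produces the triple described above. A short sketch, under stated assumptions: the magnitude-to-velocity scaling constant is made up, and the fixed 0.25s duration corresponds to a sixteenth note at 60 bpm.

```python
# Sketch of the per-tap note data: (note, duration, velocity).

NOTE_DURATION_S = 0.25  # constant sixteenth-note duration, per the paper

def make_note(note_number, magnitude, max_magnitude=10.0):
    """Build the note data for one tap; scaling range is an assumption."""
    # Map interaction magnitude onto MIDI velocity, clamped to 1-127.
    velocity = max(1, min(127, round(127 * magnitude / max_magnitude)))
    return {"note": note_number, "duration": NOTE_DURATION_S, "velocity": velocity}

hit = make_note(36, 7.5)  # a fairly hard tap mapped to the kick drum
print(hit["velocity"])    # prints 95
```

Clamping matters here: an over-enthusiastic shove of the quadcopter should max out at velocity 127 rather than produce an out-of-range MIDI value.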
The team encountered a few problems, says Wang, including latency: there was a delay of about 30-40ms between the interaction and the response, more than with other MIDI instruments. For now at least, this would most likely keep the drone drum out of most drummers’ kits. (Though perhaps John Cage would like to have a go.)
Still, if we can learn how to interact with robots through touch and make music at the same time, the future sounds a little more interesting.