This Professor Is Learning To Identify Bugs By Their Buzz. Can It Help Eradicate Malaria?

A sensor-based big data project backed by the Gates Foundation is tackling the tricky subject of forecasting insect densities–and the plagues that come with them. Here’s how it could save lives.

[Image: Flickr user photochem_PA]

A combination of hunger and disease kills millions of people each year in the developing world–especially insect-borne diseases like malaria. So an ambitious project that tackles both scourges at once–like the one at the University of California, Riverside–could change the lives of tens of millions of people.


Using cutting-edge machine learning techniques and inexpensive sensors, a small team of researchers led by Professor Eamonn Keogh thinks a solution lies in the sounds insects make in flight. He and his team are applying machine learning algorithms to recordings of insect noises in the hope that they can audibly distinguish a harmless bug from a harmful one. Harebrained? Not to the Gates Foundation, which has shown interest to the tune of a $100,000 prize awarded as part of its Grand Challenges in Global Health initiative.

Here’s how Keogh is using that money to smartly combat insect-borne human suffering.

The Challenge Of Insects

“Insects rule this planet. As mammals we’re really more of an afterthought,” says Eamonn Keogh. Indeed, each year insect-vectored diseases like malaria kill more people than most wars, while herbivorous bugs wreak havoc on the crops we grow. Yet we also rely on insects to pollinate the majority of crop species that we eat, and to feed the birds which control other pests and support ecosystems–so it’s not as if we can simply kill them all. Being able to understand insect behavior–and sort the good insects from the bad–is therefore of utmost importance, both for eliminating insect-borne diseases and keeping people fed.

“A staggering one-sixth of the world’s total population–over a billion people–is malnourished,” says Agenor Mafra-Neto, a chemical ecology researcher and CEO of ISCA Technologies, a company specializing in the development of semiochemical solutions for pest management, robotic smart traps, and nanosensors. “At least 6 million children die of hunger every year. One of the simplest answers to solving this problem is to efficiently grow and store more food at the local level where it is needed. Monitoring insects in real time will allow for earlier, very targeted pest control that will ultimately improve food yields and save lives in impoverished rural areas.”

For years, people have tried to keep tabs on insect populations, often using lo-fi and inaccurate measurement tools. The most popular is the sticky trap: a piece of cardboard coated in a layer of glue. Insects get stuck to it, and they can then be counted and used to make forecasts. The problem is that sticky traps are inaccurate, and can take up to a week to deploy and process. When dealing with insects whose adult life-span may well play out over that time, a week is about six days too long.

What Would James Bond Do?

“I thought that if you would be able to gather this information in real time, the interventions you could stage would be much more accurate,” says Keogh. “This can have an enormous impact on how you respond. For example, if I was to tell you about a particular infestation now you might be able to go into a field with a hand spray, spray each corner of the field, and the problem’s gone. But if you had to wait one week [to be told the same information], you would have to bring in a helicopter or an airplane to blanket spray. On both a cost and an environmental basis, that difference is massive.”


Keogh’s idea relies on classifying insects by recording the sound that they make. Until recently, this methodology has been limited by the shortcomings of traditional audio recording devices.

“The problem with these devices is that insects don’t make a lot of noise,” Keogh says. “If you make [your recording device] more sensitive by having a higher gain, the moment a helicopter flies past or a dog barks in the next field it will swamp the recorder–and probably blow your eardrums off in the process.”

Keogh’s concept is instead based around a specially designed optical sensor that doesn’t record sound, but rather uses lasers to record “pseudo-sound.” If this all sounds like something right out of a James Bond movie, that’s because it is. When Keogh was a kid, he was struck by a scene from a spy movie he saw on TV. One character was spying on another by shining a laser onto the window of a room, where a secret conversation was being held. By measuring the vibrations the speech patterns caused in the glass window, the spy was able to translate the reflected light back into sound waves–thereby allowing him to listen to the conversation from a great distance.

“I remember thinking to myself as a kid, ‘What would happen if a bee flew past the laser?’” Keogh says. “That idea was in my head all those years. It was only later on, when I started working in this field, that I got a chance to use that insight.”

For his version of the optical trick he doesn’t rely on a reflective window, but rather on a photodiode, which measures the amount of light arriving from a laser source. When an insect crosses the beam, its wingbeats modulate that light, embedding the insect’s “sound” in the signal. Using this method, the sound of an insect can be recorded in crystal-clear quality–even down to how many times it flaps its wings per second.
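The article doesn’t describe Keogh’s actual signal processing, but the core idea–pulling a wingbeat rate out of a fluctuating light signal–can be sketched with a standard Fourier transform. Everything below (the sample rate, the simulated 600-beats-per-second insect, the 50 Hz noise cutoff) is an illustrative assumption, not the team’s implementation:

```python
import numpy as np

def wingbeat_frequency(signal, sample_rate):
    """Estimate the dominant wingbeat frequency (Hz) from a
    photodiode trace by finding the strongest spectral peak."""
    # Remove the DC offset (the steady laser baseline).
    signal = signal - np.mean(signal)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Ignore very low frequencies (drift, slow ambient light changes).
    mask = freqs > 50
    return freqs[mask][np.argmax(spectrum[mask])]

# Simulate one second of a photodiode trace for a hypothetical insect
# beating its wings ~600 times per second (a mosquito-like rate).
rate = 8000
t = np.arange(rate) / rate
trace = 1.0 + 0.2 * np.sin(2 * np.pi * 600 * t)
print(int(wingbeat_frequency(trace, rate)))  # → 600
```

Because the photodiode signal is periodic at the wingbeat rate, the strongest peak in its spectrum directly reveals how many times per second the insect flaps its wings.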

Unlike traditional audio recorders, Keogh’s method also means that nothing is recorded outside of the small recording plane of his sensors. “Anything outside of the sensor’s plane of interest won’t get picked up,” he says. “You could have a machine gun going off and, if it’s outside of the area we’re focused on, it won’t show up on our recordings.”


Data Mining The World’s Insect Population

Of course, recording the sound isn’t everything–you also need to identify it. This is where the project’s software component comes into play. “We’re the first people in the world to do big data for insects,” says Keogh. “We’ve been able to train models to recognize insect sounds and flight patterns.”

This was more difficult than it sounds. One of the key researchers in this area is Yanping Chen, a PhD candidate in computer science and engineering. “One of my main tasks was to find a good classification algorithm that would let me identify the individual species of insect,” Chen says. “We spent a lot of time collecting the data–24 hours a day, for several years, recording multiple insects in parallel.” Once this was done, it was her job to help create the machine-learning tools that would sort one insect from another. To give a sense of how tricky this task is, she compares it to the typical task of a smart email service, which has to sort messages into only a handful of categories–spam, business, personal, and so on. With insects, on the other hand, there can be hundreds of thousands of potential classes.
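The team’s actual models are far more sophisticated, but the basic idea of sorting one insect from another by a flight signature can be illustrated with a toy nearest-centroid classifier. The species names and wingbeat frequencies below are made-up illustrative values, not the team’s data:

```python
import statistics

# Hypothetical training data: wingbeat frequencies (Hz) recorded
# for three illustrative classes. Real species and rates will differ.
training = {
    "mosquito": [580, 610, 595, 602, 588],
    "housefly": [190, 205, 198, 210, 201],
    "bee":      [230, 245, 238, 242, 236],
}

def classify(freq_hz):
    """Nearest-centroid rule: assign a reading to the class whose
    mean wingbeat frequency is closest to the observed one."""
    centroids = {name: statistics.mean(vals) for name, vals in training.items()}
    return min(centroids, key=lambda name: abs(centroids[name] - freq_hz))

print(classify(600))  # → mosquito
print(classify(200))  # → housefly
```

With hundreds of thousands of potential classes, a single feature like this would never suffice–which is why the team needed richer features and better algorithms, as described below.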

There exist, for instance, 3,528 types of mosquito. Of these, a minuscule 3% cause problems for humans, while the rest are totally harmless or even beneficial. Having an algorithm that can simply tell a mosquito from a cicada isn’t enough–it also needs to say what kind of mosquito you’re dealing with. With that kind of granularity required, the team turned to the music world, where they looked at algorithms used to analyze compositions and recognize the individual instruments within them.

The data can then be used to make efficient recommendations for vaccines and possible interventions. Malaria, for example, has well over 100 possible interventions that can be staged–from handing out bed nets in a home, to spraying various types of pesticide. Almost all of these cost money, and Keogh’s work can help make these recommendations as accurate as possible. Unlike tests such as the sticky trap, this real-time data can provide minute-to-minute details about insect populations whose behavior can vary enormously depending on time of day, weather, and environmental patterns.

“If you have bad information, it can be almost worse than no information,” he says. “That’s our aim: to provide pinpoint-accurate information that can be used.”

Where Do We Go From Here?

While Keogh’s work has so far remained a laboratory investigation, he is now keen to roll it out into the field. Doing so has meant modifying his experiments–including developing far cheaper sensors.


“We knew that if we had an expensive sensor, there was no way we could do this practically,” he says. “We want to have 100,000 or even 1 million of these sensors deployed in the field throughout Africa and Southeast Asia. You cannot do that with a $100 sensor, since it would be prohibitively expensive–and the sensors would also likely get stolen and sold for parts.”

His goal is to create a low-cost sensor for no more than $5–a goal he is getting closer to by the day. “When we started this project our first sensors cost $1,600–now we’ve got it down to around $10-15,” he says.

For both Keogh and the team working with him, this is far more than an academic computer science project.

“Most of the work you do when you’re studying computer science is just for publishing a particular paper,” says Yanping Chen. “Here I can see that the work had the potential to save lives. It gives me a sense of accomplishment to work on something like this. There will likely be more challenges as we deploy our research in the field, but I’m confident that we can overcome them, and that this will be a valuable tool for fighting malaria and other diseases.”

“This is going to be my life’s work,” says Keogh. “You always hope that you’ll be able to do something which can help people in a real-world sense–and to find a project like that is quite rare in computer science. I always wanted to try and find something that would leave a lasting impact. I can’t think of anything more important I could be trying to do than to save lives.”