It started when Wenyao Xu was sitting in a meeting and someone behind him started eating chips, loudly. Xu, a computer science professor at the State University of New York at Buffalo, was inspired rather than annoyed. He realized that if he could recognize what someone was eating without looking, so could a computer.
Working with a team of researchers, Xu designed AutoDietary, a necklace-like wearable that uses the sound of someone chewing and swallowing food to recognize what they’re eating. The goal is to have a simple and convenient way to track someone’s nutrition over time.
“It is unobtrusive to users,” says Xu. “Users don’t need to do anything, and then their daily diets are recorded.” Common methods of tracking diet, such as food diary apps, require self-reporting, which is both more of a hassle and less likely to be accurate. Even more advanced tech, such as food scanners that automatically recognize what you’re eating, is cumbersome and requires effort to use throughout the day.
“Wearing a necklace is as easy as wearing a Fitbit wristband,” says Xu, who tested a prototype with users in a recent study.
AutoDietary uses sensors to pick up the sounds of biting, chewing, and swallowing, and then sends the data via Bluetooth to a smartphone app that can analyze it and record each food or drink. Xu’s team is building a library of sounds for the system to use, and a machine-learning algorithm adapts to each user’s slightly different chewing sounds.
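The paper doesn’t spell out the algorithm, but the basic idea of matching a chewing sound against a labeled library can be sketched in a few lines. The toy below is purely illustrative and is not AutoDietary’s actual method: it summarizes a clip with two hypothetical features (loudness and “crunchiness,” i.e., how often the waveform flips sign) and picks the closest entry in a small library.

```python
import math

def features(samples):
    """Summarize an audio clip as (RMS energy, zero-crossing rate).

    These two features are a stand-in for whatever acoustic
    features the real system extracts from bite/chew/swallow sounds.
    """
    n = len(samples)
    energy = math.sqrt(sum(s * s for s in samples) / n)
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a >= 0) != (b >= 0))
    return (energy, crossings / (n - 1))

def classify(clip, library):
    """Return the library label whose feature vector is nearest the clip's."""
    f = features(clip)
    return min(library, key=lambda label: math.dist(f, library[label]))

# Hypothetical "library of sounds": one labeled example per food.
crunchy = [(-1) ** i * 0.8 for i in range(100)]        # loud, rapid sign flips
soft = [0.2 * math.sin(i / 10) for i in range(100)]    # quiet, slow waveform
library = {"apple": features(crunchy), "oatmeal": features(soft)}

# A new, slightly quieter crunchy clip still lands nearest "apple".
print(classify([(-1) ** i * 0.7 for i in range(100)], library))  # apple
```

A real system would work on microphone frames at audio sampling rates and use far richer features, and the per-user adaptation Xu describes would amount to nudging the library’s feature vectors toward each wearer’s own recorded chews.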
Here’s the sound of cookies: [embedded audio clip of someone eating cookies]
There are challenges: while chewing an apple sounds clearly different from chewing a carrot, other foods are harder to distinguish. Ice cream sounds similar to oatmeal, Xu says, even though the nutritional value is obviously very different. Right now, the wearable can recognize foods with around 84% accuracy, and the researchers are improving that by adding to the food library.
“Our team will push the technology to the marketplace as soon as possible,” says Xu.