One of the primary reasons to own a smartwatch is that it acts as a fitness tracker, monitoring your physical movement throughout the day. Most smartwatches are outfitted with accelerometers that measure how movement changes over time; currently, that means tracking whether you’re walking, running, cycling, or sleeping.
Now, two researchers at Carnegie Mellon University have found a way to use the existing accelerometer in an LG smartwatch to also measure what your hands are doing while you’re wearing it. The paper, which was presented at the annual ACM conference on Computer-Human Interaction, points to yet another way that electronics can learn about their users through a sensor that already exists in most of today’s wearables and smartphones.
In the study, the researchers tracked the hand movements of 50 participants, who labeled what they were doing with their hands at regular intervals for nearly 1,000 hours, creating a database of common hand movements. The researchers then devised an algorithm that can discern, with 95.2% accuracy, the sometimes extremely subtle differences between 25 common hand movements, including washing your hands, washing utensils, scrolling on your phone, using a remote, and typing.
To capture such fine distinctions between movements, the researchers put the accelerometer into a high-speed mode that provided more granular information: the orientation of the hand, movement patterns, and even some bio-acoustic information, the micro-vibrations that propagate up the wearer’s arm. Chris Harrison, head of the Future Interfaces Group at Carnegie Mellon University and a co-author on the paper, says this is almost like holding a stethoscope to your hand. A convolutional neural network, a type of machine learning algorithm, was able to find patterns in all this information and associate them with specific hand movements.
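The paper’s exact pipeline isn’t spelled out here, but the general shape of the approach described above — slice the high-speed accelerometer stream into windows, turn each window into frequency features that preserve the micro-vibration content, and feed those features to a small convolutional classifier over the 25 activity classes — can be sketched roughly as follows. The sampling rate, window length, kernel sizes, and weights below are all illustrative assumptions, not the researchers’ actual values, and the network is untrained; the sketch only shows the data flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters -- illustrative only, not from the paper.
SAMPLE_RATE = 4000   # "high-speed" accelerometer mode (assumed rate)
WINDOW_SEC = 1.0     # one window of motion per classification
N_CLASSES = 25       # the study distinguishes 25 hand activities

def make_window():
    """Simulate one window of 3-axis (x, y, z) accelerometer data."""
    n = int(SAMPLE_RATE * WINDOW_SEC)
    return rng.normal(size=(3, n)).astype(np.float32)

def spectral_features(window, n_fft=256):
    """Per-axis magnitude spectrum: coarse motion lives in the low
    bins, micro-vibration ('bio-acoustic') energy in the high bins."""
    return np.abs(np.fft.rfft(window, n=n_fft, axis=1))  # (3, 129)

def conv1d_relu(x, kernels):
    """Valid-mode 1D convolution of each feature row with each kernel,
    followed by a ReLU -- a toy stand-in for the CNN's conv layers."""
    out = [[np.convolve(row, k, mode="valid") for row in x]
           for k in kernels]
    return np.maximum(np.array(out), 0.0)

def classify(window, kernels, w, b):
    """Window -> spectra -> conv features -> pooled logits -> softmax."""
    feats = spectral_features(window)
    conv = conv1d_relu(feats, kernels)
    pooled = conv.max(axis=-1).ravel()   # global max pooling
    logits = pooled @ w + b
    p = np.exp(logits - logits.max())
    return p / p.sum()                   # probabilities over 25 classes

# Random, untrained weights -- in practice these would be learned
# from the labeled database the participants helped build.
kernels = [rng.normal(size=8) for _ in range(4)]
w = rng.normal(size=(4 * 3, N_CLASSES)) * 0.01
b = np.zeros(N_CLASSES)

probs = classify(make_window(), kernels, w, b)
```

The output is a probability for each of the 25 hand activities; a real system would report the highest-scoring class (say, “typing”) for each window.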
“This opens up a whole new world of high-fidelity activity recognition that wasn’t previously possible,” Harrison tells Fast Company via email. “We can, for example, track your typing to recommend (or enforce) breaks.”
Harrison points to other, similarly context-aware applications. For instance, your watch could log when and for how long you’re eating to feed a calorie-counting app. Similarly, it could remind you to drink more water if it detects you haven’t had much on a given day.
While these applications open up new possibilities, they also raise privacy concerns. This research shows that if Apple, Samsung, or another smartwatch maker were to implement a similar system, watch owners could be providing the manufacturer with a granular picture of what they’re doing most of the time, since our hand movements are intrinsically tied to our activities. As a result, paper co-author and Carnegie Mellon PhD student Gierad Laput believes that all the machine learning processing should happen on the device, without connecting to the cloud, and that user consent would be vital.
The study shows how even simple and ubiquitous sensors like accelerometers could be tapped to provide yet more information about how people spend their time. And while this research remains in academia for now, there’s a good chance it will make its way into your smartwatch soon, especially given tech companies’ ravenous desire for data.