ChewIt, a lozenge-size, wireless “intraoral interface,” could offer a new way for people who can’t use their limbs to control their personal technology.
ChewIt creator Pablo Gallego Cascón, a graduate student in the University of Auckland’s Augmented Human Lab, wanted to prototype a piece of assistive technology that “doesn’t draw the attention of others and doesn’t make [the user] feel weird.” A paralyzed person might control a wheelchair by blowing or sipping air through a straw mounted near the face, but “these interfaces are not as discreet and natural as they could be,” says Cascón. ChewIt, about the size of a large breath mint, remains undetectable to anyone other than the person using it.
ChewIt functions as a physical button when bitten, and its semisoft exterior encases a tiny accelerometer and gyroscope that register how the device moves inside the mouth. These oral "gestures" can be programmed to act as input commands: Flipping ChewIt over inside your mouth could tell a motorized wheelchair to start moving, "and you could turn your head to control its direction," says Cascón, adding that he is refining designs to reduce any potential choking hazards.
Cascón’s team plans to test ChewIt with real-world wheelchair users this year and, if feedback is positive, develop it into a commercial product. But ChewIt could also be helpful for nonimpaired users, says Suranga Nanayakkara, who heads the Augmented Human Lab. Used as a virtual-reality controller, it could change the view based on head movements and bite interactions while eliminating bulky plastic handsets. That, Nanayakkara says, "could allow AR and VR to become really mobile."
A version of this article appeared in the October 2019 issue of Fast Company magazine.