Over the past 40 years of human-computer interaction design, computers have steadily been honed to empower people with disabilities. Today, the iPhone’s touchscreen has an entire mode for people who are blind, while the Google Pixel can caption audio in real time for people who can’t hear. You can play Xbox with almost any physical impairment. You can even drive a mouse cursor or a wheelchair with nothing but your gaze.
We’ve gotten pretty good at making the 2D interfaces on screens accessible to everyone. But how will people with disabilities handle augmented reality, full of interfaces that float in the 3D space of our real world? Put more concretely: How does someone pick up a virtual box if they can’t lift a real one? If software is going to start emulating real life, we need to make sure people with disabilities can still navigate through it.
Dots, the winner in the Student category of our 2020 Innovation by Design Awards, tries to do just that. Developed by students at the Royal College of Art, Dots is a highly simplified, universal controller that enables people with varying physical disabilities to play and work adeptly in tangible 3D interfaces.
Dots places two small accelerometer-loaded leads (the namesake dots) anywhere on someone’s body, from the chin to the tip of an amputated limb. Paired with a Microsoft HoloLens headset, the two dots let the wearer gesture however they’re comfortable while controlling complex 3D holograms.
“If you look at 3D interactions, there are four fundamental movements: move, rotate, scale, and click,” says Valentin Weilun Gong, one of the students who developed the project. “If you can do those four things, you can essentially do anything in a 3D interface.” Dots enables them all.
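To make the idea concrete, here’s a hypothetical sketch (not the Dots team’s actual code) of how two body-worn dots could express Gong’s four fundamental interactions. The positions, thresholds, and `tap_spike` flag are all illustrative assumptions: if the distance between the dots changes, treat it as a scale; if both dots translate together, a move; if the line between them turns, a rotate; a sharp accelerometer impulse (like Alex Lewis’s bite, described later in the story) is a click.

```python
import math

def classify_gesture(a0, b0, a1, b1, tap_spike=False):
    """Classify which of the four fundamental 3D interactions a pair of
    body-worn dots is performing, given their positions before (a0, b0)
    and after (a1, b1) a short interval. All thresholds are made up for
    illustration.

    tap_spike: True if an accelerometer registered a sharp impulse.
    """
    if tap_spike:
        return "click"

    def midpoint(p, q):
        return tuple((pi + qi) / 2 for pi, qi in zip(p, q))

    # Scale: the spacing between the dots changes (the body "opens" or "closes").
    if abs(math.dist(a1, b1) - math.dist(a0, b0)) > 0.05:
        return "scale"

    # Move: both dots translate together (midpoint shifts, spacing constant).
    if math.dist(midpoint(a0, b0), midpoint(a1, b1)) > 0.05:
        return "move"

    # Rotate: spacing and midpoint hold steady, but the A-to-B direction turns.
    v0 = tuple(q - p for p, q in zip(a0, b0))
    v1 = tuple(q - p for p, q in zip(a1, b1))
    cos_angle = sum(x * y for x, y in zip(v0, v1)) / (
        math.hypot(*v0) * math.hypot(*v1)
    )
    if cos_angle < 0.99:
        return "rotate"
    return "idle"
```

Nothing here depends on *where* the dots sit, which is the point Gong makes: only the relative motion of the two sensors matters, so the same four primitives work from a neck, a lap, or a limb.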
To design the system, the students invited able-bodied people into their lab. “We’d recruit like 20 people and tell them you can’t use your [arms], and assign them two body parts randomly—like you can use your neck and your lap—and asked them to control 3D software,” says Gong. The test had no technology involved. The subjects were just miming movements. But through those tests, the researchers began to recognize patterns. For instance, if people wanted to make something bigger, they’d often make their bodies bigger, opening their mouth or parting their legs.
“We realized there’s really some fundamental logic,” says Gong. “So at that time we had a rough idea, if we could capture some body movement, it doesn’t matter where that body part would be. We could still let people control a system.”
With their approach validated, the team began building the Dots system, using a HoloLens, accelerometer stickers, and the Unity software engine. And they began working with people with disabilities, like amputee Alex Lewis, to develop Dots further. “The way he used it is, he put the two dots on his arm when he did bigger motions,” says Gong. “But when he types, he puts one dot on his mouth, and just bites so the system recognizes his movement as clicking. So people can really be creative with our system.”
For now, Dots is a working prototype that must be set up individually for each person using it. The team knows that for Dots to succeed, it needs to adapt seamlessly to each user. So they are developing an algorithm that will let someone place the dots anywhere they like on their body, then perform a few calibration steps that automatically tailor the software to their gestures.
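The calibration idea described above could be sketched like this. This is an assumption about how such an algorithm might work, not the team’s implementation: during a short calibration routine, record the range of motion the wearer actually produces, then rescale live sensor readings so that any comfortable range of movement, however small, maps onto the same normalized 0-to-1 control signal.

```python
class DotCalibration:
    """Illustrative per-user calibration: learn a wearer's comfortable range
    of motion for one sensor axis, then normalize live readings against it."""

    def __init__(self):
        self.lo = None
        self.hi = None

    def observe(self, reading):
        """Feed raw sensor values while the user performs calibration gestures."""
        if self.lo is None:
            self.lo = self.hi = reading
        else:
            self.lo = min(self.lo, reading)
            self.hi = max(self.hi, reading)

    def normalize(self, reading):
        """Map a live reading into 0..1 relative to the calibrated range,
        clamping values that fall outside it."""
        if self.lo is None or self.hi == self.lo:
            return 0.0
        return min(1.0, max(0.0, (reading - self.lo) / (self.hi - self.lo)))
```

The appeal of this shape is that two users with very different ranges of motion, say, a full arm sweep versus a slight chin tilt, end up producing the same normalized signal, which is exactly the place-the-dots-anywhere flexibility the team is after.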
As for commercializing the technology, the team has so far chosen to avoid accelerators and investors, instead refining the platform so it’s ready as augmented and virtual reality become more popular.
“We really don’t want to rush this,” says Gong. “We want to do the right thing at the right time.”
See more honorees from the 2020 Innovation by Design Awards here.