Lots of companies are pouring resources into teaching cars to see the world around them. Now a startup called Strap Technologies is developing a wearable pod that uses some of the same kinds of sensors used by autonomous vehicles—radar, lidar, ultrasonic—to give blind people a clearer sense of their surroundings.
“Each sensor has a different resolution, has a different threshold,” said founder and CEO Diego Roel in a Zoom conversation last week in which he showed off a preproduction Strap. “We use the best of each sensor and we combine them.”
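Strap hasn't published how that combination works. As a purely illustrative sketch, one textbook way to "use the best of each sensor" is an inverse-variance weighted average, in which the sensor with the tightest error bars dominates the fused range estimate. Everything here (the function, the example numbers) is an assumption, not Strap's method.

```python
# Hypothetical sketch of multi-sensor range fusion -- Strap has not
# disclosed its algorithm. One textbook approach: an inverse-variance
# weighted average, so the most precise sensor dominates the estimate.

def fuse_ranges(readings):
    """readings: list of (distance_m, variance) tuples, one per sensor."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    return sum(w * d for (d, _), w in zip(readings, weights)) / total

# Example: radar, lidar, and ultrasonic estimates of the same obstacle.
# Lidar (lowest variance here) pulls the fused estimate toward its reading.
fused = fuse_ranges([(2.1, 0.09), (2.0, 0.01), (2.4, 0.25)])
```

In practice a device like this would also have to handle sensors that disagree qualitatively (lidar sees glass poorly, ultrasonics struggle with soft surfaces), which is where combining modalities earns its keep.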
Strap’s chest-worn device weighs less than half a pound and is scheduled to go on sale next summer for $750. (It’s currently available for $500 on preorder.) It calculates the proximity of such hazards as walls, steps, nearby people, and bumps in a sidewalk. Then it conveys this information to the user via haptic feedback—its four straps vibrating according to a grammar that users have to learn.
“The pattern and the strength of this means where is the obstacle, how to avoid it, and how far away it is,” Roel says, adding that the two most expensive components are the device’s radar sensors and microcontrollers. He notes that the device is designed to run for 72 hours on a single charge.
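Strap's actual haptic "grammar" is proprietary, but the idea Roel describes, pattern for direction, strength for distance, can be sketched. The strap layout, thresholds, and intensity curve below are all hypothetical, chosen only to make the encoding concrete.

```python
# Hypothetical haptic-encoding sketch; Strap's real vibration grammar is
# not public. Idea: choose which strap vibrates from the obstacle's
# bearing, and scale vibration strength by proximity.

def encode(bearing_deg, distance_m, max_range_m=5.0):
    """Map an obstacle bearing (-90..90 degrees, 0 = straight ahead,
    negative = left) and distance to (strap, intensity in 0..1)."""
    if bearing_deg < -15:
        strap = "left"
    elif bearing_deg > 15:
        strap = "right"
    else:
        strap = "center"  # straight ahead
    # Closer obstacles vibrate harder; beyond max range, no vibration.
    intensity = max(0.0, 1.0 - distance_m / max_range_m)
    return strap, round(intensity, 2)
```

For example, `encode(-30, 1.0)` would buzz the left strap at 0.8 intensity, while an obstacle 6 meters out produces no vibration at all. The real device presumably layers on temporal patterns (pulses, rhythms) that a scalar intensity can't capture.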
Strap began work on its device about three years ago. “We underestimated the technology complexity we needed to [make] this device,” Roel says of the journey since. There are some 250 people currently testing the device, which has already drawn 200 preorders. Strap needs to refine the design to ready it for mass manufacturing.
The company, headquartered in Austin, with a research and development lab in Guadalajara, Mexico, has shown off earlier versions at such tech events as the Consumer Electronics Show and South by Southwest. But its site leaves much of the product’s workings unexplained, so two outside experts could speak only in generalities about Strap’s prospects.
“It is possible,” says Alexander Leonessa, director of Virginia Tech’s Terrestrial Robotics Engineering & Controls Lab, noting the rapid evolution of lidar sensors, which detect nearby objects and people with pulses of light and are now showing up in some high-end smartphones. “Lidars are getting better and better and cheaper and cheaper.”
But while Strap’s Roel contends “there is nothing like this in the whole world,” Leonessa says that in his earlier role directing the National Science Foundation’s Disability and Rehabilitation Engineering project, he saw other high-tech efforts to replace the white cane for people who are visually impaired.
“I’ve seen many products like this, targeting the impaired vision population with the idea of replacing the white cane or providing the cane with additional information,” he says.
For example, he points to a paper published in October outlining a system that would use ultrasonic sensors in ankle- and waist-worn modules to generate audio guidance for blind users. Toronto-based iMerciv, which received a grant from Microsoft last year, sells a wearable, ultrasonic collision-warning sensor: the $249 BuzzClip.
Leonessa says he believes Strap’s next step should be “a very thorough study with the potential population.”
Chris Danielsen, director of public relations for the Baltimore-based National Federation of the Blind, says that the nonprofit organization received a promotional email from Strap “about how great their product was going to be,” but had not been asked for its input.
Danielsen observes that blind people have been early adopters of enabling technology. “In some ways, blind people were using GPS before drivers were,” he says, recalling apps for Palm handheld organizers.
Today, many people with vision loss or impairment rely on such smartphone apps as Microsoft’s Soundscape and Sendero Group’s Seeing Eye GPS, which convey data about a user’s surroundings via sound. But, Danielsen adds, the white cane still works.
“The way to think about a white cane is that it extends our sense of touch,” he says. “We’re not just randomly groping with them. There’s a technique to using them that’s been developed over the years and actually does a pretty good job of protecting us from obstacles.”
Danielsen also puts in a word for another time-tested form of assistance—guide dogs—noting that they can distinguish between indoor and outdoor places.
Roel, who is sighted, says the technology in Strap’s device allows for more precise guidance than a cane can provide. “Press a button on the device, and we trace an imaginary straight line on the floor,” he says. “We notify the user whenever they move aside from the straight line.”
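That straight-line feature amounts to measuring what navigation engineers call cross-track error: how far the user has drifted sideways from the line fixed when the button was pressed. A minimal sketch, assuming positions in a flat local frame measured in meters (Strap's actual implementation is not public):

```python
import math

# Hypothetical sketch of the "imaginary straight line" check: the user's
# perpendicular (cross-track) distance from a line fixed at the moment
# the button is pressed. Positions are (x, y) in meters.

def cross_track_error(start, heading_deg, position):
    """Signed sideways distance (m) of `position` from the line through
    `start` in direction `heading_deg`; the sign tells which side."""
    hx = math.cos(math.radians(heading_deg))
    hy = math.sin(math.radians(heading_deg))
    dx = position[0] - start[0]
    dy = position[1] - start[1]
    # 2-D cross product of heading and displacement gives the offset.
    return hx * dy - hy * dx

def off_line(start, heading_deg, position, tolerance_m=0.5):
    """True when the user has drifted beyond the (assumed) tolerance."""
    return abs(cross_track_error(start, heading_deg, position)) > tolerance_m
```

So a user three meters along the line who has drifted 20 centimeters sideways stays within a half-meter tolerance, while a full meter of drift would trigger a notification. The hard part in the real product is not this geometry but estimating the user's position accurately enough to feed it.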
Roel acknowledges that even though a walking person moves much more slowly than a self-driving car, guiding a visually impaired pedestrian is the harder task: you can expect cars to stay on roads, he says, but you can’t expect people to follow any one path. Virginia Tech’s Leonessa concurs, adding that on a vehicle, the sensors are fixed firmly in place: “You know where you are looking from,” he says.
In terms of the challenge Strap faces in regard to its product development, Leonessa says: “Just because you’re looking a few feet in front of you, I don’t think that makes the problem easier.”