Google’s new experimental app could help blind and low-vision users run unassisted

All it takes is a mobile phone, bone conduction headphones, and a painted line on the ground.

Today, if someone who is blind or has low vision wants to walk or run, they can do so only with the help of a guide dog or a sighted individual. A new, early-phase research project might change that.

Created in collaboration between Google Creative Lab and Google Research, Project Guideline is a system that uses machine learning technology to help blind and low-vision users run or walk independently. It won Fast Company’s 2021 Innovation by Design Award in the Experimental category, and it has already helped one man, who is blind, run a 5K unassisted. All it took was a mobile phone, bone conduction headphones, and a painted line on the ground.

[Photo: Google]
Project Guideline was designed in collaboration with nonprofit Guiding Eyes for the Blind and its CEO, Thomas Panek, who lost his vision in his early 20s. “Rather than trying to make a thing for everyone, let’s partner with one individual who has a unique set of challenges,” explains Ryan Burke, production lead at Google Creative Lab, “and if we’re able to develop solutions for that individual, the hope is we can take that technology and slowly scale it to more people.”

[Photo: Google]
Here’s how it works: Equipped with a Google Pixel phone attached to a waistband, as well as bone conduction headphones, Panek starts running along a pre-painted line on the ground. The phone’s camera acts as a sensor that detects the line, while an app leverages a machine learning model to interpret each camera frame and tag it as “guideline” or “not guideline.” Because the model runs entirely on the device, the phone needs no internet connection once the app is downloaded, and the process remains completely private.
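As a rough sketch of that per-frame loop (the function names, data structures, and simulated frames below are invented for illustration and are not Google’s implementation), the logic might look like this in Python:

```python
# A minimal sketch of the per-frame loop described above. The model is
# faked with a stub; in the real system it is an on-device ML model,
# so frames never leave the phone and no internet connection is needed.

from dataclasses import dataclass
from typing import Optional

@dataclass
class GuidelineDetection:
    offset_m: float  # lateral offset from the line, in meters (negative = left)

def detect_guideline(frame: dict) -> Optional[GuidelineDetection]:
    """Tag one camera frame as 'guideline' or 'not guideline'."""
    offset = frame.get("line_offset_m")  # stand-in for real model output
    if offset is None:
        return None  # "not guideline"
    return GuidelineDetection(offset_m=offset)

# Simulated frames: runner on the line, drifting right, then line lost.
frames = [{"line_offset_m": 0.1}, {"line_offset_m": 0.8}, {"line_offset_m": None}]

for frame in frames:
    detection = detect_guideline(frame)
    if detection is None:
        print("no guideline in frame")
    else:
        print(f"guideline detected, offset {detection.offset_m:+.1f} m")
```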

If Panek stays on track, he hears a soft, high-frequency hum that assures him the device is working. If he starts drifting, he hears a sound that grows louder and more dissonant the farther he strays. And if he wanders too far off course (a distance that can be adjusted depending on the width of the path), a human voice takes over and shouts “stop.”
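That escalating feedback can be pictured as a simple function of the runner’s lateral offset. The thresholds and messages in this sketch are invented for illustration; only the three-zone behavior (hum, growing warning tone, spoken “stop”) comes from the system described above:

```python
# Hypothetical mapping from lateral offset to audio feedback. The three
# zones mirror the behavior described above; all numbers are invented.

def audio_feedback(offset_m: float, stop_threshold_m: float = 2.0) -> str:
    """Return the feedback a runner hears at a given lateral offset.

    stop_threshold_m is adjustable, mirroring how the system's cutoff
    depends on the width of the path.
    """
    distance = abs(offset_m)
    if distance < 0.3:
        return "soft high-frequency hum (on track)"
    if distance < stop_threshold_m:
        # Tone grows louder and more dissonant with distance from the line.
        intensity = (distance - 0.3) / (stop_threshold_m - 0.3)
        return f"warning tone, {intensity:.0%} intensity"
    return "spoken 'stop'"

for offset in (0.1, 0.5, 1.5, 2.5):
    print(f"{offset:+.1f} m -> {audio_feedback(offset)}")
```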

[Photo: Google]
Burke likens this process to an “audio valley,” and highlights the importance of bone conduction headphones, which deliver sound through vibrations in the jaw rather than through the ear canal. This leaves the user free to hear ambient sound, while the small vibrations against the head serve as an additional warning.

Google Creative Lab has an entire audio team working on this auditory user interface. “It’s one thing for machine learning to recognize the world and it’s another to communicate it effectively to a person who can’t see the world,” Burke says.

To train the model, the team collected hundreds of thousands of images of yellow lines in all kinds of scenarios, both in person and, when the pandemic made in-person collection difficult, digitally.

In November 2020, Panek completed a 5K on Central Park’s northern loop, after the track was cleared and a temporary line was painted on the ground. Prior to the trial, however, a sighted member of the team tested the technology on-site and realized that yellow leaves were confusing the model, because the training data had mostly been collected in the summer. So the team turned to synthetic data: virtual worlds filled with running paths, guidelines, and other adversarial yellow objects. “We gave it hundreds of thousands of scenarios, and we taught it to ignore the yellow leaves,” he says. When Panek eventually ran the 5K, it was smooth sailing.
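A toy version of that synthetic-data step might look like the following; the scene fields and the list of distractor objects are invented for illustration, standing in for the virtual worlds the team actually rendered:

```python
# Toy sketch of generating synthetic scenes that mix the guideline with
# "adversarial" yellow clutter, so a model can learn to ignore the clutter.
# Scene structure and object names are invented for illustration.

import random

YELLOW_DISTRACTORS = ["yellow leaf", "yellow flower", "yellow sign"]

def make_scene(rng: random.Random) -> dict:
    """Build one synthetic training scene: a painted line plus clutter."""
    return {
        "guideline_offset_m": round(rng.uniform(-1.0, 1.0), 2),
        "distractors": rng.sample(YELLOW_DISTRACTORS, k=rng.randint(0, 3)),
    }

rng = random.Random(0)
for scene in (make_scene(rng) for _ in range(5)):
    print(scene)
```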

[Photo: Google]
Right now, the model doesn’t know how to detect and avoid obstacles like pedestrians, cyclists, or even rocks and sticks. More broadly, the team also needs to collect more diverse data that reflects a wider selection of cities, roads, weather conditions, times of day, and more.

Burke says the most important thing right now is partnering with organizations and institutions that have the people and space to test out the technology, like schools and universities. In the meantime, the Project Guideline team is focused on further developing the auditory user interface, as well as building out an external Trusted Tester Program to partner with more blind and low-vision athletes.

Eventually, the team plans to switch from a yellow line to a purple one, which would avoid confusion with the yellow markings transport engineers already use on roadways. “If there’s a [new] line, you can communicate the meaning of that line,” says Burke. “If you put it in a park, you can tell the surrounding community this is a space meant to be shared.”

See more from Fast Company’s 2021 Innovation by Design Awards. Our new book, Fast Company Innovation by Design: Creative Ideas That Transform the Way We Live and Work (Abrams, 2021), is on sale now.
