Designing UI and UX for smartphones is difficult enough. It becomes even harder when your target users have physical disabilities. But a growing number of companies, from small startups to tech giants, are hard at work creating smartphone interfaces for quadriplegics and paraplegics. To serve these users, designers have to embrace face tracking, eye tracking, voice controls, and some no-brainer design approaches.
For users who cannot move their own wheelchairs or groom themselves unassisted, being able to operate a phone grants a considerable amount of autonomy.
Giora Livne, an engineer who became a quadriplegic after sustaining injuries in a fall, says being able to operate a smartphone independently changed his life. “It’s like going from the stone age to the space age,” Livne told Co.Labs.
His company, Israel-based Sesame Enable, makes adaptive technology for handless operation of smartphones. Using a combination of face tracking through a phone’s front-facing camera and voice control, Livne—who has restricted facial mobility as well—can make phone calls and operate apps independently.
Sesame’s new Sesame Phone is a modified Google Nexus 5 which uses a combination of head movements and eye controls to replace touch-screen interfaces inside of Android apps. Sesame’s tech relies on an always-on front camera within the phone, which tracks the user’s facial movements. These movements are then translated into control of an on-screen cursor which substitutes for a finger on the touch screen. Voice commands are also used to supplement the face tracker.
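The core idea of translating head movement into a touch-screen cursor can be sketched in a few lines. This is a hypothetical illustration, not Sesame's actual algorithm: the screen resolution, the 15-degree range of motion, and the function name are all assumptions made for the example.

```python
# Hypothetical sketch of head-tracking-to-cursor mapping.
# Screen size and angular range are illustrative assumptions.

SCREEN_W, SCREEN_H = 1080, 1920  # portrait, Nexus 5-class resolution

def head_pose_to_cursor(yaw, pitch, max_angle=15.0):
    """Map head yaw/pitch (degrees from center) to screen coordinates.

    A small angular range is stretched across the full screen, so a
    user with limited head mobility can still reach every target.
    """
    # Clamp to the usable range of motion.
    yaw = max(-max_angle, min(max_angle, yaw))
    pitch = max(-max_angle, min(max_angle, pitch))
    # Normalize to [0, 1] and scale to screen space.
    x = (yaw + max_angle) / (2 * max_angle) * SCREEN_W
    y = (pitch + max_angle) / (2 * max_angle) * SCREEN_H
    return round(x), round(y)

print(head_pose_to_cursor(0.0, 0.0))     # center of screen: (540, 960)
print(head_pose_to_cursor(15.0, -15.0))  # top-right corner: (1080, 0)
```

The design choice worth noting is the clamped, amplified mapping: the smaller the required range of head motion, the more usable the interface becomes for users with restricted mobility.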
The phone is now available for pre-order for approximately $1,000 through the company’s Indiegogo campaign; as of publication, the company has raised $23,000 toward a goal of $30,000. Larger companies have shown interest as well: Verizon named the phone a finalist in a $1 million education-tech competition.
Because it is dependent on nonstop camera operation, battery life is only one hour; Sesame CEO Oded Ben Dov says the device is designed to be used while plugged into a power source.
The video below shows Candy Crush Saga being played via Sesame.
Because both Android and iOS devices are equipped with an array of sensors like gyroscopes and cameras, they allow for some limited but interesting new ways to interface with the phone. The experimental SideSwipe, which was recently featured in Co.Labs, uses Minority Report-style hand gestures to replicate traditional touch screen swiping and tapping.
Although the SideSwipe model, of course, isn’t much use for disabled users, the concept is similar: with the right software, a phone’s sensors can allow for basic gesture recognition, allowing those with limited movement to send a message, play a game, or place an Amazon order. Link the phone to a suite of connected devices, and it could be used as a kind of remote control for an entire home.
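The "basic gesture recognition" idea can be illustrated with a simple threshold classifier. This is not SideSwipe's actual method, which uses GSM signal reflections; it is a generic sketch of turning noisy sensor samples into a discrete UI command, with made-up axis conventions and thresholds.

```python
# Illustrative sketch (not SideSwipe's real technique): classifying a
# coarse swipe gesture from a stream of (x, y) accelerometer readings.
# The threshold value and axis conventions are assumptions.

def classify_swipe(samples, threshold=2.0):
    """Return 'left', 'right', or None from (x, y) acceleration samples.

    A single push past the threshold on the x-axis counts as a swipe,
    so even a small, limited movement can trigger a UI action.
    """
    # Find the sample with the largest x-axis magnitude.
    peak_x = max(samples, key=lambda s: abs(s[0]))[0]
    if peak_x > threshold:
        return "right"
    if peak_x < -threshold:
        return "left"
    return None

readings = [(0.1, 0.0), (1.5, 0.2), (3.2, 0.1), (0.4, 0.0)]
print(classify_swipe(readings))  # → right
```

Real gesture recognizers are far more robust (filtering, calibration, per-user tuning), but the shape of the problem is the same: map a weak, noisy physical signal onto a small vocabulary of commands.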
The Internet offers a small ecosystem of phone interfaces built especially for paraplegics and quadriplegics. Smaller firms like Etoengineering and No Buttons Headset offer modified Bluetooth headsets operable by quadriplegics without assistance. Some vendors also make tablets with UIs designed for the disabled, like Sweden-based Tobii, which manufactures a Windows 8 tablet eye-tracking tool that replaces the conventional Windows UI with entirely eye-based controls.
And, of course, giants like Google, Samsung, and Apple have also given thought to how to make their UIs more accessible for users with limited mobility. Apple, for instance, quietly gave Siri control over more system functions for disabled users, and large Android vendors like Samsung are increasingly offering new voice control functions for phones.
Last month, Samsung introduced the latest version of its EyeCan+ “eye mouse”—a portable box that sits below a monitor and, like the Sesame, allows disabled people to interact with the computer, in this case using eye movements. So far the company plans only to make a limited number of the devices and distribute them to charity organizations.
And just this week, Intel announced an update to Stephen Hawking’s portable communication interface, called ACAT (Assistive Context Aware Toolkit). The first overhaul of his computer in 20 years, it includes a version of SwiftKey and “artificial intelligence” coupled with an existing eye-tracking device, and is built on top of open source software; Intel intends to freely distribute the code in January.
Hawking says ACAT has doubled his typing speed. “The technology that is now being developed to support the disabled is leading the way in breaking down the communication barriers which once stood in the way,” he said in a promo video. Lama Nachman, who manages the Anticipatory Computing Lab at Intel Labs, noted that “technology for the disabled is often a proving ground for the technology of the future.”
Although the list is a few years old, the Christopher and Dana Reeve Foundation has put together a list of apps with UIs easily accessible by wheelchair users. Sesame produces an SDK to create disability-friendly apps for iOS and Android as well.
Ben Dov, the Sesame CEO, suggests that app designers who want to create apps easily operated by wheelchair users emphasize large buttons to click on, and follow a design rule that any user could love: avoid clutter.
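The "large buttons" advice can be made concrete as a simple layout check. This is a hypothetical sketch, not part of Sesame's SDK: the 48 dp minimum follows common Android accessibility guidance, and the element names and sizes are invented for the example.

```python
# Sketch of the large-target guideline: flag UI elements whose touch
# targets fall below an assumed minimum size. The 48 dp floor reflects
# common Android accessibility guidance; the element list is made up.

MIN_TARGET_DP = 48

def undersized_targets(elements):
    """Return names of elements whose width or height is below the minimum.

    `elements` maps an element name to its (width, height) in dp.
    """
    return [name for name, (w, h) in elements.items()
            if w < MIN_TARGET_DP or h < MIN_TARGET_DP]

ui = {"call_button": (96, 96), "close_icon": (24, 24)}
print(undersized_targets(ui))  # → ['close_icon']
```

A check like this could run in a design-review script: anything it flags is a target that a cursor driven by head or eye movement will struggle to hit.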