With the introduction of Apple HealthKit late last month, the burgeoning wearable sensor market, and other digital health solutions arriving on the scene, digital technology is on the verge of disrupting medicine in profound ways. Wearable technology and mobile phones have become an important element in addressing critical needs that health care faces: fewer nurses, expensive readmissions, and age-old inefficiencies.
In the design world, we are excited about using technology to impact so many lives. We are obsessing over visual interfaces: What is the best screen size in smartphones? Do we like the thickness of the Apple Watch? What are the ethics of wearing Google Glass?
While we argue over 2014-era design issues, a more profound development is happening in labs around the world. Implantable, microscopic sensor technology will soon change our fundamental relationship with technology. Advancing sensor technology has already started to create an entirely new market: invisibles.
We are living in the wearable era. Wearables bring technology and information into users’ consciousness. But they don’t rely on ambient intelligence, they’re not yet integrated into our environments, and they address micro information rather than the bigger picture of our health. They are a necessary step in the evolution of body computing, but a bigger step is about to overshadow wearables, comparable to the smartphone’s impact on the regular cell phone.
We study human behavior at our firm, and we have discovered that wearables get mixed reviews. When we study the new Apple Watch, we see a tremendous effort to address personalization. But will it displace the role of current watches?
Wearables can be put on or taken off, which undermines the continuous monitoring that creates intelligent, actionable data. The current state of wearables is an important step toward something better and bigger.
Invisibles will create a world in which we don’t see technology or sensors; they are seamlessly integrated into the human body. We won’t worry about slick aluminum, glass, or steel. Technology will become human. We will return to ourselves. We have projects in which minimally invasive sensors are implanted into the human body and the biometric data seamlessly connects to a mobile device. Medical device innovators are betting millions of dollars on the belief that invisibles will change behavior, help people adhere to new treatments, and create a better dialogue between caregivers and patients.
We are excited about Apple and Samsung’s efforts to create a digital platform, but we are especially excited about the next step, when the phone will be taken out of people’s hands and be replaced by something invisible, which will allow humans to eat, listen to music, make connections, create, play games, stay healthy, travel–all those things that make our lives truly rich–without any interruption from machines.
For example, we have a seven-year relationship with Starkey Hearing Technologies, the largest manufacturer of hearing aids in the United States and a major global player. We have developed a deep understanding of hearing aids, which are not often thought of in today’s “wearable” context, but are wearables nonetheless. The way users relate to the audio interface is the most important part of the user experience. Hearing aids face many of the same adoption and compliance challenges as other medical devices: stigma, target segment challenges, and the technology IQ of users. This makes them a good microcosm of the broader interface and design challenges facing medical products.
So how do you make them disappear? We created original (yet intuitive) ways to control the hearing aid, such as the natural gesture of sweeping back your hair to avoid drawing attention to the device. We are also trying to make the device “invisible” in other ways, such as reducing its size and creating colors that blend with hair and skin tones. And hearing aids are getting smarter: they can be geotagged so that the next time a user walks into his favorite restaurant, the device automatically adjusts to that room’s noise levels.
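The geotagging behavior described above can be sketched in a few lines. This is a hypothetical illustration, not Starkey's actual implementation: the class names, the 50-meter radius, and the preset names are all assumptions made for the example. The idea is simply that a place the user has tuned once gets remembered, and the device switches presets when it detects it is back in range.

```python
import math

# Hypothetical sketch of geotagged preset switching for a hearing aid.
# All names and thresholds here are illustrative assumptions.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

class PresetSwitcher:
    """Remembers geotagged places and the preset tuned for each one."""

    def __init__(self, radius_m=50):
        self.radius_m = radius_m      # how close counts as "back at this place"
        self.places = []              # list of (lat, lon, preset_name)

    def tag(self, lat, lon, preset):
        """Save the current location together with a tuned preset."""
        self.places.append((lat, lon, preset))

    def preset_for(self, lat, lon, default="everyday"):
        """Return the preset of the nearest tagged place in range, else the default."""
        best = (self.radius_m, default)
        for plat, plon, preset in self.places:
            d = haversine_m(lat, lon, plat, plon)
            if d < best[0]:
                best = (d, preset)
        return best[1]

switcher = PresetSwitcher()
# Tuned once during a prior visit to a noisy restaurant:
switcher.tag(44.9778, -93.2650, "noisy-restaurant")
print(switcher.preset_for(44.9779, -93.2650))  # back near the tagged spot
print(switcher.preset_for(44.9900, -93.2650))  # somewhere else in town
```

In practice the phone, not the hearing aid, would do the geofencing and push the preset over a wireless link; the sketch just shows the lookup logic.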
Through our work on invisible hearing aids, we discovered that ear-based wearables provide information more naturally than screens, and they are essentially invisible if designed well. Neuroscientists at the University of Glasgow found that the visual cortex of the brain actually processes auditory impulses (in addition to visual ones) to provide context to sights.
I believe that invisibles–minimal, simple, intuitive devices that are seamlessly integrated into our lives–will create a tipping point in the adoption of wearable sensors, and will revolutionize health and wellness. To create invisibles, designers will have to start with wearables and start thinking about how to turn someone into a body computer, rather than how to persuade these bodies to interact with computers:
- Understand users’ habits and ceremonies to learn where new behaviors can blend in.
- Focus on value and meaning rather than on the device itself; users prefer less device interaction.
- Where possible, bring anticipatory, context-aware ambient intelligence into the mix.
- Eliminate friction between technology and people; reduced friction will have a big impact on adoption and ease of use.