When your health depends on a wearable

For the well, strapping on the latest wearable might be a lifestyle choice. But for the chronically ill, living with a wearable is a more complicated experience.

[Photo: Click_and_Photo/iStock]

On Christmas Day several years ago, I opened a package from my father. It was a Fitbit, with a Walmart gift receipt taped to the packaging. My dad had mailed it to California because he is a dad born in a certain year, when the latest electronic gadget was always the most covetable gift.

At the time, I couldn’t tell you why the thought of using the Fitbit rankled me. But years later, it’s become clear. To my well-meaning father, it didn’t matter that I’d tracked my body’s functions since I was diagnosed with Type 1 “juvenile” diabetes at 14, siphoning blood from my finger into a test strip five or six times a day. We were entering an age of optimization, and he wanted to give me access to those insights, as effortlessly as donning an ugly silicone bracelet. As I’d moved from glass bottles and syringes to preloaded pens, from a beeping, Tamagotchi-like blood glucose monitor to a sleeker version in a molded carrying case, I’d discounted tech’s entrée into the health sphere. Silicon Valley’s biohackers and suburban keto adherents alike were now blithely choosing to gather data about their bodies, a decision that had been foisted on me and other chronically ill folks as a medical necessity.

It makes sense that wearables, a booming industry valued at around $24 billion that runs the gamut from Apple Watches to simple pedometers, would collide with chronic illness as technology grew more sophisticated and the medical industry sensed an opportunity for streamlined care. That old holiday gift now feels as clunky as a flip phone compared to the strides wearables have made in accuracy and measurement over the past five years, even if a lot of wellness devices inflate their abilities. There’s money to be made in medical tech just like in any other variety (see: the myriad financial implications of COVID-19). And on a more altruistic note, there’s data to be gathered: for a patient, for a provider, and for a larger public health intervention, explains Dr. Robert Furberg, a research health informaticist at RTI International. He says that sensor data has three distinct capabilities in a health context: pure observation, intervention based on those observations, and future disease forecasting.

Sometimes the fact that one device shows something similar to another device makes them seem equivalent.”

Matteo Lai, Empatica

As a result, wearables and their associated apps are proliferating across a range of diseases and conditions, from portable dialysis machines to devices for COPD, a chronic pulmonary condition. But there’s a substantial difference in stakes between accurately counting the calories a well person burns on a treadmill and sensors predicting the onset of an epileptic seizure, for example. As nebulous as it is to sort people into the well versus the sick, it’s equally hazy for most consumers to distinguish between a wellness device and a reliable piece of medical technology that must go through the rigorous, expensive FDA review that a private tech company could (and usually would) forgo.

The underlying technology that powers a Fitbit and a motion-sensing medical device might be very similar, but the sensitivity and quality of their sensors, and the algorithms each uses to draw conclusions, can span the difference between a budget step counter and a sophisticated diagnostic tool.

“Sometimes the fact that one device shows something similar to another device makes them seem equivalent,” says Matteo Lai, cofounder and CEO of a company called Empatica that makes the Embrace 2, an FDA-approved, doctor-prescribed wearable that detects epileptic seizures.

“Let’s take Fitbit—you cannot use a Fitbit officially to adjust the dosage of a drug for insomnia because it’s showing a guy not sleeping. It’s not officially approved for that. It can do the job, sort of, but it can’t claim that,” he says. “And the difference between that is huge, because the difference between running a clinical trial for three years, collecting the data, and submitting to the research community, to the FDA—there’s a very rigorous process. You have a quality system that you have to maintain. You have to do things in a certain way, which is not like the startup of five people, the ‘kind of’ testing, the ‘kind of’ works. ‘Kind of works’ doesn’t really work, in this case.”

The sprawling wearable data complex

The Embrace 2, like other medical-grade, FDA-approved devices, collects a barrage of data at a higher rate than consumer wearables and sends it to one of two companion apps: one for patients and their caregivers, another for their providers. As the field expands, with new interventions in development for everything from glaucoma to diabetes implantables to chronic pain management, so does the aggregate information that these manufacturers (and potentially numerous other stakeholders) possess.

“In the app market, data is very much the currency,” says Quinn Grundy, an assistant professor in nursing at the University of Toronto and a senior lecturer in pharmacy at the University of Sydney.

In a study published last year in the medical journal BMJ, Grundy and five other researchers analyzed the data-sharing practices of popular medical wearables and apps. “Many of the big app developers and companies are still running losses in terms of revenue, but are valued very highly, and we suspected this is because of the data they could collect.”

In the app market, data is very much the currency.”

Quinn Grundy, University of Toronto

Between opaque privacy policies and a regulatory Wild West, apps and wearables can fall into a liminal space between the FDA and the FTC, depending on their claims. So Grundy and her team followed information as it flowed into and out of 24 apps.

“I think what we took away from this is that even if an individual app only picks up a couple of pieces, a few little breadcrumbs from one app—it’s then deidentified or aggregated within a wider mobile ecosystem, and people who are centrally positioned within that ecosystem can collect pretty detailed profiles of an individual, even if they don’t know their name,” she says.

Information is collected passively, through browsing history or time zone, or input manually, like medication lists or the timing of symptoms. From there, it flows through a digitized tributary: first to infrastructure, like Amazon Web Services or cloud storage (fairly anodyne), then to developers (it depends on what they do with it), and finally to parent companies (potentially insidious).

“When we looked at the developers themselves, some of them were using that information to market back to users their own products and services. Some actually reported selling deidentified or aggregated data to interested stakeholders, and others were a little bit more vague about what was done,” Grundy adds. “A lot of them were using it for very tailored and targeted advertising.”

Fine, have fun with the depersonalized readings from my continuous glucose monitor, Big Tech.”

There’s admittedly a part of me that thinks, “Fine, have fun with the depersonalized readings from my continuous glucose monitor, Big Tech.” But the implications stretch beyond the annoyance of targeted ads or junk mail. Adopt a Galaxy Brain view and even a stream of faceless numbers could dramatically alter the lives of individuals (see: Target’s detection of a teenager’s pregnancy before her parents figured it out) or larger groups (Grundy invoked the Cambridge Analytica scandal, which mined fairly basic data, initially shared voluntarily, that might have altered the course of a presidential election).

It can get dystopian if you forecast worst-case scenarios: public policy around mental illness based on the number of wearables detecting depression, crackdowns on addiction because implantables showed agitation in a certain population, price gouging because a pharmaceutical company had cornered some particularly vulnerable geographic market. It’s harder to imagine dire consequences flowing from data analysts learning that people love running in Central Park, or that most people go to the gym on Tuesdays—yet another illustration of the vast difference between a device that might better your quality of life and a device that is, itself, a lifeline.

The reality of living with a medical wearable

What researchers and experts couldn’t answer was the reason this much data feels so psychologically onerous. I called one of my best friends, Nick Andersen, a podcast producer in Cambridge, Massachusetts, who also wears a continuous glucose monitor. He understood why something so outwardly helpful, a machine that measures my biometrics effortlessly and consistently, can seem like such an albatross.

“I’m so grateful for [my continuous glucose monitor], but it’s a constant thing. It makes it more present, even if it’s just to me,” he mused. “Yeah, it does feel like a little bit of a burden, even if the burden is the burden of improved care. I’m conscious of that contradiction.”

Maybe part of my wariness of something so innately marvelous—No more finger sticks! A graph that charts our bodies’ trajectory throughout the day, with no two-hour mystery gaps between one reading and another!—is the fear of overreliance. Technological innovation certainly moves faster than the FDA; it also feels rapid when, in 16 years, you’ve gone from a clunky meter the size of a flip phone to a discreet, flat oval affixed to your upper arm.

Nick, who has friends who use insulin pumps, notes that beneath the overt convenience of punching a cartridge worn on the skin and receiving medicine, there are still pitfalls: technology going rogue, unpredictable failures, issues that crop up unexpectedly, albeit infrequently.

In the meantime, I stay wary of outsourcing a life-or-death process to the cloud.”

“They get used to the technologically enabled ease of a pump, and that really helps them not think about it,” he says. “Which, you would think you and I would both be like, ‘Let’s do it!’ But they miss things, because they don’t have to think about it. I don’t enjoy the medical and psychological burden that diabetes places on me, but I also think I would enjoy even less losing that ability to make choices.”

Give a well person a wearable and they can laud or berate themselves for miles walked, carbohydrates ingested, a rough approximation of their levels of stress; give one to someone burdened with a long-term disease and, all at once, it will help tighten control and prove the futility of such a concept. Bodies, like data sets, are known to follow predictable patterns and then veer suddenly off course; how I digest a lunchtime apple depends entirely on my body’s machinations, which aren’t nearly as clean as a well-tested algorithm. Wearable tech will better sick people’s lives, and once it starts replacing organs altogether, the benefits could outshine any consideration of privacy or ownership of data. In the meantime, I stay wary of outsourcing a life-or-death process to the cloud. A biohacker by choice can forget this reasonable amount of suspicion, but I hang onto it just in case.

By that January, I’d driven to the nearest Walmart and exchanged the Fitbit for a pair of noise-cancelling headphones and a bag of cat food. My body was not as optimized, but my mind was elsewhere, on a record that drowned out the sounds of construction in my neighborhood.

“People are really wrestling with how to find the signal-to-noise ratio—the expectation that there’s a ton of garbage that you can ignore, but how do you find what’s actually meaningful,” Furberg says of this new data frontier. The world comes with its share of static, and attention is a commodity—we only have so much to dole out, toward apps or metrics or fitness. The goal is unthinking automation; what a beautiful, terrifying thing.