
What you don’t know about your health data will make you sick

You can’t opt out of a shadowy system that’s hungry to know everything about your health—and without knowing it, you may have opted in to share even more.


“There are a lot of loopholes to HIPAA: about what information is actually protected, who it’s protected from, and whether or not you have waived that protection via your own consent.” [Photos: American Colony (Jerusalem). Photo Dept./Library of Congress; Extezy/iStock; berry2046/iStock]

By Alex Pasternack | 10 minute read

This story is part of The Privacy Divide, a series that explores the fault lines and disparities–economic, cultural, philosophical–that have developed around data privacy and its impacts on society.


Every time you shuffle through a line at the pharmacy, every time you try to get comfortable in those awkward doctor’s office chairs, every time you scroll through the web while you’re put on hold with a question about your medical bill, take a second to think about the person ahead of you and behind you.

Chances are, at least one of you is being monitored by a third party like data analytics giant Optum, which is owned by UnitedHealth Group, Inc. Since 1993, it’s captured medical data—lab results, diagnoses, prescriptions, and more—from 150 million Americans. That’s almost half of the U.S. population.

“They’re the ones that are tapping the data. They’re in there. I can’t remove them from my own health insurance contracts. So I’m stuck. It’s just part of the system,” says Joel Winston, an attorney who specializes in privacy and data protection law.

Healthcare providers can legally sell their data to a dizzyingly vast spread of companies, which can use it to make decisions, from designing new drugs to pricing your insurance rates to developing highly targeted advertising.

It’s written in the fine print: You don’t own your medical records. Well, except if you live in New Hampshire. It’s the only state that mandates its residents own their medical data. In 21 states, the law explicitly says that healthcare providers own these records, not patients. In the rest of the country, it’s up in the air.

Every time you visit a doctor or a pharmacy, your record grows. The details can be colorful: using sources like Milliman’s IntelliScript and ExamOne’s ScriptCheck, a fuller picture of you emerges—your interactions with the healthcare system, your medical payments, your prescription drug purchase history. And the market for the data is surging.

Its buyers and sharers—pharma giants, insurers, credit reporting agencies, and other data-hungry companies or “fourth parties” (like Facebook)—say that these massive health data sets can improve healthcare delivery and fuel advances in so-called “precision medicine.”

Still, this glut of health data has raised alarms among privacy advocates, who say many consumers are in the dark about how much of their health-related info is being gathered and mined.

Typically, Americans’ health data enjoy stringent privacy protections under the Health Insurance Portability and Accountability Act (HIPAA), which passed Congress in 1996. By law, your health information is only meant to be shared with your name, address, and other personally identifying information omitted. Drugmaker GlaxoSmithKline, for instance, now buys anonymized sets of data from DNA testing firm 23andMe.

Yet not all health-related information is protected by privacy rules. Companies can now derive insights about your health from growing piles of so-called “alternative” data that fall outside of HIPAA. This data—what some researchers refer to as your “shadow health record”—can include credit scores, court documents, smartphone locations, sub-prime auto loans, search histories, app activity, and social media posts.

Your health data can be deployed in alarming ways, privacy experts say. Insurance companies can raise your rates based on a photo from your Instagram feed. Digital advertisers can fold shadow health data into ads that target or discriminate against you. The results can feel invasive and predatory: one trend among personal injury lawyers, for example, is geo-targeted ads sent to patients’ phones in emergency rooms.

“It’s not that straightforward,” says Winston. “There are a lot of loopholes to HIPAA: about what information is actually protected, who it’s protected from, and whether or not you have waived that protection via your own consent.”

He adds that some doctors’ HIPAA forms actually include a provision that can remove a patient’s privacy rights. You might sign it without reading—or try to read it, and not understand it.

“The sale of your data isn’t explicitly described in your agreement with your doctor, so from their point of view, there’s nothing for you to object to,” explains Richie Etwaru, the founder and CEO of Hu-manity.co, one of a number of emerging startups that aim to help consumers control and even sell their own health data. “But to make it worse, anonymity is increasingly not guaranteed.”

Indeed, even when personal health data is formally anonymized in accordance with privacy rules, research has shown that such data can still be de-anonymized and linked to you.
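To see how that kind of linkage works, consider a minimal, hypothetical sketch in Python: an “anonymized” medical extract that still carries quasi-identifiers (ZIP code, birth date, sex) is joined against a public voter-roll-style list, re-attaching names to diagnoses. The rows, names, and column labels are invented purely for illustration.

```python
import pandas as pd

# Hypothetical "anonymized" health extract: names stripped,
# but quasi-identifiers (ZIP, birth date, sex) left intact.
health = pd.DataFrame([
    {"zip": "10027", "dob": "1984-03-12", "sex": "F", "diagnosis": "type 2 diabetes"},
    {"zip": "10027", "dob": "1991-07-30", "sex": "M", "diagnosis": "depression"},
])

# Hypothetical public record (e.g., a voter roll) with the same
# quasi-identifiers alongside names.
public = pd.DataFrame([
    {"name": "Jane Roe", "zip": "10027", "dob": "1984-03-12", "sex": "F"},
    {"name": "John Doe", "zip": "10027", "dob": "1991-07-30", "sex": "M"},
])

# A simple join on the shared quasi-identifiers re-identifies the records.
reidentified = health.merge(public, on=["zip", "dob", "sex"])
print(reidentified[["name", "diagnosis"]])
```

This is the classic “linkage attack” the re-identification research describes; in practice, only a handful of such attributes is enough to make most records unique.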


Uniquely valuable health data is also increasingly the target of hackers, ransomware attacks, breaches, or what some patients call just plain shadiness, which has led to litigation and can ultimately further undermine trust in the healthcare system. A 2017 breach at a New York hospital leaked sensitive information about more than 7,000 patients, including addiction histories, medical diagnoses, and reports of sexual assault and domestic violence. Criminals can use that kind of data to commit identity and insurance fraud.

“There’s a great deal of trust that’s placed in our interactions with doctors and healthcare institutions,” says Mary Madden, research lead at Data & Society, who studies consumer and health privacy. “The current process of seeking consent for data collection and use in many health settings is often treated as an administrative afterthought, rather than a meaningful exchange that makes patients feel empowered and informed.”

Specialty reports and shadow health data

For consumers, the harms folded into collecting and sharing health data can feel especially acute when the data is incomplete, stale, or inaccurate. And research suggests that it often is.

Winston has worked on behalf of people who have disputed their so-called “health risk” score, a product created by consumer reporting companies that specialize in collecting health-related information about consumers.

These firms size you up for their clients, which are mostly insurance companies. It’s “mostly” at this point because insurers have established a system that requires us to sign away our federal medical privacy rights in order to apply for life, disability, and long-term care insurance. (This no longer applies to health insurance, thanks to the Affordable Care Act, though the Trump administration wants Obamacare’s ban on denying coverage for pre-existing health conditions to be ruled unconstitutional.)

Your health-related data are compiled into a specialty report akin to the consumer credit reports made famous—or infamous—by Experian, Equifax, and TransUnion. Insurers claim these reports are crucial to evaluating and pricing risk, and they can use this data to raise your rate, or to deny your application entirely. If your application is rejected—what the law calls an “adverse action”—you are legally entitled to receive a copy of your specialty report and to dispute any errors in it.


One of Winston’s clients was the victim of identity theft, and requested their report from Optum. “It was 26 pages of somebody else’s prescription history—which was causing them a denial of a life insurance policy that they were applying for,” he says.

In 2015, the Federal Trade Commission followed up on a 2012 study that investigated the accuracy of these reports. The FTC found that the three major agencies screwed up 25% of the time—that is, 1 in 4 people found an error on their credit report—and fixed some, but not all, of those errors. Many of these errors impacted a financial decision: For example, about 20% of people were placed in a higher-risk tier that would have led them to pay more for an auto loan, with a higher interest rate.

“With these specialty credit reporting agencies, there’s no comprehensive study that’s been conducted,” says Winston. “So the error rate [for the credit agencies] is at least 25%.” For the handlers of health data, he says, “I wouldn’t expect it to be better.”

To improve the accuracy of their data and learn more about your health, data aggregators and their clients—insurance companies, advertisers, and yes, credit reporting agencies—now tap a seemingly endless number of “alternative data” providers. Your shadow health record is drawn from, for instance, your transactions at gyms, vape shops, and health food stores, and your interactions with websites, sleep trackers, medical devices, internet-connected exercise bikes, smartwatches, wearable fitness trackers, blood glucose monitors, pacemakers, and the wide, wide world of wellness apps. In a study published this week in The BMJ, researchers found that 19 out of 24 popular health-related Android apps shared user data with third and potentially fourth parties, including medical conditions, favorite pharmacies or doctors, and even whether a user is a smoker or pregnant.


Related: Watch out: Your private health app data may impact your credit report


All of this health data can be compiled with even more details about you, such as what’s known as the social determinants of health, including “how much you do those things, the people you hang with, the places you go,” said Nicole Gardner, vice president of IBM’s Services Group at a September 2017 “Beyond HIPAA” hearing held by the National Committee on Vital and Health Statistics. “So the region you live, the regions you have traveled to, all of that complexity and that added information, that texture,” she added.

Even Amazon Echo’s connection to a calendar full of doctor’s appointments alongside Uber or Lyft’s trip tracker counts as health data, said Forrester Research analyst Fatemeh Khatibloo. Google Nest’s thermostat also counts, she added—especially if a woman is experiencing hot flashes due to menopause, for example.

“Many people don’t understand that the data from a Fitbit or other health wearable or health device can actually be sold and is, in fact, today being sold. It is being sold for behavioral analytics, for advertising targeting. People don’t understand that is happening,” she told the committee. (After this story was published, a Fitbit spokesperson sent Fast Company a statement saying that the company does not “sell customer personal data, and we do not share customer personal information except in the limited circumstances described in our privacy policy.”)

The demand for all this data is rising, as it has for years. The health data market was approximately $14.25 billion in 2017, according to BIS Research. The firm predicts that in just under seven years—by the end of 2025—the market will grow nearly five times bigger, to $68.75 billion.

Paging Dr. Blockchain

Gardner argued that traditional health data systems—electronic health records and electronic medical records—are less than ideal, given the “rigidity of the vendors and the products” and the way our data is owned and secured. Don’t count on them being around much longer than “the next few years,” she predicted.

The future, Gardner suggested, is a system that runs on blockchain, which she defined for the committee as “basically a secure, visible, irrefutable ledger of transactions and ownership.” Still, a recent analysis of over 150 white papers revealed most healthcare blockchain projects “fall somewhere between half-baked and overly optimistic.”
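Gardner’s definition can be made concrete with a toy sketch. The Python below is not any vendor’s product—just a minimal, hypothetical append-only ledger in which each entry commits to the hash of the previous one, so earlier records can’t be quietly rewritten without breaking the chain.

```python
import hashlib
import json

def entry_hash(payload: dict) -> str:
    """Deterministic SHA-256 hash of a ledger payload."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    """Add a record that commits to the hash of the previous entry."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"record": record, "prev_hash": prev}
    entry["hash"] = entry_hash({"record": record, "prev_hash": prev})
    ledger.append(entry)

def verify(ledger: list) -> bool:
    """Recompute every hash; tampering with any earlier entry breaks the chain."""
    prev = "0" * 64
    for entry in ledger:
        expected = entry_hash({"record": entry["record"], "prev_hash": prev})
        if entry["prev_hash"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"patient": "anon-123", "event": "flu shot", "date": "2019-01-15"})
append(ledger, {"patient": "anon-123", "event": "annual checkup", "date": "2019-03-02"})
print(verify(ledger))                      # True
ledger[0]["record"]["event"] = "altered"   # rewrite history...
print(verify(ledger))                      # False: the chain no longer validates
```

A real healthcare blockchain adds distribution and consensus, and typically keeps the medical records themselves off-chain; this only illustrates the “irrefutable ledger” idea.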

As larger companies like IBM sign on, the technology may be edging closer to reality. Last year, Proof Work outlined a HIPAA-compliant system that manages patients’ medical histories over time, from acute care in the hospital to preventative checkups. The goal is to give these records to patients on their phones, and to create a “democratized ecosystem” that solves interoperability problems among patients, healthcare providers, insurance companies, and researchers. Similar proposals from blockchain-focused startups like Health Bank and Hu-manity.co would help patients store and share their health information securely—and sell it to researchers, too.


Related: Privacy in 2034: A corporation owns your DNA (and maybe your body)


Other privacy-first technologies could help. For instance, an emerging cryptographic method called homomorphic encryption could allow organizations—and eventually you—to access and analyze sensitive personal data without decrypting it.
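As a rough illustration of the principle, the snippet below uses the open-source python-paillier library (phe), which implements the additively homomorphic Paillier scheme: an analytics service can sum encrypted readings it cannot read, and only the key holder can decrypt the result. The readings are invented, and fully homomorphic schemes that allow arbitrary computation are considerably more involved—this is only a sketch.

```python
# pip install phe
from phe import paillier

# The patient (or their provider) holds the key pair.
public_key, private_key = paillier.generate_paillier_keypair()

# Sensitive readings are encrypted before they leave the patient's hands.
glucose_readings = [92, 105, 88, 131]
encrypted = [public_key.encrypt(x) for x in glucose_readings]

# A third-party service can aggregate the ciphertexts without ever seeing
# the underlying values (Paillier supports adding ciphertexts together).
encrypted_total = sum(encrypted[1:], encrypted[0])

# Only the private-key holder can recover the result.
total = private_key.decrypt(encrypted_total)
print(total / len(glucose_readings))  # average reading, computed "blind"
```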

Technical solutions aren’t necessarily the only way to keep your health data safe and secure. Two rules recently proposed by the Department of Health and Human Services would introduce new transparency for consumers, requiring that healthcare providers, vendors of electronic health record systems, and insurers give patients easy and cheap access to their health data.

Shadow health data is coming under scrutiny, too. In January, New York State established a first-in-the-nation rulebook restricting how life insurance companies may collect alternative data such as social media posts. The rules require insurers to demonstrate that their data isn’t being used to discriminate against consumers.

Still, given the sluggishness of healthcare bureaucracies and growing demand, solutions for keeping your most intimate health data safe—and sharing it securely—won’t arrive quickly. And they probably can’t come soon enough.



ABOUT THE AUTHOR

Alex Pasternack is a contributing editor at Fast Company who covers technology and science, and the founding editor of Vice’s Motherboard. Reach him at apasternack@fastcompany.com and on Twitter at @pasternack.

