3 Ways Big Data Is Going To Be Used Against You In The Future

For some people, the benefits of big data will not be worth the risks–which could include increased workplace, police, and consumer discrimination.


Taylor Rodriguez prepares for a flight. After she uploads her schedule to the cloud, the rest is automated: Watched over by a “smart” streetlight, she leaves a bag to be picked up outside her door, takes a self-driving car to the airport, strolls through the gate, and sits in the seat designated by the display inside her contact lenses. It’s simple, breezy, and all the while she’s being monitored by the devices around her for signs of disturbances–in her walk, her emotional tenor, her face.


The White House came up with that scene, or rather the President’s Council of Advisors on Science and Technology did. Last week, the council released a report analyzing future “big data” scenarios we all may face, alongside a 90-day review of the big data practices led by White House advisor John Podesta.

Health care, crime, smart homes, education, law enforcement, employment–these are all areas in which big data has promised to deliver miracles. But are the tradeoffs of privacy for convenience (like Rodriguez’s) something we really want? If they are, how do we make sure that individuals maintain control over how their information is being used?

The researchers, experts, and privacy advocates I spoke to about the White House’s efforts to grasp the lightning speed of developments in the field of big data agreed on one thing: The 90-day review does show a laudable, wide-ranging understanding of some of the risks involved, especially when it comes to the way in which big data can discriminate against individuals.

But how the White House will follow up on the initial diagnostic remains unclear. The Podesta report did suggest a handful of recommendations, including juicing up a Consumer Privacy Bill of Rights proposed by the president in 2012 and amending the Electronic Communications Privacy Act, which currently does not protect emails from warrantless law-enforcement snooping. What is clear is that technological advances are outpacing legislative ones, and many experts fear that the keepers and brokers of our data will have the opportunity to take advantage of vulnerable groups long before lawmakers grasp what’s happening.

You might not be worried about it, but here are three ways in which big data practices might one day affect you.


Let’s say two men, Darnell and Geoffrey, apply for the same job. When their prospective employer googles them, Darnell’s web search brings up an array of ads offering mug-shot searches or criminal background checks. Geoffrey’s web search, meanwhile, shows innocuous ads for food processors and vacations to Honolulu.


As the White House review mentions, we already know from one study that web searches on black-identifying names–like “Darnell” or even “Latanya Sweeney,” who happens to be chief technologist at the Federal Trade Commission–have turned up more ads offering mug-shot searches and criminal-history finders than white-identifying name searches do. Algorithmic ad delivery might seem benign, but if first impressions are indeed important, even ad optimization can put entire communities at a real disadvantage.

It’s not yet clear how ads discriminate on the basis of names, but other marketing techniques have turned up similarly troubling practices. Data brokers like Experian and Acxiom use algorithmic discrimination to lump thousands of consumer profiles into demographic bundles that are then sold to marketers. In one Senate committee investigation, lawmakers found that data brokers were selling consumer profiles with labels like “Ethnic Second-City Strugglers” to help marketers (and whoever else) target these groups.

While the multibillion-dollar data industry has claimed that it only sells bits and pieces of de-identified data–personal details like online purchases and health conditions that avoid naming a full identity–studies have repeatedly shown that de-identified information can easily be reconstructed into a person’s real identity. When that happens, there’s no telling how employers, retailers, or lenders might use that information or connect it to other aspects of your life.
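A toy sketch can show why de-identification is so fragile. The idea, famously demonstrated by Latanya Sweeney, is that “anonymous” records can be linked back to names by joining on quasi-identifiers like ZIP code, birth date, and sex. All names and records below are hypothetical:

```python
# Hypothetical "de-identified" purchase records: no names, but each row
# still carries quasi-identifiers (ZIP code, birth date, sex).
deidentified_purchases = [
    {"zip": "02138", "birth": "1965-07-22", "sex": "F", "purchase": "blood pressure monitor"},
    {"zip": "98101", "birth": "1990-03-04", "sex": "M", "purchase": "running shoes"},
]

# A public roster (e.g. a voter file) that DOES carry names
# alongside the same quasi-identifiers.
public_roster = [
    {"name": "Jane Doe", "zip": "02138", "birth": "1965-07-22", "sex": "F"},
    {"name": "John Roe", "zip": "98101", "birth": "1990-03-04", "sex": "M"},
]

def reidentify(purchases, roster):
    """Link each 'anonymous' record back to a name whenever its
    quasi-identifiers match exactly one entry in the public roster."""
    matches = []
    for rec in purchases:
        candidates = [p for p in roster
                      if (p["zip"], p["birth"], p["sex"]) ==
                         (rec["zip"], rec["birth"], rec["sex"])]
        if len(candidates) == 1:  # a unique combination reveals the identity
            matches.append((candidates[0]["name"], rec["purchase"]))
    return matches

print(reidentify(deidentified_purchases, public_roster))
```

In real datasets the combination of ZIP code, birth date, and sex is unique for a large share of the population, which is what makes this linkage attack work at scale.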


Darnell goes home after a failed job interview, but when he arrives, police are there. They’d like to talk to him because a partial DNA match at a crime scene showed that someone in his family could be a suspect.

Taking a cue from police departments in the United Kingdom, some U.S. law-enforcement authorities are using big data techniques to trawl partial DNA databases in criminal investigations. These types of leads are generated when DNA found at a crime scene resembles DNA in a criminal justice database–a related genomic sequence obtained from a convict or, in some states, from misdemeanor arrestees. Familial searches, as partial DNA matches are also known, could help track down serious criminals, but they also create many innocent suspects–the relatives of people in the system. Cops in a handful of states like California and New York have used partial DNA matching, but with mixed results.

“If we’re not careful with big data, people can be dragged into the criminal justice system,” explains Alondra Nelson, a sociologist at Columbia University. Her work focuses on the intersection of these issues–how, for example, partial DNA matching might create a dragnet of certain communities, not unlike the stop-and-frisk policies used by the police today. When the criminal justice system arrests disproportionate numbers of minorities, it follows that DNA databases could create even larger ripples of discriminatory policing.


The White House review also described efforts by police departments to identify geographical “hot spots” to predict car break-ins. But Kate Crawford, a researcher at Microsoft and MIT, points out that predictive algorithms used in this way could discriminate against entire neighborhoods. “That raises a lot of alarm bells to me from a basic social justice perspective,” she says.
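Hot-spot prediction is, at its core, simple counting: snap past incident locations to grid cells and send patrols to the busiest cells. The sketch below uses made-up coordinates; it also shows where Crawford’s alarm bells come from, since the input is *recorded* incidents, which reflect where police already looked:

```python
# Minimal "hot spot" prediction: bucket past incident coordinates into
# grid cells and return the most frequent cells. Coordinates are made up.
from collections import Counter

incidents = [(47.61, -122.33), (47.61, -122.33), (47.62, -122.34),
             (47.61, -122.33), (47.70, -122.40)]

def hot_spots(points, cell_size=0.01, top=1):
    """Snap each (lat, lon) point to a grid cell and return the
    `top` most frequent cells with their incident counts."""
    cells = Counter((round(lat / cell_size), round(lon / cell_size))
                    for lat, lon in points)
    return cells.most_common(top)

# The busiest cell gets the patrols -- and thus generates even more
# recorded incidents next cycle, reinforcing itself.
print(hot_spots(incidents))
```

Because patrols generate new incident records in the cells they visit, the top cell tends to stay the top cell: the feedback loop that can concentrate policing on particular neighborhoods regardless of where crime actually occurs.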


After all this, maybe Darnell decides to go for a relaxing run. He uses an app to track his route, and at the very end, feel-good endorphins flowing, a notification pops up on his phone showing a discount on Snickers just down the road.

It might sound laughably unethical, but the above could one day be an example of price discrimination based on mobile and online tracking, says Ryan Calo, an assistant professor at the University of Washington School of Law. If retailers can tap into consumer habits, they can also offer customers different deals based on what they know. Calo wonders what kind of impact these practices would have if they continue unchecked.
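The mechanics Calo describes are straightforward to sketch: the same product is quoted at different prices depending on what the retailer has inferred about the shopper. Everything below, from the traits to the multipliers, is a hypothetical illustration, not any retailer’s actual logic:

```python
# Hypothetical tracking-driven price discrimination: the quote for one
# product varies with traits inferred from mobile/online tracking.

BASE_PRICE = 1.50  # illustrative list price for a candy bar

def quoted_price(profile):
    """Adjust the quote using traits inferred about this shopper."""
    price = BASE_PRICE
    if profile.get("just_finished_run"):
        price *= 1.10   # endorphins flowing: inferred willingness to splurge
    if profile.get("price_sensitive"):
        price *= 0.80   # comparison shopper: shown a "discount" to convert
    return round(price, 2)

print(quoted_price({"just_finished_run": True}))   # 1.65
print(quoted_price({"price_sensitive": True}))     # 1.2
```

Two shoppers standing in the same aisle see different prices, and neither can tell, which is exactly the kind of power imbalance the review diagnoses.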

Part of the White House review called for a revamp of the Consumer Privacy Bill of Rights, a white paper issued by the White House back in 2012. The paper asserts that individuals have a right to exercise control over how personal data is collected and used.

But Calo doesn’t think the recommendations in the review go far enough. “I think that [the review] ably and wisely diagnosed the problem as being about discrimination and power imbalances, but that the solutions that it offered weren’t as thoughtful as the underlying diagnosis,” he says.

“I worry that profitable organizations will use big data to disadvantage consumers,” Calo adds. “Intimate knowledge of consumers coupled with being able to design every aspect of interaction–that’s going to create incentives to exploit.”


About the author

Sydney Brownstone is a Seattle-based former staff writer at Co.Exist. She lives in a Brooklyn apartment with windows that don’t quite open, and covers environment, health, and data.