
Algorithms Are Creating A “Digital Poorhouse” That Makes Inequality Worse

A new book, “Automating Inequality,” catalogs the ways government technology for the poor often ends up being punitive rather than uplifting.

[Image:123dartist/iStock]

By Adele Peters | 7 minute read

In Los Angeles, an algorithm helps decide who, out of 58,000 homeless people, gets access to a small amount of available housing. In Indiana, the state used a computer system to flag any mistake on an application for food stamps, healthcare, or cash benefits as a “failure to cooperate”; one million applications were denied. In Pittsburgh, a child protection agency is using an algorithm to try to predict future child abuse, despite the algorithm’s problems with accuracy.

In a new book, Automating Inequality, Virginia Eubanks calls these examples of the digital poorhouse: tech-filled systems that come from a long history of cultural assumptions about what it means to be poor. In the 1800s, when actual, prison-like poorhouses were common, some politicians embraced the idea that people should get assistance only if they were willing to live in the poorhouse. The conditions were so bad, the thinking went, that they would discourage the “undeserving” poor (those seen as not working hard enough) from taking advantage of the system. By the late 1800s, the “scientific charity” movement started collecting data and opening investigative cases to decide who was deserving and undeserving.

New technology used in public services, Eubanks argues, comes out of the same old thinking. “It’s really important to understand that these tools are more evolution than revolution, even though we talk about them often as disruptors,” she says.


She started thinking about this use of technology in 2000, while talking with a young mother on public assistance who was using an EBT card, a payment card for food or cash benefits. “At the time, they were fairly new, and people were pretty excited about them–rationalizing that they were easier to use, there’s less stigma, you look like every other shopper when you’re in the grocery store,” Eubanks says. “She said ‘Yeah, all of those things are true, and it’s more convenient in a lot of ways, but also, my caseworker uses it to track all of my purchases.'”

The push to automate and computerize public services began early, in the late 1960s and the 1970s. Eubanks believes it was a direct response to a national movement for welfare rights: people who had been barred from getting welfare in the past, like people of color and never-married mothers, were suddenly able to participate. Some older rules were overturned in the courts, like the “substitute father” rule, which held that a mother on public assistance should lose her support if she was in a relationship with a man (the rule led to welfare workers invading homes in the middle of the night to check beds for boyfriends). As welfare rights grew, so did a backlash. Technology, touted as a way to distribute aid more efficiently, began to serve as a barrier that limited the number of people getting support.

“I think that these technologies certainly were used to create efficiencies and to ease administrative burdens, but they were also used to help us avoid the very difficult political conversation that we needed to have at that moment,” Eubanks says. Instead of talking about how to deal with economic inequality or the automation of jobs, “We replaced them with a set of questions that are really systems engineering questions: How do you get the most output out of the least input? How do you identify fraud and divert people from eligibility?”

In some cases, the intent to limit aid is more obvious. In Indiana, Republican governor Mitch Daniels launched a welfare reform program in 2006 that he argued would “clean up welfare waste” by automating and privatizing eligibility processes, and rejecting any applications with errors. In one family, a six-year-old girl with cerebral palsy was on Medicaid. When her parents applied for health insurance of their own, and then temporarily put that application on hold, the system automatically considered that a mistake–and punished the family for it by canceling Medicaid for all their children. Because of the automated system, denials of food stamps, healthcare, and cash benefits in the state increased 54%. Food banks ran out of food. Poor and working-class families organized, and the governor finally admitted that the program was a “flawed concept” and switched to a hybrid eligibility system that uses caseworkers along with automation (the new system was still flawed).
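
To make that mechanism concrete, here is a minimal sketch, in Python, of the kind of brittle automation the Indiana episode describes: any gap in an application, including one a family has deliberately paused, is coded as a “failure to cooperate” and triggers an automatic denial. The fields, documents, and rules below are hypothetical stand-ins for illustration, not Indiana’s actual system.

from dataclasses import dataclass, field

@dataclass
class Application:
    applicant_id: str
    answers: dict                      # question name -> answer (None means missing)
    documents: set = field(default_factory=set)

# Hypothetical requirements, invented for this sketch.
REQUIRED_FIELDS = {"income", "household_size", "address"}
REQUIRED_DOCUMENTS = {"id_proof", "income_proof"}

def evaluate(app: Application) -> str:
    """Deny on any gap at all, instead of flagging it for a caseworker to follow up."""
    missing_fields = [f for f in REQUIRED_FIELDS if app.answers.get(f) is None]
    missing_docs = REQUIRED_DOCUMENTS - app.documents
    if missing_fields or missing_docs:
        # Every omission, including a paused or in-progress application,
        # gets coded the same way.
        return "DENIED: failure to cooperate"
    return "ELIGIBLE: forward for benefit calculation"

# A family that pauses its paperwork looks identical, under this rule,
# to one that never responded at all.
print(evaluate(Application("A-102", {"income": 18000, "household_size": 4, "address": None})))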


In other cases, tech may bring benefits to some but still has problems. In L.A., the algorithm that prioritizes homeless people for housing has, in fact, helped some people get housing or other assistance. After homeless people take a survey, they’re ranked, and the system tries to prioritize those who could most benefit from either short-term help with getting a rental or permanent supportive housing. But the algorithm doesn’t address deeper systemic issues, and thousands of others still don’t have a place to live. Eubanks says program administrators see it as triage: a way to make very difficult decisions in a humanitarian crisis, as if it were a natural disaster. That framing means they don’t look at homelessness as a solvable problem.

“Homelessness is not a natural disaster,” she says. “It’s a disaster that was created by human decisions, and that idea that there’s not enough housing is a political choice. Los Angeles chose not to build enough public housing for a city of its size; it has [about] a tenth of the public housing of a city of its size. That’s a political choice. Skid Row chose to knock down 13,000 rooms of housing. That’s a political choice. To now say it’s like a natural disaster to me is a little disingenuous. We created this disaster. We can’t now outsource the results or the consequences of that disaster to a computer.”

Voters in L.A. passed a $1.2 billion bond in 2016 aimed at providing housing for homeless people, and the city seems to want to correct the mistakes of the past. But Eubanks argues that there are still many hurdles, including where new housing will be located, and that the prioritization system risks making other Angelenos complacent, assuming that those who most need help are already getting it.


The algorithm also relies on a survey that asks intrusive questions, including about someone’s sexual history and whether they deal drugs, and then requires the homeless person to sign a consent form to share that data. More than 100 other agencies can access the data, and law enforcement can request some of it from the database without a warrant.
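
The survey-and-ranking pipeline described above can be pictured, in very simplified form, as a scoring exercise. The sketch below is a hypothetical Python illustration, not the actual tool L.A. uses; the questions, cutoffs, and referrals are invented. It only shows the shape of the triage logic: answers become a vulnerability score, the score becomes a rank, and the rank decides who is matched to scarce housing.

def vulnerability_score(survey: dict) -> int:
    """Add up yes/no answers about health, history, and risk into a single number."""
    return sum(1 for answer in survey.values() if answer)

def recommend(score: int) -> str:
    # Invented cutoffs, purely for illustration.
    if score >= 8:
        return "permanent supportive housing"
    if score >= 4:
        return "short-term rental assistance"
    return "no referral"

def prioritize(people: dict) -> list:
    """Rank everyone by score, highest presumed need first."""
    scored = {name: vulnerability_score(answers) for name, answers in people.items()}
    return sorted(scored.items(), key=lambda item: item[1], reverse=True)

surveys = {
    "person_a": {"chronic_illness": True, "prior_eviction": True, "over_60": False,
                 "recent_hospitalization": True, "fleeing_violence": True},
    "person_b": {"chronic_illness": False, "prior_eviction": False, "over_60": True,
                 "recent_hospitalization": False, "fleeing_violence": False},
}
# The ranking, not a caseworker, decides who is matched to the few open slots.
for name, score in prioritize(surveys):
    print(name, score, recommend(score))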

These problems are worth worrying about on the basis of equity alone; with half of Americans living below the poverty line at some point in their adult lives, it’s not a marginal issue. But the problems the poor face now, intense scrutiny and data collection about their private lives, are also likely to reach the middle class. Trump’s budget proposal, for example, promises to save billions in middle-class entitlement programs like unemployment, Social Security, and disability by identifying fraud, reducing improper payments, and collecting and sharing more data. It’s the same argument Indiana used to set up its program for welfare eligibility, which led to a million denials.


“I think we’re really headed for a pretty epic disaster,” says Eubanks. “We have this moment to decide whether or not we’re going to have these difficult political conversations we’re trying to avoid, or whether we’re going to automate inequality. And it’s really like now’s the moment to step in, even if you are not going to be among the majority of Americans who are poor at some point in their life.”

Anyone who works in technology or government should read the book. Eubanks proposes a version of the Hippocratic Oath for data science and suggests that, at a minimum, anyone building this type of tool should consider whether it increases the self-determination of poor people, and whether it would be tolerated if it were aimed at anyone other than poor people.

Digital tools for the poor aren’t inherently bad. An app called mRelief, as one example, helps someone figure out whether they’re eligible for food stamps without filling out intrusive forms, so if they’re not eligible, they don’t have to share that private data with the government. Government, too, could design better services using technology. But some underlying cultural assumptions have to change.
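
The value of that kind of pre-screening is easiest to see in code. The sketch below is a hypothetical Python illustration, not mRelief’s actual rules; the income limits are placeholders, not real SNAP thresholds. The point is that the rough eligibility check runs locally, so someone who wouldn’t qualify never has to hand over detailed personal information in a full application.

# Placeholder limits, invented for this sketch.
HYPOTHETICAL_MONTHLY_INCOME_LIMIT = {1: 1500, 2: 2000, 3: 2500, 4: 3000}

def likely_eligible(household_size: int, monthly_income: float) -> bool:
    """Rough local screen; a 'no' means the full, intrusive application is never filed."""
    limit = HYPOTHETICAL_MONTHLY_INCOME_LIMIT.get(household_size)
    if limit is None:
        return False  # household sizes outside this tiny lookup table aren't screened
    return monthly_income <= limit

# Nothing leaves the user's device: if the answer is False, no application
# (and none of its personal detail) is submitted to the agency.
print(likely_eligible(household_size=3, monthly_income=2800))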

“We have these incredibly persistent stories about poverty in the United States,” Eubanks says. “We have to do really deep cultural work to change the stories we tell about poverty, and until we do that, we’re going to continue to produce these tools that profile, police, and punish poor and working people…right now our politics around poverty is more about moral classification than universal support.”

Real change will require large-scale organizing. “We’re in a moment where we need another National Welfare Rights Movement, or we need another moment of political organizing led by the poor across lines of race and culture, to create the conditions where we’ll get better politics and better technology,” she says. “And I do think that moment is actually happening.”

ABOUT THE AUTHOR

Adele Peters is a senior writer at Fast Company who focuses on solutions to climate change and other global challenges, interviewing leaders from Al Gore and Bill Gates to emerging climate tech entrepreneurs like Mary Yap. She contributed to the bestselling book "Worldchanging: A User's Guide for the 21st Century" and a new book from Harvard's Joint Center for Housing Studies called State of Housing Design 2023.

