Go to a government technology or innovation conference and look around, and you’ll see that the majority of attendees tend to be white and male. In 2015, women made up 22% of the government tech workforce, while white men made up over 50% and white workers overall made up about 70%. The numbers are even more lopsided at the highest levels of tech companies.
Although women are underrepresented in technology, they make up a growing portion of the civic innovation workforce. On New York City’s Design and Product Team, for example, 6 of the 10 staff members are women. The staff of the United States Digital Service, which brings technologists into the federal government, is 50% female, and the leadership is over 60% female. The staff of Code for America, whose mission is to make the government work in the digital age, is 65% women or nonbinary people, and its leadership team is 76% women or nonbinary people. And our own Public Interest Technology team at New America is run by a staff of all women, half of whom are women of color.
But despite the prevalence of women in civic innovation, we have a long way to go to make the demography of the field match the demography of the United States. And there’s an even bigger challenge: drawing more leaders from the communities they serve. We heard from numerous practitioners in public interest technology that diversity of all kinds is critical to appropriately designing services for people.
Vivian Graubard, one of the founding members of USDS, discovered that her background as a Spanish-speaking first-generation American gave her insight into her immigration policy work that the rest of the team didn’t have. The USDS team was building a new system, and Graubard felt strongly, based on her personal experience, that it should exist in Spanish, as the majority of users would be native Spanish speakers.
“I have family members who speak English, but it’s not their first language,” she explained. “The complicated legalese of what you’re asking them is just beyond where they’re comfortable. I understand what people are going through when they’re using these systems—they are nervous. They don’t want to answer the question incorrectly. They don’t want to sign their name to something that could be wrong and then send it to the U.S. government. So it matters that they really understand what question they’re being asked.”
In New York City, a diverse set of perspectives on the team meant a better understanding of data on multiple projects. In 2017, the city decided to bulk up its long-standing fight against rats. By tapping into 311 call center data, the team figured they could pinpoint where the worst rat problems were in the city and target their efforts there. But when Amen Ra Mashariki, the chief analytics officer in the Mayor’s Office of Data and Analytics, looked at the data, he immediately saw a problem. Mashariki had grown up in a public housing project in the Bedford-Stuyvesant section of Brooklyn and still lived in the neighborhood. When he came home late at night from the mayor’s office, he saw rats scurrying about. So when he first got the 311 data, he did the normal thing you’d do when looking at data—he looked at his own neighborhood.
“I thought, that’s really weird,” Mashariki recalls.
According to the data, there wasn’t much of a rat problem in Bed-Stuy. No matter which way his team sliced the data, it kept showing very few rats in the neighborhoods that Mashariki knew from personal experience were rat-infested.
Mashariki called up an old friend who still lived in the projects and asked what had happened to all of the rats. According to the city’s data, he explained, there were no more rats there. His friend laughed. There were plenty of rats, he assured Mashariki.
“Then why doesn’t anyone call 311 to complain?” asked Mashariki.
“What’s 311?” his friend replied.
This story illustrates not only the importance of validating where your data comes from—study after study has shown that wealthier white people are more likely to lodge complaints with 311—but also why the people parsing the data need to represent a variety of perspectives. Had Mashariki not grown up in a primarily low-income neighborhood, he might not have seen the gaps in the 311 data. But thanks to his background and curiosity, he was able to bring a perspective to the project that was sorely needed.
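The reporting gap Mashariki found can be made concrete with a small sketch. The idea is to compare complaint volume against an independent signal of the underlying problem: a neighborhood with heavy rat activity but few 311 calls is likely underreporting, not rat-free. All of the numbers and neighborhood names below are invented for illustration; a real analysis would join NYC 311 service-request records with independent data such as health-inspection results.

```python
# Hypothetical sketch: complaint data alone can hide reporting gaps.
# All figures are invented; they do not come from real NYC datasets.

# 311 rat complaints per 1,000 residents (what the call data shows)
complaints = {"Upper West Side": 4.1, "Park Slope": 3.6, "Bed-Stuy": 0.7}

# Share of inspected properties with active rat signs
# (an independent, inspection-based ground-truth signal)
inspection_rate = {"Upper West Side": 0.12, "Park Slope": 0.10, "Bed-Stuy": 0.31}

def flag_reporting_gaps(complaints, inspections, ratio_threshold=0.5):
    """Flag neighborhoods whose complaint volume is low relative to
    independently observed rat activity."""
    # Normalize each signal to its maximum so the two scales are comparable.
    max_c = max(complaints.values())
    max_i = max(inspections.values())
    flagged = []
    for hood in complaints:
        c = complaints[hood] / max_c
        i = inspections[hood] / max_i
        # A low complaints-to-activity ratio suggests residents see the
        # problem but are not calling it in.
        if i > 0 and c / i < ratio_threshold:
            flagged.append(hood)
    return flagged

print(flag_reporting_gaps(complaints, inspection_rate))  # → ['Bed-Stuy']
```

In this toy example, Bed-Stuy shows the highest inspection-based rat activity but the lowest complaint rate, so it is the one neighborhood flagged; targeting exterminators by raw complaint counts alone would have sent them everywhere except where the problem was worst.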
Mashariki found blind spots in the data and highlighted how seemingly neutral data can amplify the voices of some over others. Because the public interest technology field, in its current state, skews white, it is important for practitioners to be aware of how the makeup of their team might affect the interpretation and application of that data. The tools we’ve described must be designed and driven by humans with a breadth of perspectives. They have the power to amplify our best or our worst traits—technology can be used to expose implicit bias and racism in our data collection, or to perpetuate it.
Yet it’s not Mashariki’s responsibility alone to consider the context in which social problems exist because he happens to have been raised in a poor neighborhood. The onus is on anyone who works with data sets to consider the limits of what those numbers can tell us about the lived experiences of the people they represent. A well-trained technologist from any background should be armed with the skills and knowledge to ask the types of questions Mashariki asked. That’s part of the training we hope to see happen for those who enter this field.
Diverse backgrounds have always been important in public problem-solving. When a cholera epidemic gripped London in 1854, the source was a mystery. At first glance, Dr. John Snow was an unlikely person to locate the source. He was a big-shot doctor who attended to Queen Victoria during several of her births. The cholera epidemic was largely confined to poor neighborhoods, so most doctors blamed the outbreak on the perceived filthy habits of the lowest classes. But while Snow’s work had taken him to Buckingham Palace, he had grown up, and continued to live, just a few blocks from the center of the epidemic.
“The poor were dying in disproportionate numbers not because they suffered from moral failings,” he wrote. “They were dying because they were being poisoned.”
Snow went on to map the outbreak data and traced the source to a contaminated well. Being “from the neighborhood” was as relevant to problem-solving in Victorian London as it is now.
“Yes, we have some diversity in policy. We’ll have certain senior leadership places in city government that focus on diverse initiatives, immigration, so on and so forth, that require diversity,” Mashariki said. “But there’s almost little to none in the tech space. Diversity isn’t just getting a young African American man from Iowa. If you’re the city government in New York City, diversity is getting someone who grew up in the projects to be a part of it. That’s a level of diversity that we almost never hit.”
From Power to the Public: The Promise of Public Interest Technology, by Tara McGuinness and Hana Schank, published by Princeton University Press and reprinted here by permission.