
This Chatbot Makes It Easy To Document Your Interactions With The Police

Because filing formal complaints with the police is intimidating and time-consuming, a new chat-based application called Raheem is trying to create a more comprehensive database of how cops treat citizens.

“The long and short of it is, we don’t actually know how people are experiencing police.” [Photo: Matt Popovich/Unsplash]

For every video of police violence that’s shared on social media, there are many thousands of people who have negative interactions with police and never report them. A new chatbot called Raheem aims to give more visibility to interactions between civilians and police: In less than five minutes, someone can anonymously report their experience with police–good or bad–using the chatbot, which adds it to a public dashboard of police performance.


Founder Brandon Anderson lost his partner to police violence during a routine traffic stop a decade ago. When Eric Garner was killed in 2014 by a police officer who put him in a chokehold during his arrest, Anderson became more active in organizing around the issue.

“I’d been a community organizer for about 10 years, and after watching Eric Garner being strangled on live video–and the only person who went to jail was the person that recorded it–it sort of reinvigorated a sense of purpose in this work for me,” he says. He took time off from Georgetown University to volunteer in Ferguson, where he began to think about how to solve the problem of getting city officials to listen to a community.

“We’re going to give [cities] the tools they need in order to make that change, but they have to commit to this being a mission of theirs.” [Photo: Raheem]
“The long and short of it is, we don’t actually know how people are experiencing police,” Anderson says. Community members told him that they wouldn’t report a negative interaction with police if it wasn’t “that bad.” One woman told him that she had been searched by an officer who felt her body in inappropriate ways, without cause, but because she wasn’t raped, she didn’t feel like she should make a report.

In many communities, making a complaint about the police requires physically going to a police station during business hours, which serves as another deterrent. Forms for making complaints online are often hard to find and use. A Department of Justice report estimated that of the 6.5 million Americans who had negative interactions with police in 2011, the vast majority–93%–didn’t report what happened.

Even if someone does make a complaint, the police process of analyzing it is slow and opaque. “Oftentimes, it’s taking 12 to 18 to 20 months to hear back,” says Anderson. “The community, that entire time, knows nothing about how that person experienced police. They know nothing about how those experiences have shaped that community.”

The new chatbot, by contrast, is meant to be transparent. After someone answers a short series of questions–via Facebook Messenger, to make it easier to access than downloading yet another app–Raheem (Arabic for compassion, which the company says is what’s needed to build communities free of police violence) helps them decide whether they want to make a formal complaint, and then adds the data to a dashboard. The dashboard, which is still in development, should be public within a couple of months. Anyone anywhere can use it, and as it aggregates data, communities will be able to see each incident mapped out over neighborhoods. And if a city government wants to reform its policing, Raheem can also provide it with custom data. The company is currently in talks with several cities.


“We want to make sure that [city governments] can do the work,” he says. “It’s up to them to make the change. We’re going to give them the tools they need in order to make that change, but they have to commit to this being a mission of theirs.”

Without good data, it’s difficult to design policies to improve policing. In a pilot during the summer of 2017, while Raheem’s team was part of Fast Forward, the nonprofit tech accelerator, the chatbot collected twice as much data in three months as the cities of San Francisco and Berkeley had in a year. One complaint, for instance, was from a 19-year-old black man who was pulled over and said the officers gave no reason for the stop and searched him forcefully without asking his permission. He didn’t make a formal complaint, but he did record the incident on Raheem. Because the tool is independent of police departments, it may be seen as more trustworthy, and therefore more likely to be used than official channels.

The team shared the tool at local legal clinics, and use spread by word of mouth. Many of the people they met likely wouldn’t have made reports in the past. One man Anderson spoke with, who was homeless, said, “They’re not going to listen to me, man, I’m an immigrant.” Anderson convinced him to take a flyer, and he ended up taking more with him to share at his homeless shelter.

“You had a homeless immigrant who felt like his voice didn’t matter,” he says. “We can’t possibly serve people like him without listening to him.”

The team, which raised $50,000 from an Obama-era initiative called My Brother’s Keeper, is currently raising another $250,000 to finish the chatbot and publish the dashboard, which will be available everywhere that people make reports. “It will be accessible to anyone,” Anderson says. “Your city government doesn’t have to have an interest in tracking police performance in order for you to have access to the public dashboard.”


Correction: This article previously misstated the amount of time of Raheem’s pilot and the amount of city data from San Francisco and Berkeley it was being compared to. Raheem gathered twice as much data in three months as the cities of San Francisco and Berkeley had in one year, not two months vs. two years, respectively.

About the author

Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.
