Policing’s problems won’t be fixed by tech that aids—or replaces—humans

Gadgets such as Ring doorbells could lead to police work being increasingly automated. But it’s humane factors that can make communities safer for everyone.


The Ring doorbell deal between Amazon and police departments could be a first step toward privatizing and replacing the police force—or at least deeply changing how it functions.


Ring is a Trojan horse. Amazon builds a network of surveillance cameras and offers deals to the police, who accept and incorporate it into their portfolio of surveillance tools, while neighbors using Ring feel safer. But some police departments are claiming that Ring footage doesn’t help them much. Is that true? It could be. A Ring doorbell may help deter crime, but that impact cannot easily be measured.

Perhaps the police do not wish to consider how technology will eventually become smart enough to subsume most policing duties that involve identifying suspects and locating them.

These days, there is much for police departments to be concerned about. They have been murdering Black citizens and firing rubber bullets (with steel cores) at both citizens and the press. Federal authorities aren’t immune, either: They’ve shown up in Portland and have been essentially kidnapping protestors off the streets near federal buildings. While they may have a different agenda than the local police, it seems to be “Cops Gone Wild” lately, and that can’t all be due to racism. Something deeper must be stuck in their craw. Perhaps they are fighting to retain the current methods of fighting crime in a new socio-technical world that demands they learn data skills. Policing is no longer tactile; it’s tactical, and the clues are on the network.

Today, most policing doesn’t work the way one would imagine from Norman Rockwell’s illustrations or the days of the friendly police officer walking the beat. Officers aren’t trained long enough or with enough depth to handle effectively the social services or mental health crises that cause problems in communities. In fact, they aren’t prepared for much of the work that involves community.

Instead, they are trained to “enforce” the law, and over decades of police militarization, this has translated into a focus on obtaining military equipment and training to use it. Furthermore, officers are bound to technology in new ways—they wear cameras and they are dispatched by radio to where they are told to go, all while connected to constant radio reporting of crimes and events in neighborhoods.


In many ways, police are apart from the communities they serve. They mostly drive through areas they don’t live in. They participate in private police networks and unions. They don’t know people, and the internal culture they are part of does not reflect the culture of the broader communities they patrol.

Civilians are aware of the law, but they are also trying to make a living, get by, get help, celebrate joys, grieve, and experience any number of human events and emotions in their lives. If we all were to be law-abiding at all times, we’d probably all have to stay home. The law isn’t set up to help us in all cases, and being of an “othered” ethnicity can mean that one could be arrested for pretty much anything.

Policing via screen

If the Ring doorbell isn’t a boon to policing, why are so many police departments adopting it, and recommending that other departments adopt it as well?

I expect that many police departments are adopting the Ring doorbell because its vast data collection in neighborhoods can seem to make their jobs less risky. Instead of being tethered to their cars to catch thieves—putting their lives (and the lives of others) at risk—police officers can focus on their screens, using digital methods to track criminals. This does not stop the police from being present for truly threatening real-time events, but it does change the way that crime could be processed.


In a recent, dramatic example of digital policing, international law enforcement in the U.K. and Europe secretly monitored a global phone network, EncroChat, by installing malware. It then used knowledge gained from over a hundred million encrypted messages to bust a multi-country criminal network, arresting hundreds and confiscating drugs, guns, and money. This was not an on-the-street activity but rather happened on the network, where the data police obtained enabled them to understand a large swath of previously unseen criminal activity. The criminals were arrested in person.


But policing that happens online rather than on the street introduces new problems. In the absence of learning about communities by talking to the people who live there and getting to know them, police departments have gone online to read community posts on Twitter, Facebook, Instagram, and other social media. Furthermore, Nextdoor has worked closely with police departments to encourage their use of its platform, although it doesn’t allow their official accounts to read anything other than replies to their posts, and recently eliminated a feature that let users forward their posts to the police.

Social posts represent only parts of people—really, parts of parts of people—and do not reflect the true lives or tone of even a majority of the community. Police who are not trained in data science or analytics are using the information from these sites to form opinions about communities and community members, which they then apply to individuals they profile.

Rumors and social media posts that create false impressions in the minds of police officers are dangerous. People who complain about their fears of Black people on social media can lead to Black people getting arrested or even killed for minor crimes or over false accusations. What if the real problems were instead the fear of Black people itself, the social posts prompted by that fear, and the rumors that cast Black people as a problem?

Officers are dispatched into communities where they pattern-match people like bad algorithms, looking for the “other.” They don’t realize that they themselves are the “other,” the people who do not live in the neighborhood, and do not look or talk like them. Thus, social media postings build up a threat model and profile of an “enemy” that must be brought down. There seems to be no room for moderation in the current policing we are seeing unfold in real time in our cities. If your only tool is a hammer, even a violin vigil and a peaceful protest can look like nails to be pounded. 


Silver-platter arrests

When crime tracking and policing become more technical—planting malware and conducting data analysis, or reviewing tapes and running facial recognition on suspects—the police won’t have much more to do than arrest people. Already the police have algorithmic and neighbor assistance. In a recent article in The Atlantic, Captain Una Bailey of the San Francisco Police Department was quoted referring to a citizen who had done background surveillance and evidence gathering on a suspect: “They had taken all these steps that basically hand us an arrest on a silver platter with all the evidence … I’m definitely an advocate of people installing cameras.” While the article reminds us that “San Francisco’s police department hasn’t participated in surveillance-device subsidies, the Neighbors app, or bait box operations, but has joined Nextdoor,” the citizens in its jurisdiction are doing its surveillance work and evidence collection.

Eventually, a significant proportion of police work could be automated rather than performed by humans. Systems that identify criminals could link to IoT technologies to do things such as shut off the utilities at their residences or lock the software in their vehicles—making their lives so miserable that they’ll have no other recourse but to turn themselves in. That would be much easier and less stressful for people than in-person police confrontations.

However, such automation would come at a vast cost to the community in terms of privacy, human rights, and a forfeiture of civil liberties, which many do not wish to grant to the police. This is especially concerning since technologies such as facial recognition can wrongly identify people, resulting in an automated arrest nightmare for innocent citizens who would rather not interact with the police in any way, lest they too be shot by accident.

Police departments embracing Ring may not see technology coming for their jobs, or realize that they themselves are replaceable. However, policing should be a social activity, and police officers are currently perceived as not being social with the communities they are ostensibly serving. Many citizens think the police have a vendetta against some members of society, often targeting members of those groups who have done nothing wrong but are perceived as different from societal norms because of their ethnicity or culture.


Four steps toward a safer future

In some ways, replacing the police with technologies such as Amazon Ring and robots could spare law enforcement stress and trauma. But it will create many more problems for the rest of us, all of the sort we see with automation in general. Algorithms aren’t robust, or even correct, and facial recognition isn’t reliable. It’s wrong much of the time for people of nonwhite ethnicities and for women and children. This makes it biased and dangerous, and when enforcement is automated, potentially even more fatal than law enforcement currently is (to some).

Automating more of the police force will result in similar problems. Algorithms are not social. They are designed to operate in ways that are not human or humane, and no programmer or data scientist could ever anticipate all human situations in the context of policing.

Related: The creeping threat of facial recognition

However, there are things that can be done to improve policing now. These proposals are intended to augment the current push toward defunding the police, replacing them with social services for appropriate social and mental health calls, along with other suggestions that are starting to be implemented.

  1. Social media companies need to develop more robust ways to monitor instances of bias and prejudice on their platforms. This is where these rumors are being created, and where biased posts are being read by police officers, who are being indoctrinated by them.
  2. Pair up officers in patrol cars, with one handling communications and the other driving and observing the situation. One of them needs to be seeing the community, not just hearing about it over a radio.
  3. Police officers need to spend time in the communities they patrol, outside of their cars, getting to know individuals and the patterns of the neighborhood. Building relationships within the community builds trust and gives officers knowledge about their beat that is not easily transmitted over the radio. That knowledge has to be in the officers’ lived experience memory before an incident.
  4. Lastly, more thorough evaluation and identification of trauma-related illness in police officers—and developing tools and methods to address it—could help provide support. Officers in their patrol cars are lone wolves connected only by radio. They require sociability and interaction outside of their insular networks as much as any other human. A police force more socially connected, both to other officers and to the community, could pay large dividends for communities.

Amazon and its Ring doorbell are not social. They foster isolation. When we pass the doors of homes with Ring installed, we’re recorded without our permission or knowledge, by strangers who do not know us and do not trust us. When we don’t know our neighbors, we merely record them. Thus we become bound to our homes in neighborhoods in the same way that current police officers are bound to their cars, without true knowledge of who we are in our communities, and without relationships to each other.


All of humanity depends upon social cooperation to survive. When we stop being social and cooperating with each other, we create conflict, and extreme conflict can cause us to harm each other. In the case of these police officer murders of innocent citizens, the sociability and cooperation are gone. No amount of technology can solve this problem. If we retain the idea of the police, then police departments must retrain officers and put them in social working partnerships to give citizens a sense of peace and well-being in their communities.

S. A. Applin, PhD, is an anthropologist whose research explores the domains of human agency, algorithms, AI, and automation in the context of social systems and sociability. You can find more at @anthropunk.