05.23.16

Report: The algorithms used to predict crime are racist and inaccurate

Many courtrooms in the U.S. use “risk assessment” scores calculated by computer programs to try to determine how likely an offender is to commit another crime in the future. But the scores, which have influenced judicial decisions for years, are not very accurate in their predictions and are biased against African Americans, according to an analysis by ProPublica.

The team of journalists found that not only were the scores “remarkably unreliable in forecasting violent crime,” but also that the algorithm was more likely to wrongly label black defendants as probable re-offenders than white defendants.
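At its core, that bias finding is a comparison of error rates across groups: among defendants who did not go on to reoffend, what fraction were nonetheless labeled high risk? Here is a minimal sketch of that kind of calculation in Python, using a made-up toy dataset; the field names and values are hypothetical and are not ProPublica's actual data or code.

```python
# Sketch: comparing false positive rates across groups.
# The records below are invented for illustration only.
from collections import defaultdict

# Each record: (group, labeled_high_risk, reoffended)
records = [
    ("black", True, False), ("black", True, True), ("black", False, False),
    ("white", True, True), ("white", False, False), ("white", False, True),
]

counts = defaultdict(lambda: {"false_pos": 0, "did_not_reoffend": 0})
for group, high_risk, reoffended in records:
    if not reoffended:
        counts[group]["did_not_reoffend"] += 1
        if high_risk:
            counts[group]["false_pos"] += 1

for group, c in counts.items():
    # False positive rate: flagged as high risk but did not actually reoffend.
    rate = c["false_pos"] / c["did_not_reoffend"] if c["did_not_reoffend"] else 0.0
    print(f"{group}: false positive rate = {rate:.2f}")
```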

You can, and should, read ProPublica's entire analysis here.