Algorithms were supposed to reduce bias in criminal justice, but do they?

Algorithms were supposed to remake the American justice system. Championed as dispassionate, computer-driven calculations of risk, crime, and recidivism, they were deployed in everything from policing to bail, sentencing, and parole to smooth out the often unequal decisions made by fallible, biased humans.
