Jailed by an Algorithm
Algorithms that are supposed to remove sentencing bias may perpetuate racism instead, say civil rights groups.
Nick Muscavage
TRENTON, N.J. — One day after being arrested and charged with possession of a weapon in November 2017, Brandon Darby, 51, was ushered into a room with three small cubicles, each with its own television screen. A judge, communicating virtually through the screen, read Darby his charges and told him his score from New Jersey’s new risk-assessment algorithm. On a scale of one to six, with six the highest, Darby was rated a four for risk of failing to appear in court and a five for risk of reoffending. Based on those scores, he was ordered held without bail until trial. The whole proceeding took no more than eight minutes.
In January 2017, New Jersey implemented its Criminal Justice Reform Act, which introduced risk-assessment algorithms to determine whether to jail a defendant before trial. The goal of the reform is to phase out money bail so that low-level defendants no longer sit in jail simply because they cannot afford to pay.
For people like Darby, though, bail is denied entirely. The algorithm, created by the Laura and John Arnold Foundation, considers a defendant’s past criminal convictions, past failures to appear in court and current charges of violence, among other factors.
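The article does not reproduce the foundation’s scoring formula, but a point-based instrument of this kind can be illustrated with a minimal Python sketch. Every factor name, cap and weight below is invented for illustration; this is not the Arnold Foundation’s actual Public Safety Assessment.

```python
# Hypothetical point-based sketch of a pretrial risk score. The factor
# names, caps and weights are invented; they are NOT the Arnold
# Foundation's actual Public Safety Assessment formula.

def failure_to_appear_score(prior_convictions: int,
                            prior_failures_to_appear: int,
                            pending_charge: bool) -> int:
    """Map a defendant's record onto a 1-6 scale (6 = highest risk)."""
    points = 0
    points += min(prior_convictions, 2)             # capped at 2 points
    points += 2 * min(prior_failures_to_appear, 2)  # capped at 4 points
    points += 1 if pending_charge else 0
    # Rescale raw points (0-7 under these toy weights) onto 1-6.
    return 1 + round(points * 5 / 7)

# A defendant who, like Darby, has never missed a court date can still
# score above the minimum on the strength of the other factors:
print(failure_to_appear_score(prior_convictions=3,
                              prior_failures_to_appear=0,
                              pending_charge=True))  # -> 3
```

Under a scheme like this, a defendant who has never missed a court date can still land above the minimum score, which is the crux of Darby’s complaint.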
Darby insists he has never missed a court date. “The data is skewed,” he says. “We all know that we can make numbers say whatever you want them to say.”
He eventually pleaded down to possession of cocaine, and his gun charge was dropped. He was released on time served, having spent six months in jail before any trial or conviction.
Darby is one of many people to raise concern over risk-assessment tools, which have become a popular add-on to measures abolishing money bail, such as a California law signed in August 2018 and a federal bill proposed in 2017 by Sens. Kamala Harris (D-Calif.) and Rand Paul (R-Ky.).
Pennsylvania, on the other side of the Delaware River from New Jersey, is one state considering a risk-assessment tool. Unlike New Jersey’s pretrial tool, Pennsylvania’s algorithm would be used to make sentencing recommendations to judges. In June 2018, around the time Darby was being released from jail, J. Jondhi Harrell, 63, made his way to the Criminal Justice Center in Philadelphia to testify against the plan, along with dozens of others, at a public hearing.
Harrell served 25 years in prison for federal bank robbery and weapons charges. After his release in 2009, he founded the Center for Returning Citizens, an advocacy group focused on helping people return to society after incarceration. “It is essential that those most impacted by mass incarceration have some reason to believe that change for the better is coming. This is change for the worse,” Harrell testified.
“A computer risk algorithm can’t measure who you can become,” Harrell said. He calls this the “human element.”
Harrell also fears that data shaped by racist policing practices will produce racist outcomes. A 2016 ProPublica study of risk-assessment algorithms in Florida found that the scores were racially biased as well as “remarkably unreliable” in predicting violent crime.
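The study’s headline finding concerned error rates: among defendants who did not go on to reoffend, Black defendants were roughly twice as likely as white defendants to have been labeled high risk. That false-positive-rate check is simple to express; the records below are fabricated for illustration and are not ProPublica’s data.

```python
# Error-rate check of the kind ProPublica ran: among defendants who did
# NOT reoffend, how often was each group flagged high risk? The records
# below are fabricated for illustration; they are not ProPublica's data.

records = [
    # (group, flagged_high_risk, reoffended)
    ("black", True,  False),
    ("black", True,  False),
    ("black", False, False),
    ("white", True,  False),
    ("white", False, False),
    ("white", False, False),
]

def false_positive_rate(group: str) -> float:
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(group, round(false_positive_rate(group), 2))
# black 0.67, white 0.33: an unequal burden of wrongful "high risk"
# labels is the kind of disparity the study reported.
```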
In July 2018, more than 100 social-justice organizations, including the ACLU, the NAACP and MoveOn, released a statement of opposition to the algorithms, arguing that “the data driving many predictive algorithms — such as prior failures to appear and arrest rates” reflects systemic biases against people of color. The statement recommended simply ending money bail and dramatically reducing pretrial detention, without introducing risk-assessment tools.
Proponents of New Jersey’s bail reform, including ACLU New Jersey, which helped craft the law, credit it with a 25 percent decrease in the state’s pretrial jail population since implementation in January 2017.
“We obviously think pretrial jail population should be down even more than that,” says ACLU-NJ staff attorney Alexander Shalom, “but this is a critical first step.”
Shalom says there is not yet enough data to determine whether the risk-assessment tool has resulted in racial disparities. “It’s a critical concern,” he acknowledges, but he argues that before the reforms, judges were looking at the same data points. “It would be a mistake to suggest that the problem is with using quantitative risk assessments,” Shalom says. “The problem is having a racist criminal justice system.”
He believes using pretrial detention algorithms in place of money bail is a step toward a “culture change” in the courts: “You need to get judges thinking about liberty, rather than thinking about money.”
Mark Bergstrom, executive director of the Pennsylvania Commission on Sentencing, which is tasked by the legislature with developing the sentencing tool, sees it as a step toward transparency.
Pennsylvania’s proposal would consider the offender’s age, gender, the type and number of prior convictions, the type of the current offense and prior juvenile adjudications. Past arrests were removed from the proposal in response to public concerns. “We’ve done everything we can to neutralize race and other factors like that,” Bergstrom says. He stresses that the risk score would only be a suggestion to judges, not the final say in sentencing.
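As a rough sketch of the advisory structure Bergstrom describes, the score can be thought of as a computed suggestion surfaced to the judge rather than a binding output. The fields and weights below are hypothetical and cover only a subset of the listed factors.

```python
# Hypothetical sketch of an advisory sentencing score built from some of
# the factors the article lists, with invented weights. Past arrests are
# deliberately absent, mirroring their removal from the proposal.

from dataclasses import dataclass

@dataclass
class Offender:
    age: int
    prior_conviction_count: int
    prior_juvenile_adjudications: int
    violent_current_offense: bool

def advisory_risk_score(offender: Offender) -> int:
    """Return a suggestion for the judge; it does not set the sentence."""
    points = 0
    points += 1 if offender.age < 25 else 0
    points += min(offender.prior_conviction_count, 3)
    points += min(offender.prior_juvenile_adjudications, 2)
    points += 1 if offender.violent_current_offense else 0
    return points

score = advisory_risk_score(Offender(age=30,
                                     prior_conviction_count=1,
                                     prior_juvenile_adjudications=0,
                                     violent_current_offense=False))
print(score)  # -> 1: displayed to the judge as one input among many
```

The design point is the return value’s role: it informs the sentence but does not set it.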
After the June 6, 2018, hearing in Philadelphia and others around the state, the Pennsylvania Commission on Sentencing decided to delay its vote and hold more hearings.
Wherever the hearings are held, Harrell will be there.
“We would ask that our sons and daughters, nieces and nephews, family and friends who transgress against society not be seen as unredeemable,” Harrell said in closing his June remarks. “We ask that the basic humanity of those who are charged with offenses against our community not be reduced to a number.”