News

Illustration by Katie McBride

For evidence that predictive algorithms are now part of our daily lives, look no further than Netflix recommendations, your Facebook feed, and the number of times you’ve corrected autocorrect. Now ask yourself: Would you be comfortable with an algorithm determining your fate in the criminal justice system?

UR law professor Erin Collins thinks the answer should be no. She has been calling attention to the growing frequency with which judges rely on algorithm-generated analysis when deciding the fates of guilty defendants, a practice known as actuarial sentencing.

“I think it’s attractive for a lot of reasons,” said Collins, who teaches courses in criminal law. “It seems to be objective. It seems to be kind of unassailable if it’s based on an empirical analysis. How could that be wrong?”

There are many ways, she argues in an essay published recently by The Crime Report, starting with the fact that the courts are putting the algorithms to use in an “off-label” kind of way “that undermine[s] the fairness and integrity of our criminal justice system.”

The tools being used in actuarial sentencing decisions were developed to help corrections officers decide how best to aid rehabilitation as they administer punishment. The tools rely on factors unrelated to an individual’s conduct, such as gender, education history, and family criminality, some of which “are markers of relative structural disadvantage and reflect historically biased criminal justice practices,” Collins writes.

For judges to rely on them during sentencing is to “defy the well-established tenet that we punish someone for what they did, not who they are.”