Risk assessment in the courtroom: Can algorithms discriminate against poor defendants?
Have you ever stopped to consider just how inescapable algorithms are in our daily 21st-century lives? They tell us the fastest route to our destination, recommend things we might find interesting, and highlight products to buy. Would it surprise you to learn that similar algorithms can inform decisions in the criminal justice system? Risk assessment is the practice of using algorithms or other evidence-based checklists (often referred to as risk assessment tools or instruments) to estimate a person's likelihood of future offending. Policymakers and justice officials have become increasingly interested in using risk assessment to identify and release low-risk offenders as a possible solution to the mass incarceration problem.
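To make the idea of a checklist-style tool concrete, here is a minimal sketch of how one might compute a risk score. The item names, weights, and cutoffs below are entirely hypothetical and invented for illustration; real instruments are validated on large samples and look quite different.

```python
# Toy illustration of a checklist-style risk tool.
# All items, weights, and cutoffs are hypothetical, not from any real instrument.

def risk_score(defendant):
    """Sum weighted checklist items into a raw score, then bin it into a risk level."""
    weights = {
        "prior_convictions": 2,  # hypothetical weight per prior conviction
        "age_under_25": 3,       # hypothetical flat weight if under 25
        "unemployed": 1,         # hypothetical flat weight if unemployed
    }
    score = (weights["prior_convictions"] * defendant["prior_convictions"]
             + weights["age_under_25"] * defendant["age_under_25"]
             + weights["unemployed"] * defendant["unemployed"])
    if score <= 2:
        return score, "low"
    elif score <= 6:
        return score, "moderate"
    return score, "high"

print(risk_score({"prior_convictions": 0, "age_under_25": 1, "unemployed": 0}))
# (3, 'moderate')
```

The point of the sketch is only that such tools reduce a person to a handful of weighted factors and a cutoff; the policy debate below is about what happens when judges see the resulting label.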
Risk assessment has been found to predict reoffending more accurately than judicial intuition. Using algorithmic decisions in place of judicial ones could reduce crime by up to 25% without changing incarceration rates. Consequently, advocates argue that judges should consider risk assessment when making sentencing and release determinations, suggesting that formalized risk assessment measures improve the consistency, transparency, and accuracy of decisions.
Although risk assessment is promising, little research has examined whether it increases sentencing disparities. In "Impact of risk assessment on judges' fairness in sentencing relatively poor defendants," Skeem, Scurich, and Monahan (2020) investigated whether a defendant's socioeconomic status (poor or wealthy) and the presence of risk assessment information influenced judicial sentences. Results indicated that risk assessment information increased sentencing disparities: without it, the poorer defendant was less likely than the wealthy defendant to be sentenced to incarceration; with it, the poorer defendant was more likely to be incarcerated.
In many jurisdictions, risk assessment information is routinely included in presentence investigation reports. Even when judges explicitly discredit or reject risk assessment, risk scores may still influence how they determine a sentence. This is concerning: as technology advances, we want our justice system to move away from sentencing disparities, not toward them. Further research should examine other sentencing disparities that risk assessment might affect, as well as how to present risk information so judges can incorporate it into their decision-making more effectively and fairly. It is also important to consider the biases that can be ingrained in the creation of algorithms in order to build less biased risk assessments; an algorithm is only as unbiased as the data collected and used to create it (i.e., algorithms can carry the biases of their human creators). Finally, even a perfectly objective algorithm may not matter if the support and living conditions of minorities do not improve; socioeconomic disparities create risk factors that lead to greater recidivism.
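The claim that an algorithm is only as unbiased as its training data can be shown with a small numerical sketch. The numbers are hypothetical: two groups with the same true reoffense rate, where one group's past offenses were recorded twice as often (e.g., heavier policing), so the historical record the tool learns from is skewed.

```python
# Sketch: a tool built on biased data reproduces the bias.
# Hypothetical setup: groups A and B have the SAME true reoffense rate (30%),
# but only half of group A's offenses entered the historical record,
# while all of group B's did (heavier policing of group B).

true_reoffense_rate = 0.30
recording_rate = {"A": 0.5, "B": 1.0}  # fraction of offenses actually recorded

# The "training data" the tool sees is the recorded rate, not the true rate.
observed_rate = {g: true_reoffense_rate * r for g, r in recording_rate.items()}

threshold = 0.25  # hypothetical cutoff for a "high risk" label
labels = {g: ("high" if rate >= threshold else "low")
          for g, rate in observed_rate.items()}

print(observed_rate)  # group B looks twice as risky despite identical behavior
print(labels)         # A is labeled "low", B is labeled "high"
```

Nothing in the scoring rule mentions group membership, yet the output differs by group, because the disparity was baked into the data before the algorithm ever ran.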