Incarceration is a racial issue globally, but its effects are most pronounced in America, where around 2 million people are behind bars and a disproportionate number of them are African American. Big data analytics, AI, and machine learning have transformed countless industries by offering a more rigorous and powerful statistical approach to decision making. One such application is in the American justice system, where a company named Northpointe has developed an algorithmic model, COMPAS, which forecasts an offender's likelihood of recidivism (committing another crime after release) from a questionnaire and uses that forecast to inform judges' decisions about sentence length. Many proponents claim that this method removes the racial biases of judges, which often lead to African Americans receiving harsher sentences for similar crimes; however, the algorithm itself has been found to discriminate based on race.
In theory, algorithms provide a purely data-driven recommendation for incarceration duration. Since race is not an input, it seems intuitive that the algorithm should not discriminate on that basis. However, the data are generated from 121 questions drawn from conviction records and a self-completed questionnaire, many of which could encode race indirectly, including area of residence and number of friends in jail, as well as qualitative questions about general anger and wellbeing. These algorithms have been found to recommend more severe punishments for black offenders with clean criminal records who have committed non-violent crimes than for white offenders with past histories of violent crime.
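This proxy effect can be illustrated with a toy simulation (the groups, postcodes, and rates below are entirely invented for illustration): a "race-blind" rule that scores only on postcode still flags the two groups at very different rates whenever postcode is correlated with race.

```python
import random

random.seed(0)

# Hypothetical population: the model never sees 'race', but
# 'postcode' is strongly correlated with it by construction.
people = []
for _ in range(10_000):
    race = random.choice(["A", "B"])
    # Postcode 1 is predominantly group B in this toy population.
    postcode = 1 if random.random() < (0.8 if race == "B" else 0.2) else 0
    people.append({"race": race, "postcode": postcode})

# A "race-blind" risk rule that scores on postcode alone.
def high_risk(person):
    return person["postcode"] == 1

def flag_rate(group):
    members = [p for p in people if p["race"] == group]
    return sum(high_risk(p) for p in members) / len(members)

print(f"Group A flagged: {flag_rate('A'):.0%}")  # roughly 20%
print(f"Group B flagged: {flag_rate('B'):.0%}")  # roughly 80%
```

Even though race was excluded from the inputs, the rule reproduces the racial disparity almost perfectly, because the proxy variable carries the same information.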
A key incentive behind the use of a statistical approach is to make recommendations purely on the basis of quantitative data, in the hope of avoiding any institutional bias against an offender. However, on reading an example of the questionnaire used to generate data for the algorithm, it becomes clear that many of the questions are entirely subjective. For example, one question, "Based on the screener's observations, is this person a suspected or admitted gang member?", is clearly influenced by the bias of the questionnaire administrator. As such, an implementation of the questionnaire in which questions are limited to factual statements would improve the impartiality of the process.
“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice,”
U.S. Attorney General Eric Holder, 2014
Although the act of committing a crime is not in accordance with the freedom principle, offenders have a fundamental right to be sentenced appropriately for their crimes. The biases displayed by the COMPAS tool, uncovered by a ProPublica investigation, remove the principle of freedom for black offenders, which is unacceptable in a liberal democracy.
Continued and expanded use of this method of sentencing could have knock-on consequences for society as a whole. The developed world faces a growing prison population, especially America, where the costs are felt both socially, through the fabric of the country, and materially, at an estimated $1.2 trillion. There is also a risk inherent in the algorithm that, by relying on past data, sentencing decisions become resistant to penal reform: were there a big push to rehabilitate black prisoners, it would take a long time for the changes to appear in the dataset, and black offenders could continue to be sentenced disproportionately in the meantime.
The idea behind the process is a virtuous one, and it fits naturally into a utilitarian approach to justice. Statistics-based decision making can reduce the risk to society by removing human errors of judgement with regard to early release. Furthermore, an approach that prioritises measurable social improvement over political rhetoric could foster a transition away from penal populism, which has driven prison populations up across the developed world, and back towards an agenda of rehabilitation, which has been shown to reduce overall crime rates and could arguably be willed as a universal rule in the sense of Kantian universalisability.
In a specific rebuttal to criticism of its methods, Northpointe published a report defending the COMPAS tool against accusations of bias, claiming that its critics "used the incorrect classification statistics" and should instead have taken racial base rates of recidivism into account. However, a published analysis comparing recidivism predictions in Maryland with race included in and excluded from the algorithm found the difference to be insignificant.
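The statistical crux of this dispute can be shown with hypothetical numbers. When base rates of recidivism differ between two groups, a classifier can have identical precision (PPV, the statistic Northpointe emphasised) for both groups and yet produce a much higher false-positive rate (the statistic ProPublica emphasised) for the higher-base-rate group. The confusion matrices below are invented to illustrate exactly this:

```python
# Hypothetical confusion matrices for two groups of 1,000 people with
# different base rates of recidivism (30% vs 50%) but a classifier
# tuned to give both groups the same precision and recall.
groups = {
    "low_base_rate":  {"tp": 180, "fp": 120, "fn": 120, "tn": 580},
    "high_base_rate": {"tp": 300, "fp": 200, "fn": 200, "tn": 300},
}

for name, m in groups.items():
    # PPV: of those flagged high-risk, the fraction who actually reoffend.
    ppv = m["tp"] / (m["tp"] + m["fp"])
    # FPR: of those who never reoffend, the fraction wrongly flagged.
    fpr = m["fp"] / (m["fp"] + m["tn"])
    print(f"{name}: PPV={ppv:.2f}, FPR={fpr:.2f}")
```

Both groups see a PPV of 0.60, so the tool looks equally accurate by Northpointe's measure, yet non-reoffenders in the high-base-rate group are wrongly flagged more than twice as often (FPR 0.40 vs 0.17). With unequal base rates, these two fairness criteria cannot in general be satisfied simultaneously, which is why the two sides could each claim the statistics supported them.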
There are a number of improvements that could be made to the system, in conjunction with judicial reform, to make algorithm-enhanced sentencing fairer. Because the algorithms must use past data to make future predictions, they learn from a system that is already steeped in bias. For instance, if biased judges have sentenced young black males disproportionately harshly compared to the rest of the population, then statistical methods may perpetuate this trend, especially since longer sentences themselves increase the rate of recidivism.
One solution is to modify the questionnaire to use more objective questions, or to remove the "self-reporting" items that have a clear potential to encode race. Another would be to estimate a "race" factor by which minorities have been disadvantaged and apply a correction to the suggested sentence length, or to build a regression model that "regresses off" race as a category in the data. However, in an age of penal populism, where politicians must appear tough on crime, such reforms seem unlikely.
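A minimal sketch of the "correction factor" idea is given below, using synthetic risk scores (the group labels, means, and offsets are all invented): estimate each group's mean offset from the overall mean and subtract it, so that systematic score differences between groups disappear while within-group ordering is preserved.

```python
import random

random.seed(1)

# Hypothetical risk scores with a built-in group offset: group "B"
# receives systematically higher scores for the same underlying risk.
scores = [("A", random.gauss(4.0, 1.0)) for _ in range(500)] + \
         [("B", random.gauss(5.0, 1.0)) for _ in range(500)]

overall = sum(s for _, s in scores) / len(scores)
group_mean = {
    g: sum(s for gg, s in scores if gg == g) / 500
    for g in ("A", "B")
}

# "Regress off" the group effect: subtract each group's estimated offset.
adjusted = [(g, s - (group_mean[g] - overall)) for g, s in scores]

adj_mean = {
    g: sum(s for gg, s in adjusted if gg == g) / 500
    for g in ("A", "B")
}
print(adj_mean)  # both group means now equal the overall mean
```

This one-covariate correction is the simplest possible version; a real system would need to separate the component of the gap caused by biased sentencing from any component explained by legitimate factors, which is exactly where the political difficulty lies.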
Whilst the aims of the company, to remove racial biases from sentencing decisions, seem undeniably good, the reality for many black Americans has been simply a continuation of systemic bias at the hands of the judicial system. This serves to remind engineers that although their ideas may seem virtuous at first, they may have unethical consequences, consequences that can be countered by considering the wider effects of their designs.