AI: Artificial Incarceration

Group 31

Incarceration is a racial issue globally, but its effects are most significant in America, where around 2 million citizens are behind bars and a disproportionate number of them are African American. Big data analytics, AI, and machine learning have revolutionised countless industries and businesses, promising a more rigorous and powerful statistical approach to decision making. One such application is in the American justice system, where a company named Northpointe has developed an algorithmic model, COMPAS, that uses questionnaire responses to forecast an offender’s likelihood of recidivism (committing a crime after release from jail) and so inform judges’ decisions about sentence length. Though proponents claim that this removes the racial biases of judges, which often lead to African Americans receiving harsher sentences for similar crimes, the algorithm itself has been found to discriminate based on race.

In theory, algorithms provide a purely data-based recommendation for incarceration duration, and since race is not included as an input, it seems intuitive that they should not discriminate by race. In practice, the inputs are generated from 121 questions drawn from conviction data and a self-fill questionnaire, many of which could encode race indirectly, including area of residence and number of friends in jail, as well as qualitative questions about general anger and wellbeing. These algorithms have been found to recommend more severe punishments for black offenders with clean criminal records who have committed non-violent crimes than for white offenders with violent criminal histories.
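
To make this proxy effect concrete, here is a minimal Python sketch with entirely synthetic data; the feature names, rates and model are our own illustration, not Northpointe’s actual system. Race is never given to the model, yet biased labels plus correlated features still produce racially skewed risk scores.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
race = rng.integers(0, 2, n)               # group label, hidden from the model
offend = rng.binomial(1, 0.30, n)          # same true offending rate in both groups
# Over-policing: offences by group 1 are detected twice as often,
# so the training label ("rearrested") is biased even though offending is not.
catch_rate = np.where(race == 1, 0.6, 0.3)
rearrested = offend * rng.binomial(1, catch_rate)

# Proxy features: correlated with race, not with true offending.
neighbourhood = race + rng.normal(0, 0.5, n)
friends_in_jail = rng.poisson(0.5 + race, n)

X = np.column_stack([neighbourhood, friends_in_jail])
scores = LogisticRegression().fit(X, rearrested).predict_proba(X)[:, 1]
print(f"mean risk score, group 0: {scores[race == 0].mean():.3f}")
print(f"mean risk score, group 1: {scores[race == 1].mean():.3f}")
# Group 1 scores come out markedly higher despite identical true offending.
```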

A key incentive behind the use of a statistical approach is to make recommendations purely on the basis of quantitative data, in the hope of avoiding any institutional bias against an offender. However, on reading through an example of the questionnaire used to generate data for the algorithm, it becomes clear that many of the questions are entirely subjective. For example, one question, “Based on the screener’s observations, is this person a suspected or admitted gang member?”, is clearly influenced by the bias of the questionnaire administrator. As such, an implementation of the questionnaire in which questions are limited to factual statements would improve the impartiality of the process.

“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice,”  

U.S. Attorney General Eric Holder, 2014

Although the act of committing a crime is not in accordance with the freedom principle, offenders have a fundamental right to be sentenced appropriately for their crimes. The biases displayed by the COMPAS tool, uncovered by a ProPublica investigation, strip this right from black offenders, which is unacceptable in a liberal democracy.

Continued and expanded use of this method of sentencing could have knock-on consequences for society as a whole. The developed world has a growing problem of rising prisoner numbers, especially in America, where the costs are felt both socially, through the fabric of the country, and materially, with an estimated burden of around $1.2 trillion. There is an inherent risk that, because the algorithm learns from past data, sentencing decisions could become resistant to penal reform: were there a big push to rehabilitate black prisoners, it would take a long time for the changes to show up in the dataset, and black offenders could continue to be disproportionately sentenced.
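
As a hypothetical illustration of that lag (all numbers invented), suppose a risk model is retrained each year on the previous ten years of outcomes, and reform halves the true recidivism rate in year 10; the model’s estimate then takes roughly another decade to catch up.

```python
import numpy as np

# Hypothetical: true recidivism is 40% for ten years, then reform halves it.
true_rate = np.array([0.40] * 10 + [0.20] * 15)
window = 10  # the model is retrained on the previous ten years of data

for year in range(1, len(true_rate)):
    seen = true_rate[max(0, year - window):year]   # data available at sentencing
    print(f"year {year:2d}: true rate {true_rate[year]:.2f}, "
          f"model estimate {seen.mean():.2f}")
```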

The idea behind the process is a virtuous one, which could be used in a utilitarian approach to justice. Statistics-based decision making can reduce the risk to society by removing human errors of judgement with regard to early release. Furthermore, an approach that focuses on measurable social improvement over political rhetoric could foster a transition away from penal populism, which has driven the rise in prison populations across the developed world, and back to an agenda of rehabilitation, which has been shown to reduce overall crime rates and which, the argument goes, fits Kantian universalisability given its good consequences.

Graph Showing Re-Offending Rates

In a specific rebuttal to criticism of their methods, Northpointe published a report defending their COMPAS tool against accusations of bias, claiming that opponents “used the incorrect classification statistics” and should instead have taken into account racial base rates of recidivism. One analysis, which compared recidivism predictions in Maryland when including and excluding race from the algorithm, found the effect of including race to be insignificant.
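
The disagreement is ultimately about which fairness statistic should be held fixed. A toy simulation (invented numbers, not Northpointe’s or ProPublica’s data) shows that a score can be perfectly calibrated within each group and still flag a larger share of the non-reoffenders in the group with the higher base rate, which is precisely the false-positive disparity at the heart of the dispute.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(score_mean, n=200_000):
    # Calibrated by construction: each person reoffends with probability
    # equal to their score, so P(reoffend | score) = score in both groups.
    score = np.clip(rng.normal(score_mean, 0.15, n), 0.01, 0.99)
    reoffend = rng.binomial(1, score)
    flagged = score > 0.5                      # "high risk" cut-off
    fpr = flagged[reoffend == 0].mean()        # non-reoffenders flagged high risk
    return reoffend.mean(), fpr

for group, mean in [("lower base rate", 0.35), ("higher base rate", 0.50)]:
    base, fpr = simulate(mean)
    print(f"{group}: base rate {base:.2f}, false-positive rate {fpr:.2f}")
```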

There are a number of improvements which, in conjunction with judicial reform, could make algorithm-enhanced sentencing fairer. Because the algorithms must use past data to make future predictions, they learn from a system already steeped in bias. For instance, if biased judges have sentenced young black males disproportionately harshly compared to the rest of the population, then statistical methods may continue this trend, especially as longer sentences themselves increase the rate of recidivism.

One solution would be to modify the questionnaire to use more objective questions, or to remove “self-reporting” items that have a clear potential to encode race. Another would be to estimate a “race” factor by which minorities have been disadvantaged and apply a correction to the suggested sentence length, or to build a regression model that “regresses off” race as a category in the data; a sketch of the latter follows. However, in an age of penal populism, where politicians must seem tough on crime, such reform seems unlikely.
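
As a sketch of the “regress off” idea (our own illustration of a standard statistical technique, not anything Northpointe has implemented), each input feature is replaced by its residual after a linear fit on race, so that the cleaned features carry no linear race signal; note that non-linear proxies can survive this treatment.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def residualise(X, race):
    """Strip from each column of X the component linearly predictable from race."""
    race = np.asarray(race, dtype=float).reshape(-1, 1)
    cleaned = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        fit = LinearRegression().fit(race, X[:, j])
        cleaned[:, j] = X[:, j] - fit.predict(race)
    return cleaned

# Usage: fit the risk model on residualise(X, race) instead of X. The cleaned
# features are uncorrelated with race, though non-linear proxies may remain.
```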

Conclusion

Whilst the aims of the company – to remove racial biases from sentencing decisions – seem undeniably good, the reality for many black Americans has been simply a continuation of systemic bias at the hands of the judicial system. This serves to remind engineers that although their ideas may seem virtuous at first, they can have unethical consequences, which can only be countered by considering the wider effects of their designs.

11 thoughts on “AI: Artificial Incarceration”

  1. This is, frankly, a terrifying development. I don’t find Northpointe’s counter-argument persuasive, but I would like to know whether the reoffending predictions delivered more false positives or false negatives, and whether you (and Northpointe, and the justice system) think this makes a difference. Is Blackstone’s formulation built into this algorithm? “It is better that ten guilty persons escape than that one innocent suffer”.

    1. The algorithm didn’t actually decide whether people were guilty or not guilty – only a jury can do that! Judges decide the sentence upon a guilty verdict and the tool informs their judgement by giving a risk of re-offending (which we think strongly encodes race). Were there to be some sort of predictive capacity for informing a jury on guilt, that would surely be problematic!

  2. Statistics and algorithms are undeniably useful, especially for engineers and the like, but I think the judicial system is an area that requires more than a mere score out of 10 to evaluate apparent risk, regardless of how thorough the questioning is. The judicial system is nowhere near perfect though, and reform is paramount. Perhaps similar methods could be employed in conjunction with the system?

    Northpointe argued that “when judges see a defendant’s risk score, they need not consider the defendant’s race when interpreting it” – is this the best way? Maybe we should consider the defendant’s race. When you consider that the American government throughout history has been shown to be racist towards African Americans (the habitual offender law, amongst many, many others), I feel as though the issue lies with the rigged system of locking up African Americans, and not with ensuring an apparently “bias-free” statistical approach.

    1. We are very much of the opinion that informing judicial decisions with statistics is a noble and useful activity. The Northpointe implementation is counterproductive due to its lack of transparency, its questionable methodology and its crude output.

      Great point – one of the big problems in penal policy at the moment is the big shift from rehabilitation to populist policies, which makes reform very unlikely. If you’re interested, Nicola Lacey talks about it extensively in this lecture: https://socialsciences.exeter.ac.uk/media/universityofexeter/collegeofsocialsciencesandinternationalstudies/lawimages/hamlyntrust/Lacey.pdf One big problem is that the best way to make a non-violent person into a violent offender is to lock them up (the longer behind bars, the higher the chance of reoffending). Considering that such a large proportion of young black men are currently behind bars (stemming from systemic racism and a resultant vicious cycle), perhaps it would be prudent to consider race in the sentencing process and stop locking up young black men for petty crimes!

  3. You say: “a disproportionate number of these (2 million US prisoners) are African American.” I agree with your statement, having read a number of articles on this matter. However, could you give a figure, please? As a rough rule, we’d expect the percentage of African American prisoners to be similar to the percentage of African Americans in the overall US population (~12.6%). Giving this figure strengthens your article. As I say, I recall seeing that the percentage of African American prisoners is significantly higher.

    Edit: It’s 37% according to:
    https://en.wikipedia.org/wiki/Incarceration_in_the_United_States

    As tempting as it is to say that social background may have an influence, the comment that “these algorithms often recommend more severe punishments for black offenders with clean criminal records who have committed non-violent crimes than for white offenders with past criminal history involved in violent crimes” is alarming!

    Is this due to built-in bias or as a consequence of the questionnaires used by the algorithms? I’d like to understand the reasons for this observation.

    This is a fascinating article. Thank you.

    1. We generally feel that the bias is in no way intended by the algorithm’s creators, merely an unavoidable consequence of the way it is implemented. Take, for example, the known statistic that a greater proportion of African Americans are incarcerated than white Americans. If the algorithm uses the defendant’s address as an input, and the defendant lives in a neighbourhood primarily populated by African Americans, then the algorithm will work out that people from this neighbourhood are more likely to commit crimes. The issue is that any questionnaire detail that gives any indication of the defendant’s race, even indirectly as in this example, will reinforce the current state of affairs.

  4. Very thought-provoking article. Law, like many ethical areas, is a grey area where there is rarely a right or wrong answer. For AI this could pose a problem, as it will simply calculate a 1 or a 0.
    An individual is more likely to commit a crime if they are from a poorer background; would looking at these numbers give a different view of crime as a whole?
    Legal cases are decided on a case-by-case basis, and a judge could give a harsh sentence for any number of reasons. I believe that before AI can be properly implemented, the legal system needs reviewing so that crimes and punishments are better defined, removing some of the grey area.

  5. This is such an interesting article, yet a bit terrifying. I was not aware of this development in the technological/legal fields.
    Although I agree that the company had good intentions when they created the algorithm, I don’t think this will reduce the unfairness of the judicial system in the long term. I do agree that the company should change the questionnaire to more objective questions, because questions like the one mentioned, “Based on the screener’s observations, is this person a suspected or admitted gang member?”, should definitely not be part of it, as they are clearly subject to bias.
    As for the use of AI, I’m still on the fence about the implementation of this algorithm, because I believe that the guilty/not-guilty decision should be made by a jury.

  6. Very interesting. It is good that there might be potential to remove human bias from judicial decisions and address the inherent flaws in the justice system.

  7. It seems nonsensical to suggest that the AI apply a “race” factor to account for the disadvantaged background experienced by African Americans. By this logic, a factor should be given to reduce sentences for women, Latin Americans, those with low IQ, low-income groups and potentially left-handed people. The role of the judicial system is to protect the citizen to the best of its ability, not to cater for the needs of the convict. This is also a backwards approach: surely we should be looking to create strategies to reduce the discrimination and disadvantages experienced by African Americans?

  8. Firstly, the only person who can predict with certainty that an offender will re-offend is that offender themselves. It is difficult to determine whether data on offenders’ social conditions could predict levels of recidivism. The use of self-fill questionnaires is important, but would a convict shed light on the possibility of re-offending, considering they are incarcerated? Social conditions alone do not constitute crime; they may statistically increase the possibility of committing a crime, but it is the person, their decisions and the opportunities presented to them that facilitate the act. A person’s thought process would need to be accurately predicted before we could safely say that there is a chance of recidivism. The film that springs to mind is ‘Minority Report’. Arguably, the ability to read a person’s thought processes and incarcerate them before a crime is committed holds a higher moral authority than changing decision making ‘about sentence length’ using this ‘questionnaire based on a forecast of a criminal’s likelihood of recidivism’.

    The data analysed by Northpointe serve only to show the institutional social injustice in America. They should be used to target those who may be likely to re-offend and improve their social conditions, not to determine the length of a conviction. If the dataset signifies that gang members are more likely to offend, then the penal system should focus on initiatives to reduce gang membership. It is a shame that these algorithms are being used ‘unethically’ to further swell prison populations; that is not a way to improve society as a whole, but a way to fill the pockets of the prison corporations, or rather a symptom of the lack of incentive to spend money improving the living conditions in America that currently facilitate crime.
