In February 2013 a Wisconsin man, Eric Loomis, became the first criminal to receive a sentence strongly influenced by artificial intelligence (AI). Loomis was deemed high risk by COMPAS, a proprietary risk-assessment algorithm, and subsequently sentenced to six years in prison. Loomis appealed, arguing that reliance on the risk score violated his due process rights because the private company’s algorithm was not open to scrutiny. The appeal was rejected.
Following the Loomis case, COMPAS had aided in the sentencing of more than 7,000 US arrestees by 2016, drastically reducing the time and cost of legal trials. Many now argue, however, that the algorithm is heavily biased and is producing unfair sentences. Should these algorithms become a vital legal tool, or are they adding to the corruption of our prosecution systems?
The Good of the People?
From a Utilitarian standpoint, the long-term happiness of the population favours the implementation of AI sentencing software. The improved turnaround time of cases would move criminals onto a path of reformation more efficiently, allowing them to contribute beneficially to society sooner. AI would also decrease a defendant’s time in holding, benefiting the general public by saving taxpayers’ money. Beyond the immediate reduction in processing time, a longer-term implication is that the software would spur further innovation in the field, with the eventual goal of a fully automated judicial system. This would benefit the masses by reducing trial costs and removing the need for tedious jury service.
It could be argued, however, that majority happiness is not achieved in the short term, as the high capital cost of the software is itself undesirable. The software has also been incorrect in many cases: criminals determined to be low risk have quickly reoffended, while criminals judged high risk have gone on never to reoffend. This raises doubts about the validity and consistency of such unreliable software, as well as the morality of introducing it into such an important role. It therefore cannot be said that introducing the software provides happiness to the majority of the population, as there is the possibility of genuinely high-risk criminals serving shorter sentences and returning to the streets sooner. This is dangerous for the general public and unjust to those criminals wrongly deemed high risk.
Stickler for the Rules
AI-assisted sentencing can refer to a database of all previous trials against which the trial in question can be compared; this would provide the fairest, most moral result by applying consistent standards and regulations, upholding Kant’s moral law. Legal teams, by contrast, can only recall details from as many cases as the memory of the individuals allows. By collating historic cases of a similar nature, the system could remove the sentimental or prejudicial bias frequently shown by judges. The sentencing software would also help nullify the effect of bribing judges in countries rife with corruption, as the software cannot be influenced by the humans in the courtroom.
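To make this mechanism concrete, a minimal sketch of such a comparison is shown below. It assumes invented case features (prior_convictions, offence_severity and so on) and a toy nearest-neighbour averaging rule; COMPAS’s real model is proprietary and far more complex, so this is purely illustrative rather than the actual method.

```python
from dataclasses import dataclass

@dataclass
class Case:
    # Hypothetical, simplified features; a real system would use far richer inputs.
    prior_convictions: int
    age_at_offence: int
    offence_severity: int   # invented scale: 1 (minor) to 10 (severe)
    sentence_months: int    # outcome recorded for historical cases

def similarity(a: Case, b: Case) -> float:
    """Toy similarity: smaller feature differences give a higher score."""
    diff = (abs(a.prior_convictions - b.prior_convictions)
            + abs(a.age_at_offence - b.age_at_offence)
            + abs(a.offence_severity - b.offence_severity))
    return 1.0 / (1.0 + diff)

def suggest_sentence(new_case: Case, history: list[Case], k: int = 3) -> float:
    """Average the recorded sentences of the k most similar historical cases."""
    nearest = sorted(history, key=lambda c: similarity(new_case, c), reverse=True)[:k]
    return sum(c.sentence_months for c in nearest) / len(nearest)

history = [
    Case(prior_convictions=0, age_at_offence=25, offence_severity=3, sentence_months=6),
    Case(prior_convictions=2, age_at_offence=31, offence_severity=5, sentence_months=18),
    Case(prior_convictions=5, age_at_offence=40, offence_severity=8, sentence_months=60),
]
new = Case(prior_convictions=1, age_at_offence=28, offence_severity=4, sentence_months=0)
print(suggest_sentence(new, history, k=2))  # averages the two most similar past outcomes
```

Because the suggestion is nothing more than an aggregate of past outcomes, whatever consistency it gains comes entirely from the historical record it is given.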
The underlying dataset, however, can be argued to be morally questionable. According to reports, judges issue on average 19.1% longer sentences to black men than to white men who have committed the same crime. If the databases provided to the AI already show an overwhelming bias towards giving black defendants longer sentences, this racial prejudice could be perpetuated, as the AI can only refer to previous cases. This is, in effect, social and racial oppression of a large percentage of the population.
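This perpetuation can be shown with a small hypothetical calculation. Every figure below except the reported 19.1% gap is invented, and the “model” does nothing but mirror historical averages per group:

```python
# Toy illustration of bias perpetuation; only the 19.1% gap comes from the report above,
# all other numbers are invented for the example.
baseline_sentence = 24.0   # hypothetical average sentence (months) for one group
historical_gap = 0.191     # reported 19.1% longer sentences for the other group

# A system that simply reproduces historical averages recommends the same disparity.
recommended_a = baseline_sentence
recommended_b = baseline_sentence * (1 + historical_gap)

print(recommended_a, recommended_b)  # 24.0 28.584
```

No matter how consistently such a rule is applied, the disparity baked into the data survives intact.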
A problem also arises when a case has no obvious similarities to any previous case, decreasing the reliability of the software. AI cannot reliably understand the inner workings of the human brain, or a person’s true nature, from the answers to 137 multiple-choice questions. The software could also be said to be immoral because Northpointe, the private company behind COMPAS, has developed it in a way that makes it impossible to understand how a sentence was constructed. It could be argued that this violates aspects of Kantian ethics: the nature of the code makes it impossible to understand how the risk score was determined, and all that is known are the inputs and the outputs. Every criminal has the right to understand how their sentence was constructed in terms of criminal history, social background, character and so on, something this software does not allow.
The Facts, the Whole Facts, and Nothing but the Facts
Part of the current sentencing process involves assessing an individual’s potential to reoffend, based upon their character prior to the crime. This often causes public outcry, as first-time violent offenders receive reduced sentences because little is known about them. An AI system could analyse the background information and use only the facts that directly relate to the law, omitting irrelevant information such as personal details.
The sentencing algorithm can also be considered a violation of virtue ethics. If virtue ethics is concerned with the character of the acting person, how could a non-sentient machine or piece of software possibly attain any moral characteristics? Since the software has no ability to develop desirable traits, it cannot be deemed moral: the code has no conception of what a moral decision actually is.
Northpointe, the company that developed the COMPAS algorithm, is completely within its rights to develop the software as it pleases, as is the government to adopt and use it. However, implementing the software could have an adverse effect on the rights of the people subject to its decisions, and it is these rights that demand ethical scrutiny. Consider a defendant judged high risk by the software after committing a purely accidental crime, despite a clean criminal record, good character and a good background. Denying this arrestee any understanding of how the sentence was constructed significantly damages their wellbeing.
It is obvious that AI sentencing software is the T-1000 of the criminal prosecution system – a technique that has to be terminated.