
The Measure of a Man – Why I Wrote the Story


My short story, The Measure of a Man, is based on both fact and imagination. Early in 2018, I learned of a trend in the criminal justice system toward using artificial intelligence. One form it takes is a computer-generated risk assessment used at sentencing. Alarmingly, some of those assessments are based on secret algorithms. I decided to write a story featuring this controversial practice.

Risk Assessments Based on Secret Algorithms.

In The Measure of a Man, the judge relies on a computer-generated report to sentence the two main characters to jail. The report claims to assess the risk that each defendant will be arrested again in the future.

In a real case in 2013, Eric Loomis was sentenced to jail in Wisconsin. The judge who sentenced Mr. Loomis relied substantially on a risk-assessment report produced by a company called Northpointe. The results in the report were generated by a computer program named COMPAS. The data and methodology of the algorithm used in COMPAS were deemed proprietary by Northpointe and kept secret. Neither Mr. Loomis, his attorney, nor the judge who sentenced him knew the factors being used to determine the risk that Mr. Loomis would commit future crimes.

The sentencing judge set aside Mr. Loomis' plea deal and imposed a longer sentence based primarily on the COMPAS risk assessment. Loomis appealed to the Wisconsin Supreme Court, alleging that his constitutional right to due process was violated because of the secrecy surrounding the report. In 2016 the Wisconsin Supreme Court upheld the sentence but stated that, due to the potential for error and bias in such secret artificial intelligence, it should never be the sole basis for a sentence.

Research on Bias in Human and Artificial Intelligence.

There has been a great deal of criticism of programs like COMPAS due to the secrecy of their algorithms. Well-documented psychological research tells us that every person has hidden or implicit biases that affect their decisions. Police, judges, and lawyers all have biases.

One goal of artificial intelligence in criminal sentencing is to help eliminate the personal bias of judges. However, the programmers who create the algorithms are human and therefore biased themselves. And when the process is secret, there is no way to verify the accuracy of the information used or the workings of the algorithm that produces the risk-assessment report.

Research also tells us that an algorithm, once deployed, will not only perpetuate any bias built into it but can also reinforce that bias over time. When the algorithm's own past predictions shape the data it later learns from, even false results get fed back into future computations. In machine learning this is called a "feedback loop." A number of other biases can also arise in such programs.
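The feedback loop can be sketched with a toy simulation. This is a hypothetical model with invented numbers, not the COMPAS algorithm: two districts have exactly the same true crime rate, but the model starts with a slight, unfounded bias toward district A, allocates patrols according to its risk scores, and then "retrains" only on the arrests its own patrol allocation produced. Because the model never sees data its predictions didn't generate, the initial bias never corrects.

```python
# Toy feedback-loop simulation (hypothetical, for illustration only).
# Both districts share the SAME underlying crime rate; only the model's
# initial risk scores differ.

TRUE_RATE = 0.10                  # identical true crime rate in A and B
risk = {"A": 0.55, "B": 0.45}     # initial risk scores: slight bias toward A

for year in range(5):
    total_risk = sum(risk.values())
    # Allocate 100 patrols in proportion to predicted risk.
    patrols = {d: 100 * risk[d] / total_risk for d in risk}
    # Expected arrests: more patrols in a district -> more recorded arrests,
    # even though the true rate is identical everywhere.
    arrests = {d: patrols[d] * TRUE_RATE for d in risk}
    # "Retrain" on the model's own output: each district's new risk score
    # is its share of recorded arrests.
    total_arrests = sum(arrests.values())
    risk = {d: arrests[d] / total_arrests for d in risk}
    print(f"year {year}: A={risk['A']:.3f}  B={risk['B']:.3f}")
```

Every year the scores stay at 0.55 and 0.45: the skewed arrest record confirms the skewed prediction, so the model has no way to discover that the two districts were identical all along. Real systems with noise and reinforcement effects can drift even further from the truth.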

The Future of Artificial Intelligence in the Judicial System.

To date, the U.S. Supreme Court has not ruled on a case involving criminal sentencing based on secret artificial intelligence. However, in 2017 Chief Justice John Roberts, while speaking at Rensselaer Polytechnic Institute, was asked this question: "Can you foresee a day when smart machines, driven with artificial intelligence, will assist with courtroom fact-finding, or, more controversially even, judicial decision-making?" Chief Justice Roberts answered, "It's a day that is here and it's putting a significant strain on how the judiciary goes about doing things." Perhaps the Supreme Court will take up this important issue soon.

The Measure of a Man explores the lives of two men and how human failure, a flawed system, and secret artificial intelligence can tragically twist their fates. Can any power on Earth save them?
