Friday, May 26, 2017

Eric Loomis: Wisconsin; Technology; (Indiana Risk Assessment System): Judgment by algorithm; (Whatever happened to King Solomon? HL)


STORY: "Judgment by algorithm," by Marilyn Odendahl, published by The Indiana Lawyer on May 17, 2017.

PHOTO CAPTION: "Eric Loomis just wants to know how, exactly, an algorithm determined he posed a high risk to the community."

GIST: "The Wisconsin man was charged in connection with a February 2013 drive-by shooting in La Crosse. He has continued to maintain he was home cooking dinner at the time of the incident, but he did plead guilty to two related offenses of trying to flee police and driving a motor vehicle without the owner’s permission. Sentencing him to six years of incarceration and five years of extended supervision, the trial judge pointed to Loomis’ history of recidivism and failure to take responsibility. Also, the court noted the result of an actuarial assessment tool that labeled him likelier to reoffend.

Loomis filed a motion for post-conviction relief, arguing, in part, that his due process rights were violated because the court relied on the computer-based assessment program COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). A central question is whether the claims by COMPAS developer Northpointe Inc. — that the calculations were proprietary — prevented Loomis from challenging the scientific validity of the assessment. Now the Supreme Court of the United States is considering taking a closer look. Loomis filed a petition for writ of certiorari in October 2016 after the Wisconsin Supreme Court denied his motion for resentencing. In March 2017, the court invited the acting solicitor general to file a brief on the case.

Risk assessments and algorithms are not new. They have been used routinely by criminal justice systems in many states to predict how likely a defendant or offender is to commit another crime. Indiana has a set of such tools to help determine conditions for pretrial release, community supervision, prison intake, and re-entry. The Indiana Risk Assessment System scores an individual’s responses to a series of questions that range from criminal history and substance abuse to employment and social support. Like the Wisconsin tool, the IRAS scores are based on algorithms.
Both the Indiana Public Defender Council and the Indiana Prosecuting Attorneys Council expressed reservations about IRAS. The organizations note IRAS scores have not been validated for accuracy when applied to Indiana offenders. Moreover, they worry judges will rely more on the numerical score than on their own discretion. IRAS scores are actuarial, not individualized. The assessment tools put defendants or offenders in certain groups based on shared characteristics. The IRAS-Pretrial Assessment Tool labels an individual as low-, medium- or high-risk, depending on the typical behavior of the group.

Noting risk assessment tools are not going anywhere, David Powell, executive director of the Indiana Prosecuting Attorneys Council, said they must provide a valid, accountable, objective analysis, and the outcomes have to be measured. Algorithms cannot replace personal judgment, he said, giving the example that IRAS does not consider the nature of the crime. A person charged with a heinous act might be deemed low-risk if it was the first crime he or she committed. Prosecutors who turn decision-making over to the victims have been disciplined for transfer of discretion, and Powell sees the same possibility for running afoul of court rules when judges rely too much on IRAS. “Prosecutors are not big fans of IRAS, by and large,” he said. “The fear we have is the tool will be used in lieu of judicial discretion.”

News reports, including a series done by ProPublica in 2016, have focused on the potential for bias against minorities built into these kinds of assessment tools. Landis described particular questions in the IRAS, which he said ask the age of the offender’s first arrest and whether the offender has any relatives in prison, as being racially biased."
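A reader may wonder what an "actuarial, not individualized" score actually looks like in practice. The sketch below illustrates the general style of such instruments: points are added per answered item and the total is mapped to a coarse band. To be clear, every item, weight, and cutoff here is invented for illustration; this is not the real IRAS or COMPAS formula (the inaccessibility of the actual COMPAS calculations is precisely what Loomis is challenging).

```python
# Hypothetical additive risk score in the style of actuarial tools
# like IRAS. All items, point values, and cutoffs are invented for
# illustration; they are NOT the real instrument.

def score_responses(responses: dict) -> int:
    """Sum per-item points for a set of interview responses."""
    points = {
        "prior_arrests_3_plus": 2,
        "age_first_arrest_under_18": 1,
        "unemployed": 1,
        "substance_abuse_history": 2,
        "weak_social_support": 1,
    }
    return sum(pts for item, pts in points.items() if responses.get(item))

def risk_band(score: int) -> str:
    """Map a raw score onto a coarse group label (cutoffs invented)."""
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"

responses = {"prior_arrests_3_plus": True, "unemployed": True}
score = score_responses(responses)
print(score, risk_band(score))  # a total of 3 falls in the "medium" band
```

Note what the sketch makes concrete: the label reflects group membership, not the individual. Anyone whose answers produce the same total gets the same band, and nothing in the calculation considers the nature of the charged offense, which is exactly the limitation Powell describes above.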
The entire story can be found at:

http://www.theindianalawyer.com/judgment-by-algorithm/PARAMS/article/43719

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog;