Thursday, January 5, 2017

ProPublica: Technology (1): ProPublica continues its investigation of "(a)lgorithmic injustice and the formulas that increasingly influence our lives."..."ProPublica’s analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed — if the algorithms focus on the fairness of outcomes."..."There’s software used across the country to predict future criminals. And it’s biased against blacks. When we looked at the people who did not go on to be arrested for new crimes but were dubbed higher risk by the formula, we found a racial disparity. The data showed that black defendants were twice as likely to be incorrectly labeled as higher risk than white defendants."


STORY: "Bias in Criminal Risk Scores Is Mathematically Inevitable, Researchers Say," by reporters Julia Angwin and Jeff Larson, published by ProPublica on December 30, 2016 as part of an ongoing investigation;

SUB-HEADING: ProPublica’s analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed — if the algorithms focus on the fairness of outcomes

SUB-HEADING: Machine bias: "We’re investigating algorithmic injustice and the formulas that increasingly influence our lives."

GIST: "The racial bias that ProPublica found in a formula used by courts and parole boards to forecast future criminal behavior arises inevitably from the test’s design, according to new research. The findings were described in scholarly papers published or circulated over the past several months. Taken together, they represent the most far-reaching critique to date of the fairness of algorithms that seek to provide an objective measure of the likelihood a defendant will commit further crimes. Increasingly, criminal justice officials are using similar risk prediction equations to inform their decisions about bail, sentencing and early release. The researchers found that the formula, and others like it, have been written in a way that guarantees black defendants will be inaccurately identified as future criminals more often than their white counterparts. The studies, by four groups of scholars working independently, suggest the possibility that the widely used algorithms could be revised to reduce the number of blacks who were unfairly categorized without sacrificing the ability to predict future crimes..."There’s software used across the country to predict future criminals. And it’s biased against blacks. When we looked at the people who did not go on to be arrested for new crimes but were dubbed higher risk by the formula, we found a racial disparity. The data showed that black defendants were twice as likely to be incorrectly labeled as higher risk than white defendants."
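The disparity described above is, in statistical terms, a difference in false positive rates: among people who were NOT rearrested, what share of each group was nonetheless scored as high risk. The following is a minimal sketch of that calculation — it is not ProPublica's actual analysis code, and the records below are entirely hypothetical, constructed only to illustrate a two-to-one disparity like the one the story reports:

```python
def false_positive_rate(records, group):
    """Among members of `group` who were NOT rearrested, the fraction
    the score nonetheless labeled high risk (the false positive rate)."""
    negatives = [r for r in records if r["group"] == group and not r["rearrested"]]
    if not negatives:
        return 0.0
    false_positives = [r for r in negatives if r["high_risk"]]
    return len(false_positives) / len(negatives)

# Hypothetical data: each record notes the person's group, whether the
# formula scored them high risk, and whether they were later rearrested.
records = [
    {"group": "A", "high_risk": True,  "rearrested": False},
    {"group": "A", "high_risk": True,  "rearrested": False},
    {"group": "A", "high_risk": False, "rearrested": False},
    {"group": "A", "high_risk": False, "rearrested": False},
    {"group": "B", "high_risk": True,  "rearrested": False},
    {"group": "B", "high_risk": False, "rearrested": False},
    {"group": "B", "high_risk": False, "rearrested": False},
    {"group": "B", "high_risk": False, "rearrested": False},
]

fpr_a = false_positive_rate(records, "A")  # 2 of 4 non-rearrested: 0.5
fpr_b = false_positive_rate(records, "B")  # 1 of 4 non-rearrested: 0.25
```

In this toy data, group A's false positive rate (0.5) is twice group B's (0.25) — the shape of the disparity the researchers found, though the real analysis involved thousands of records and controlled for other factors.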

The entire story can be found at:

https://www.propublica.org/article/bias-in-criminal-risk-scores-is-mathematically-inevitable-researchers-say

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog;