Tuesday, June 12, 2018

Technology series (5): Statistician Kristian Lum - of the Human Rights Data Analysis Group - expresses candid views on artificial intelligence and the controversial use of predictive policing and sentencing programs (in Bloomberg Businessweek): "A selling point for a lot of people is that they think a computer can’t be racist, that an algorithm can’t be racist. I wanted to challenge the idea that just because it’s a computer making the predictions, that it would solve those problems."


PUBLISHER'S NOTE: Technology series: This Blog has been increasingly drawn into the world of technology in terms of its ability to cause wrongful prosecutions, to provide the police with excessive powers, to make decisions in courtrooms about matters such as bail and sentencing, and to impact on individual privacy and dignity. The series will also make clear that one should not evaluate technology such as artificial intelligence - and the algorithms it is based on - solely within the confines of the criminal justice system. Artificial intelligence is quietly and quickly spreading into many aspects of our lives. We must be aware of its total impact on us as individuals and on our society as well.

Harold Levy: Publisher; The Charles Smith Blog.

-----------------------------------------------------------

STORY: "The Data Scientist Helping to Create Ethical Robots," published by Bloomberg  Businessweek on May 15, 2018. Ellen Huet interviews Kristian Lum;

SUB-HEADING:  "Kristian Lum is focusing on artificial intelligence and the controversial use of predictive policing and sentencing programs."


GIST: "As the lead statistician at the nonprofit Human Rights Data Analysis Group, Kristian Lum, 33, is trying to make sure the algorithms increasingly controlling our lives are as fair as possible. She’s especially focused on the controversial use of predictive policing and sentencing programs in the criminal justice system...

You’re studying how machine learning is used in the criminal justice system. What drew you to that?
A few years ago, I read this really interesting paper published by a predictive policing company. We reproduced the model they had built and applied it to some real data to look at what the consequences would be. We applied our model to the police records of drug crime in Oakland [Calif.] and compared that to an estimate of the demographic profile of people likely to be using drugs based on public-health records. That comparison found that the police enforcement and recording of drug crimes was disproportionately taking place in communities of color. Then we applied the predictive policing algorithm to that data set and found it would perpetuate or perhaps amplify the historical bias already in that data. The move toward using AI, or quantitative methods, in criminal justice is at least in part a response to a growing acknowledgment that there’s racial bias in policing. A selling point for a lot of people is that they think a computer can’t be racist, that an algorithm can’t be racist. I wanted to challenge the idea that just because it’s a computer making the predictions, that it would solve those problems.
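The Oakland experiment is described only in outline here, but the feedback loop Lum identifies is easy to demonstrate. Below is a minimal Python sketch - hypothetical numbers and a deliberately simplified "patrol the hotspot" rule, not the actual predictive policing model her group reproduced. Two neighborhoods have identical true crime rates, but the historical records over-represent one of them, and because crime is only recorded where police are sent, the skewed history reinforces itself.

import random

random.seed(0)

TRUE_RATE = 0.10                  # identical true crime rate in both places
records = {"A": 60, "B": 40}      # biased history: A is over-represented

for day in range(500):
    # "Prediction": patrol the neighborhood with the most recorded crime.
    hotspot = max(records, key=records.get)
    # Crime is only recorded where police are present to observe it,
    # even though it occurs at the same rate in both neighborhoods.
    if random.random() < TRUE_RATE:
        records[hotspot] += 1

share_a = records["A"] / sum(records.values())
print(f"Share of records from A: started at 0.60, now {share_a:.2f}")
# The initial skew feeds the prediction, which generates more records in A,
# which reinforces the next day's prediction: a runaway loop.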

Is that a tough sell, the idea that a computer can be biased?
I feel like I can’t open Twitter without seeing another article about the racist AI. What’s hard about this is there isn’t universal agreement about what fairness means. People disagree about what fairness looks like. That’s true in general, and also true when you try to write down a mathematical equation and say, “This is the definition of fairness.”
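Lum's point about competing mathematical definitions can be made concrete. The toy Python example below - entirely hypothetical numbers, not drawn from the interview or any real system - shows a classifier that satisfies one common fairness definition ("demographic parity": both groups flagged at the same rate) while violating another ("equal opportunity": equal true-positive rates across groups).

# Hypothetical (prediction, actual outcome) pairs for two groups:
# 1 = flagged / reoffended, 0 = not flagged / did not reoffend.
group_a = [(1, 1), (1, 0), (0, 0), (0, 0)]
group_b = [(1, 1), (1, 1), (0, 1), (0, 0)]

def flag_rate(pairs):
    # Demographic parity compares the share of each group that gets flagged.
    return sum(pred for pred, _ in pairs) / len(pairs)

def true_positive_rate(pairs):
    # Equal opportunity compares, among people who actually reoffended,
    # the share who were correctly flagged.
    positives = [pred for pred, actual in pairs if actual == 1]
    return sum(positives) / len(positives)

print(flag_rate(group_a), flag_rate(group_b))                    # 0.5 vs 0.5: parity holds
print(true_positive_rate(group_a), true_positive_rate(group_b))  # 1.0 vs ~0.67: opportunity fails

The same predictions can count as fair under one equation and unfair under the other, which is exactly the disagreement Lum describes.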

The entire interview can be read at:
https://www.bloomberg.com/news/articles/2018-05-15/the-data-scientist-helping-to-create-ethical-robots

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog.