STORY: "Decision -making-algorithms: Is anyone making sure they're right," by reporter Michael Kassner, published by Tech Republic on July 8, 2016.
SUB-HEADING: "Is it wise to trust important decisions to an algorithm that might not have been validated by independent parties?"
GIST: "Data-driven algorithms are making decisions that affect many aspects of our lives, and that may be a problem. "While there may be efficiency gains from these techniques, they can
also harbor biases against disadvantaged groups or reinforce structural
discrimination," writes Nicholas Diakopoulos, assistant professor of
journalism, University of Maryland, in his The Conversation piece
We need to know the algorithms the government uses to make important decisions about us. "The public needs to understand the bias and power of algorithms used in the public sphere." The bias and potential for error Diakopoulos alludes to tends to
slip under the radar until an algorithm-based decision negatively
impacts individuals or organizations. That concerns Diakopoulos because
of the following: Data-driven algorithms are used to massage
massive amounts of data into usable information. However, the data is
messy, and the processing even more so. That being the case, how does
one know whether the results are accurate and trustworthy? Individuals
willing to take the time to validate the output of an
algorithm-driven system quite often run into problems because of the lack of
transparency: developers are not willing to provide what might be
considered trade secrets and proprietary software to third parties.
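As a rough illustration of what such validation might involve, here is a
minimal sketch in Python; the records, risk bands, and numbers below are
entirely invented, and no real tool or dataset is being described. Given
each person's predicted risk band and whether they in fact reoffended, one
basic check is whether observed reoffense rates actually rise from the low
band to the high band.

# Hypothetical validation check: observed reoffense rate per predicted band.
# All records below are invented for illustration only.
from collections import defaultdict

records = [
    ("high", True), ("high", False), ("high", True),
    ("medium", False), ("medium", True), ("medium", False),
    ("low", False), ("low", False), ("low", True),
]

counts = defaultdict(lambda: [0, 0])  # band -> [reoffended, total]
for band, reoffended in records:
    counts[band][0] += int(reoffended)
    counts[band][1] += 1

for band in ("low", "medium", "high"):
    reoffended, total = counts[band]
    print(f"{band}: {reoffended}/{total} reoffended ({reoffended / total:.0%})")

Without access to the scores, the outcome data, and the scoring logic
itself, outsiders cannot perform even a check this simple, which is the
transparency problem Diakopoulos describes.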
To give credence to his concerns, Diakopoulos looked into how law
enforcement uses data-driven algorithms. "Last year (2015)
the federal government began studying
the pros and cons of using computerized data analysis to help determine
prison inmates' likelihood of reoffending upon release," writes
Diakopoulos. "Scoring individuals as low-, medium-, or high-risk can
help with housing and treatment decisions, identifying people who can
safely be sent to a minimum security prison or a halfway house, and
those who would benefit from a particular type of psychological care." The first step to
determining an inmate's risk of recidivism, according to Diakopoulos,
begins with filling out scoresheets. He says, "The form itself, as well
as its scoring system, often discloses key features about the algorithm,
like the variables under consideration and how they come together to
form an overall risk score."
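To make that description concrete, here is a minimal sketch in Python of how
such a scoresheet might work: weighted answers are summed into an overall
score, and the score is bucketed into a low, medium, or high band. The
items, weights, and cutoffs below are invented for illustration and are not
taken from the LSI-R or any other real instrument.

# Hypothetical scoresheet: weighted items summed into a score, then bucketed.
# Items, weights, and cutoffs are invented for illustration only.

ITEM_WEIGHTS = {
    "prior_convictions": 2.0,
    "age_at_first_offense": -0.5,
    "employment_status": 1.0,
    "substance_abuse_history": 1.5,
}

CUTOFFS = [(10, "high"), (5, "medium")]  # score >= 10 -> high, >= 5 -> medium

def risk_score(answers):
    """Weighted sum of the scoresheet answers."""
    return sum(ITEM_WEIGHTS[item] * value for item, value in answers.items())

def risk_band(score):
    """Map the numeric score to a low/medium/high band."""
    for cutoff, band in CUTOFFS:
        if score >= cutoff:
            return band
    return "low"

example = {"prior_convictions": 3, "age_at_first_offense": 2,
           "employment_status": 1, "substance_abuse_history": 2}
score = risk_score(example)
print(score, risk_band(score))  # 9.0 medium

Even in this toy form, the choice of items, weights, and cutoffs is the
algorithm, which is why seeing the blank form alone tells the public only
part of the story.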
However, that is not enough, according to Diakopoulos. To have
algorithmic transparency, more information is needed on how the forms were
designed, developed, and evaluated. As to why this is important, he
explains, "Only then can the public know whether the factors and
calculations involved in arriving at the score are fair and reasonable, or
uninformed and biased." One of the reasons Diakopoulos decided to research criminal justice was the ability to use the
Freedom of Information Act
(FOIA) and similar state laws to get information about the forms and
any supporting documentation. Diakopoulos, his colleague Sandy Banisky,
and her media law class submitted FOIA requests in all 50 states. "We
asked for documents, mathematical descriptions, data, validation
assessments, contracts, and source code related to algorithms used in
criminal justice: such as for parole and probation, bail, or sentencing
decisions," writes Diakopoulos. Getting the information was anything but easy, even
figuring out whom to ask was difficult. To make matters worse, several
states denied the researchers' requests, explaining the algorithms are
embedded in software, therefore not subject to the FOIA statutes. Interestingly, nine states refused to disclose any information about
their criminal justice algorithms, stating the software tools were
privately owned. One example, offered by Diakopoulos, was
LSI-R, a recidivism risk questionnaire. The list of refusals goes on and on, making it painfully
apparent why Diakopoulos is concerned about transparency. So much so that he
asks, "[G]iven the government routinely contracts with private
companies, how do we balance these concerns against an explainable and
indeed legitimate justice system?" Even more to the point,
Diakopoulos mentions that the research team did not receive any
information on how the criminal justice risk-assessment forms were
developed or evaluated."
The entire story can be found at:
http://www.techrepublic.com/article/decision-making-algorithms-is-anyone-making-sure-theyre-right
PUBLISHER'S NOTE:
I have added a search box for content in this blog, which now encompasses
several thousand posts. The search box is located near the bottom of
the screen, just above the list of links. I am confident that this
powerful search tool provided by "Blogger" will help our readers and
me get more out of the site.
The
Toronto Star, my previous employer for more than twenty
incredible years, has put considerable effort into exposing the
harm caused by Dr. Charles Smith and his protectors - and into
pushing for reform of Ontario's forensic pediatric pathology
system. The Star has a "topic" section which focuses on recent
stories related to Dr. Charles Smith. It can be found at:
http://www.thestar.com/topic/charlessmith
Information on "The Charles Smith Blog Award"- and its nomination process - can be found at:
http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html
Please
send any comments or information on other cases and issues of
interest to the readers of this blog to:
hlevy15@gmail.com;
Harold Levy;
Publisher: The Charles Smith Blog;