STORY: "New York City moves to create accountability for algorithms," by Lauren Kirchner, published by ProPublica on December 18, 2017. (Lauren Kirchner is a senior reporting fellow at ProPublica. ProPublica is an independent, non-profit newsroom
that produces investigative journalism in the public interest.)
GIST: The algorithms that play increasingly central roles in our lives often emanate from Silicon Valley, but the effort to hold them accountable may have another epicenter: New York City. Last week, the New York City Council unanimously passed a bill to tackle algorithmic discrimination -- the first measure of its kind in the country.

The algorithmic accountability bill, waiting to be signed into law by Mayor Bill de Blasio, establishes a task force that will study how city agencies use algorithms to make decisions that affect New Yorkers’ lives, and whether any of the systems appear to discriminate against people based on age, race, religion, gender, sexual orientation or citizenship status. The task force’s report will also explore how to make these decision-making processes understandable to the public.

The bill’s sponsor, Council Member James Vacca, said he was inspired by ProPublica’s investigation into racially biased algorithms used to assess the criminal risk of defendants. “My ambition here is transparency, as well as accountability,” Vacca said.

A previous, more sweeping version of the bill had mandated that city agencies publish the source code of all algorithms being used for “targeting services” or “imposing penalties upon persons or policing,” and make them available for “self-testing” by the public. At a hearing at City Hall in October, representatives from the mayor’s office expressed concerns that this mandate would threaten New Yorkers’ privacy and the government’s cybersecurity.

The bill was one of two moves the City Council made last week concerning algorithms. On Thursday, the committees on health and public safety held a hearing on the city’s forensic methods, including controversial tools that the chief medical examiner’s office crime lab has used for difficult-to-analyze samples of DNA.

As a ProPublica/New York Times investigation detailed in September, an algorithm created by the lab for complex DNA samples has been called into question by scientific experts and former crime lab employees. The software, called the Forensic Statistical Tool, or FST, has never been adopted by any other lab in the country.

Council Member Corey Johnson, chair of the health committee, quoted two key findings of our investigation: that FST’s inventors had acknowledged a margin of error of 30 percent for one key input of the program, and that the program could not take into consideration that family members might share DNA.

New York City no longer uses the tool for new cases. But officials at the hearing said they saw no need to revisit the thousands of criminal cases that relied on the technique in years past.

“Would you be open to reviewing cases in which testing was done on very small mixtures, or do you feel totally confident in all of the methods and science that were used on every case that’s come through your lab?” Johnson asked the officials.

“We are totally confident,” answered Dr. Barbara Sampson, the city’s chief medical examiner.

The algorithm’s source code was a closely held secret for years, until a federal judge granted a motion filed by ProPublica to lift a protective order on it in October. We then published the code.

Defense attorneys testified at the hearing, criticizing the medical examiner’s office for what they saw as a dangerous lack of transparency in the development of its DNA tools. Some had joined together to write to the state’s inspector general in September, demanding an investigation into the lab and a review of past cases. The inspector general, Catherine Leahy Scott, has not yet indicated whether she will pursue it.

Meanwhile, the New York State Commission on Forensic Science, which oversees the use of forensic methods in the state’s labs, has discussed the criticisms in executive session meetings. Those sessions are closed to the public, and commission members are prohibited from speaking about them.

After the hearing, Johnson said he was concerned by the discrepancies between the medical examiner’s testimony and that of advocates, and intended to explore them further.

“This is a very, very, very important issue, and we have to ensure that methods that are used are scientifically sound, validated in appropriate ways, transparent to the public and to defense counsels, and ensure greater trust in the justice system,” said Johnson. “And I think that is what, hopefully, we can achieve, through asking more questions -- and potentially thinking about legislation in the future.”
The entire story can be found at:
https://gcn.com/articles/2017/12/18/nyc-bill-algorithmic-discrimination.aspx?m=1
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.

Harold Levy; Publisher; The Charles Smith Blog.