Saturday, August 18, 2018

Technology Series: (Part 11): New York Creates Task Force to Examine Automated Decision Making: "In December 2017, the New York City Council passed the country's first bill to demand accountability in how algorithms are used in city government. That bill mandated that a task force study how city agencies use algorithms. This task force will report on how to make these algorithms understandable to the public. The original proposal by Council Member James Vacca mandated that the source code for the algorithms be made public. Some policy experts warned that openness might create security risks, or provide a way for people to game the public benefits system. Technology companies argued that they might be required to disclose proprietary information. The disclosure requirement was dropped in favor of the task force."



PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence is everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its place deeper and deeper in the nation's courtrooms, in what used to be the exclusive decisions of judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on growing pushback.

Harold Levy: Publisher; The Charles Smith Blog:

------------------------------------------------------------

PASSAGE OF THE DAY: "The investigation of bias and infringement on rights in algorithmic decision making is only beginning. Predictive policing programs in Chicago and New Orleans are being scrutinized for violations of due process and privacy. The public is often unaware of the use of these tools. Even the creators of algorithms often cannot fully explain how the software came to the conclusion that was reached. Several city agencies are starting to use decision systems. The Fire Department uses the Risk-Based Inspection System (RBIS) to predict where fires might start. Part of the RBIS is the FireCast tool that uses data from five city agencies to analyze 60 risk factors to predict which buildings are most vulnerable to fire outbreaks. These buildings are then prioritized for inspections, the data being available to all the city's 49 fire companies. The Police Department uses algorithms for the data obtained from body cameras and facial recognition. Algorithms are also used by the Department of Transportation, the Mayor's Office of Criminal Justice, the Department of Education, and the Department of Social Services. Students are matched with schools. Teacher performance is assessed. Medicare fraud is investigated."

--------------------------------------------------------------------

STORY: "New York Creates Task Force to Examine Automated Decision Making," by reporter Michael Stiefel, published by InfoQ on July 31, 2018.

GIST: "New York City has created an Automated Decision Systems Task Force to demand accountability and transparency in how algorithms are used in city government. The final report of the task force is due in December 2019. This task force is the first in the United States to study this issue. Background: In December 2017, the New York City Council passed the country's first bill to demand accountability in how algorithms are used in city government. That bill mandated that a task force study how city agencies use algorithms. This task force will report on how to make these algorithms understandable to the public. The original proposal by Council Member James Vacca mandated that the source code for the algorithms be made public. Some policy experts warned that openness might create security risks, or provide a way for people to game the public benefits system. Technology companies argued that they might be required to disclose proprietary information. The disclosure requirement was dropped in favor of the task force. What the Law States: An automated decision system is defined as "computerized implementations of algorithms, including those derived from machine learning or other data processing or artificial intelligence techniques, which are used to make or assist in making decisions." The law requires the task force to accomplish at least six goals in its final report. It must identify which city agencies should be subject to review. It must recommend procedures so that people affected by an algorithmic decision can request an explanation of the basis for that decision, as well as how adverse impacts can be addressed.
The task force should also explain the development and implementation of a procedure by which the city may determine whether an automated decision system used by a city agency "disproportionately impacts persons based upon age, race, creed, color, religion, national origin, gender, disability, marital status, partnership status, caregiver status, sexual orientation, alienage or citizenship status". It must also recommend processes for making information about automated decision systems publicly available, so that the public can meaningfully assess how they work and how they are used by the city, and assess the feasibility of archiving automated decisions and the data used. Meeting participants can be limited if participation "would violate local, state or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or result in the disclosure of proprietary information." While the final report should be publicly available, no recommendation is required if it "would violate local, state, or federal law, interfere with a law enforcement investigation or operations, compromise public health or safety, or would result in the disclosure of proprietary information." The task force has no legal authority to compel compliance, or to penalize city agencies that do not follow its recommendations. Background to the Controversy: The investigation of bias and infringement on rights in algorithmic decision making is only beginning. Predictive policing programs in Chicago and New Orleans are being scrutinized for violations of due process and privacy. The public is often unaware of the use of these tools. Even the creators of algorithms often cannot fully explain how the software came to the conclusion that was reached. Several city agencies are starting to use decision systems.
The Fire Department uses the Risk-Based Inspection System (RBIS) to predict where fires might start. Part of the RBIS is the FireCast tool, which uses data from five city agencies to analyze 60 risk factors to predict which buildings are most vulnerable to fire outbreaks. These buildings are then prioritized for inspections, the data being available to all the city's 49 fire companies. The Police Department uses algorithms for the data obtained from body cameras and facial recognition. Algorithms are also used by the Department of Transportation, the Mayor's Office of Criminal Justice, the Department of Education, and the Department of Social Services. Students are matched with schools. Teacher performance is assessed. Medicare fraud is investigated. Problems with the Current Legislation: Julia Powles, a research fellow at NYU's Information Law Institute as well as at Cornell Tech, described two problems with the task force's mission, which resulted from a compromise between the original legislation and what passed. First, if the agencies and contractors do not cooperate, good recommendations will not be made. There is no easily accessible information on how much the City of New York spends on algorithmic services, or how much of the data used is shared with outside contractors. The Mayor's office rejected any requirement for mandated reporting in the legislation, based on the argument that it would reveal proprietary information. If too much leeway is given to claims of corporate secrecy, there will be no algorithmic transparency. The other problem with the current law is that it is unclear how the city can change the behavior of companies that create automated decision-making systems. Frank Pasquale, a law professor at the University of Maryland, argues that the city has more leverage than the vendors.
Members of the Task Force: The members of this task force are not limited to experts in algorithmic design and implementation, but can include people who understand the impact of algorithms on society. It will be composed of individuals from city agencies, academia, law, industry, and nonprofits and think tanks. It is expected that representatives will be chosen from the Department of Social Services, the Police Department, the Department of Transportation, the Mayor's Office of Criminal Justice, the Administration for Children's Services, and the Department of Education. The task force is co-chaired by Emily W. Newman, acting director of the Mayor's Office of Operations, and Brittny Saunders, deputy commissioner for strategic initiatives at the Commission on Human Rights. Impact: New York City could have an impact with algorithms similar to California's with auto emission standards. As one of the largest cities in the world, its use of algorithms may be wide enough that vendors find it easier to meet whatever standards it creates in all jurisdictions. Altering algorithms for different locations, however, might be easier with software than with mechanical devices. This is illustrated by the ability of software to calculate different sales tax regulations in different states, cities, towns, counties, etc. throughout the United States. On the other hand, New York is one of the most valuable sources of demographic data in the world. Restricting their use here might encourage other locations to do the same. In any case, the argument over the fairness of algorithmic decisions, and the need to use them, is not going away."
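The sales-tax point above can be sketched in code. The following is a minimal illustration, not taken from the article: a single routine applies whichever rate is configured for a given state and city, showing how one codebase can be adapted per jurisdiction. The table, function name, and rates are hypothetical stand-ins for illustration, not authoritative tax figures.

```python
# Hypothetical combined sales-tax rates keyed by (state, city).
# A None city acts as a statewide fallback. Rates are illustrative only.
SALES_TAX_RATES = {
    ("NY", "New York City"): 0.08875,
    ("CA", "Los Angeles"): 0.095,
    ("OR", None): 0.0,  # e.g. a state with no sales tax
}

def tax_for(subtotal, state, city=None):
    """Apply the most specific configured rate for the jurisdiction."""
    rate = SALES_TAX_RATES.get((state, city))
    if rate is None:
        # Fall back to the statewide rate; default to 0.0 if unconfigured.
        rate = SALES_TAX_RATES.get((state, None), 0.0)
    return round(subtotal * rate, 2)

print(tax_for(200.00, "NY", "New York City"))  # 17.75 with the assumed rate
```

The point is the design, not the numbers: because the jurisdictional rules live in configuration rather than in the logic, adapting the system to a new locality means changing data, not rewriting the algorithm, which is what makes per-jurisdiction software standards more tractable than per-jurisdiction mechanical ones.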

The entire story can be read at:


 PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. 

Harold Levy: Publisher; The Charles Smith Blog;

---------------------------------------------------------------------