Saturday, August 11, 2018

Technology Series (Part 4): Pushback: "Civil Rights Groups Call for Reforms on Use of Algorithms to Determine Bail Risk," 'gt' (Government Technology) reports. (Reporter Dawn Kawamoto)..."Civil rights groups that include the American Civil Liberties Union (ACLU) and the National Association for the Advancement of Colored People (NAACP) Legal Defense and Educational Fund outlined their concerns July 30 as more state and local governments lean on the technology to determine pretrial flight and criminal recidivism risks. The hope is that systematic changes will remove the race, gender and socio-economic biases from the pretrial hearings when bail is set, and ultimately lead to reductions in pretrial detention, jail overcrowding and costs. The civil rights groups warn the technology is only as good as the data that is entered, and that creates a problem."



PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its place deeper and deeper in the nation's courtrooms, taking over what used to be the exclusive decisions of judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on the growing pushback.
Harold Levy: Publisher; The Charles Smith Blog;

------------------------------------------------------------

STORY:"Civil Rights Groups Call for Reforms on Use of Algorithms to Determine Bail Risk," by Dawn Kawamoto, published by 'gt' (Government Technology)  on August 2, 2018. ( Dawn Kawamoto is a technology and business journalist, whose work has appeared in CNET's News.com, Dark Reading, TheStreet.com, AOL's DailyFinance and The Motley Fool.)

SUB-HEADING: Over 100 national, state and local organizations, from the ACLU to the NAACP Legal Defense and Educational Fund, call for reforming the use of algorithms in risk-based bail assessments.

GIST: "A big coalition of civil rights groups against most pretrial incarceration — and the money bail system that it revolves around — are calling for jurisdictions across the country to drastically change how they use risk assessment algorithms in determining which people to lock up. Civil rights groups that include the American Civil Liberties Union (ACLU) and the National Association for the Advancement of Colored People (NAACP) Legal Defense and Educational Fund outlined their concerns July 30 as more state and local governments lean on the technology to determine pretrial flight and criminal recidivism risks.  The hope is that systematic changes will remove the race, gender and socio-economic biases from the pretrial hearings when bail is set, and ultimately lead to reductions in pretrial detention, jail overcrowding and costs. The civil rights groups warn the technology is only as good as the data that is entered, and that creates a problem. Since a high number of arrests involve people of color, the data that is inputted has the increased potential to be tainted with biases, said Vanita Gupta, president and CEO of The Leadership Conference on Civil and Human Rights, during a press conference call. For example, in 2005, African-Americans comprised 14 percent of drug users, but 33.9 percent were arrested for a drug offense, according to a report from the American Bar Association. As a result, the groups argue, a disproportionate number of minority individuals could be forced to pay unreasonable bail amounts. "Someone who is accused of a crime should not be locked up just because they cannot afford to pay bail," said Monique Dixon, deputy policy director of the NAACP Legal Defense and Educational Fund, during the coalition's press conference.  In absence of an outright ban, the 115 organizations suggest these algorithms need to be retooled and held to strict use policies. They include:

  • Risk assessment tools should be developed to reduce racial disparities in the justice system.
  • Pretrial assessment tools should never recommend detention. If the tool does not recommend release, it should recommend a pretrial release hearing, while adhering to safeguards.
  • Detention ahead of a trial and conditions of supervision must also be avoided, except through an "individualized, adversarial hearing."
  • Tools need transparency and independent validation, and should be open to challenge by the defendant's legal counsel.
  • The tool should clearly communicate the likelihood of success, not failure, upon release.
  • Tools should be developed with community input and revalidated by data scientists.
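
The coalition's core warning, that a risk tool is "only as good as the data that is entered," is easy to demonstrate. The short Python sketch below is a toy simulation of my own, not any vendor's actual model: every number in it (the group labels, the shared 20 percent reoffense rate, the unequal arrest rates, and the naive frequency-based "score") is an illustrative assumption. It shows how two groups with identical underlying behaviour end up with very different data-driven risk scores when one group is simply arrested more often:

```python
# Toy simulation (illustrative assumptions only): a "risk score" learned from
# arrest records inherits bias in who gets arrested, not in who reoffends.
import random

random.seed(42)

TRUE_REOFFENSE_RATE = 0.20             # identical for both groups by design
ARREST_RATE = {"A": 0.10, "B": 0.30}   # group B is policed 3x more heavily

def simulate_group(group, n=100_000):
    """Return one boolean per person: was a reoffense recorded as an arrest?"""
    recorded = []
    for _ in range(n):
        reoffended = random.random() < TRUE_REOFFENSE_RATE
        # A reoffense only enters the training data if it leads to an arrest.
        recorded.append(reoffended and random.random() < ARREST_RATE[group])
    return recorded

for group in ("A", "B"):
    data = simulate_group(group)
    naive_score = sum(data) / len(data)  # observed arrest frequency = "risk"
    print(f"Group {group}: true reoffense rate {TRUE_REOFFENSE_RATE:.0%}, "
          f"score learned from arrest data {naive_score:.1%}")
```

Both groups reoffend at the same 20 percent rate by construction, yet the data-driven score rates group B roughly three times riskier, because arrests, not behaviour, are what the data measures. That is the dynamic the drug-arrest statistic above illustrates.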

Currently, some risk assessment tool providers do not allow access to their proprietary software for inspection, while others, such as the nonprofit Laura and John Arnold Foundation (LJAF) with its Public Safety Assessment, do. The Arnold Foundation disagrees with some of the coalition's characterizations.
"(The) description of risk assessments as tools that 'can defer the responsibility of determining who to detain pretrial and who to release' misconstrues the role of risk assessments," according to a statement from the foundation. "Risk assessments, such as the Public Safety Assessment (PSA) developed by LJAF, do not make pretrial release decisions or replace a judge's discretion. They provide judges and other court officers with information they can choose to consider — or not — when making release decisions." The foundation noted it believes early research of the technology shows it can help reduce pretrial detention, address racial disparities and increase public safety. Demand for Risk Assessment Algorithms: COMPAS, a risk assessment tool from the company Equivant which was introduced in 1998, currently operates in 35 states. And LJAF's PSA, launched in 2013, is currently used in 18 states, including a statewide presence in Kentucky, Arizona and New Jersey. "In 2015, we received approximately 100 inquiries to learn more about the PSA. In 2016, we received approximately 200 inquiries. In 2017, we received approximately 250 inquiries," David Hebert, LJAF spokesman said. "The most common issues are questions of research, readiness and implementation, which we have addressed in our new website resources." In California, San Francisco, Santa Cruz County and Tulare County have adopted PSA. And the Golden State also currently has Senate Bill 10 working its way through the Legislature, which calls for allowing local governments to use risk assessment tools when setting bail. The bill is in the negotiations stage, so it has yet to be seen if it will ultimately include the use of algorithms.  The bill, which is currently in the Assembly Appropriations Committee, states in its most recent version that the tool is required to be "equally accurate across all racial groups, ethnic groups and genders. The validation study shall include testing for predictive bias and disparate results by race, ethnicity and gender. The tool shall be adjusted to ensure accuracy and to minimize disparate results." The deadline for passing the bill in the Appropriations Committee is Aug. 17. Three of the co-sponsors of SB 10 — the California ACLU, the Ella Baker Center for Human Rights, and Silicon Valley De-Bug — were also among the coalition of 115 that signed the statement. "Several of the cosigners of [Monday's] statement are sponsors of SB 10 and have been dedicated partners in our bail reform efforts," Sen. Robert Hertzberg, who co-authored SB 10, told Government Technology. "Our negotiations are ongoing, and we rely on the input of stakeholders to ensure that when we enact bail reform, it works for all Californians." Natasha Minsker, director of the ACLU of CA Center for Advocacy and Policy, agrees. "Yesterday, the national ACLU signed onto a statement along with other civil rights, digital justice and community-based organizations, which included a call for important policy reforms to accompany any use of pretrial risk assessment tools. SB 10 is consistent with this call and includes the policy reforms identified," Minsker said. "California urgently needs bail reform. We need to replace the current system with one that prioritizes justice and public safety, not industry profits. We remain committed to passing SB 10 and making 2018 the year of bail reform in California." 
Meanwhile, the 115 groups in the coalition plan to disseminate their statement among their members, and to share it with prosecutors and decision-makers, Scott Roberts, senior campaign director for Color of Change, said in the press conference. "We will work with our grassroots partners and build out our teams," he said."
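
What the validation study SB 10 describes might actually compute can be sketched in a few lines. The example below is hypothetical: the records are fabricated, and the bill's text does not prescribe a specific metric, so the per-group "high risk" rate and false-positive rate used here are just one common way to test for the "disparate results" the bill names:

```python
# Hypothetical per-group validation check (fabricated records): compare how
# often a tool flags people "high risk" and how often it is wrong, by group.

# Each record: (flagged_high_risk, actually_reoffended, group)
records = [
    (True,  True,  "A"), (True,  False, "A"), (False, False, "A"),
    (False, False, "A"), (True,  True,  "B"), (True,  False, "B"),
    (True,  False, "B"), (False, False, "B"),
]

def rates_by_group(records):
    stats = {}
    for flagged, reoffended, group in records:
        g = stats.setdefault(group, {"fp": 0, "negatives": 0, "flagged": 0, "n": 0})
        g["n"] += 1
        g["flagged"] += flagged
        if not reoffended:          # false-positive rate is measured among
            g["negatives"] += 1     # people who did not in fact reoffend
            g["fp"] += flagged
    return {
        group: {
            "high_risk_rate": g["flagged"] / g["n"],
            "false_positive_rate": g["fp"] / g["negatives"],
        }
        for group, g in stats.items()
    }

for group, result in sorted(rates_by_group(records).items()):
    print(group, result)
```

On these fabricated records the tool flags 50 percent of group A and 75 percent of group B as high risk, and it is wrong about non-reoffenders twice as often for group B (a false-positive rate of 2/3 versus 1/3). A gap like that, on real outcome data, is what the bill's required "testing for predictive bias and disparate results" would have to surface, and what the mandated adjustment would have to shrink.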

The entire commentary can be read at the link below:



PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.

Harold Levy: Publisher; The Charles Smith Blog;

---------------------------------------------------------------------