PUBLISHER'S
NOTE: Artificial intelligence, once the stuff of science fiction, has
become all too real in our modern society - especially in the American
criminal justice system. As the ACLU's Lee Rowland puts it: "Today,
artificial intelligence. It's everywhere — in our homes, in our cars,
our offices, and of course online. So maybe it should come as no
surprise that government decisions are also being outsourced to computer
code. In one Pennsylvania county, for example, child and family
services uses digital tools to assess the likelihood that a child is at
risk of abuse. Los Angeles contracts with the data giant Palantir to
engage in predictive policing, in which algorithms identify residents
who might commit future crimes. Local police departments are buying
Amazon's facial recognition tool,
which can automatically identify people as they go about their lives in
public." The algorithm is finding its place deeper and deeper in the
nation's courtrooms on what used to be exclusive decisions of judges
such as bail and even the sentence to be imposed. I am pleased to see
that a dialogue has begun on the effect that increasing use of these
logarithms in our criminal justice systems is having on our society and
on the quality of decision-making inside courtrooms. As Lee Rowland asks
about this brave new world, "What does all this mean for our civil
liberties and how do we exercise oversight of an algorithm?" In view of
the importance of these issues - and the increasing use of artificial
intelligence by countries for surveillance of their citizens - it's
time for yet another technology series on The Charles Smith Blog
focusing on the impact of science on society and criminal justice. Up
to now I have mostly been noting the appearance of these technologies. Now,
at last, I can report on the growing realization that some of them may be
two-edged swords - and on the pushback building against them.
Harold Levy: Publisher, The Charles Smith Blog.
------------------------------------------------------------
PASSAGE OF THE DAY: "Predpol uses historical data from both property and violent crime
reports to identify which city blocks are most likely to be the site of
crimes. LASER plugs data from field interview cards collected by
officers and
information from arrest reports into a program called Palantir. The
program then identifies “hotspots” where crime is likely to occur.
Palantir mines government and private company databases to build
extensive profiles of individuals then creates a list of people police
have identified as “persons of interest.” The list is not available to
the public."