PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." Algorithms are also reaching deeper and deeper into the nation's courtrooms, informing decisions - such as bail and even the sentence to be imposed - that used to belong exclusively to judges. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have largely been noting the appearance of these technologies. Now, at last, I can report on the realization that some of them may be two-edged swords - and on the growing pushback.
Harold Levy: Publisher; The Charles Smith Blog;
------------------------------------------------------------
PASSAGE OF THE DAY: "Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that." Facial recognition technology’s difficulty detecting darker skin tones is a well-established problem. In February, MIT Media Lab’s Joy Buolamwini and Microsoft’s Timnit Gebru published findings that facial recognition software from IBM, Microsoft, and Face++ has a much harder time identifying gender in people of color than in white people. In a June evaluation of Amazon Rekognition, Buolamwini and Inioluwa Raji of the Algorithmic Justice League found similar built-in bias. Rekognition even managed to get Oprah wrong. “Given what we know about the biased history and present of policing, the concerning performance metrics of facial analysis technology in real-world pilots, and Rekognition’s gender and skin-type accuracy differences,” Buolamwini wrote in a recent letter to Amazon CEO Jeff Bezos, “I join the chorus of dissent in calling Amazon to stop equipping law enforcement with facial analysis technology.”
STORY: "
GIST: "Amazon touts its Rekognition facial recognition system as “simple and easy to use,” encouraging customers to “detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases.” And yet, in a study released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets Rekognition to law enforcement agencies across the US, that’s simply not good enough. The ACLU study also illustrated the racial bias that plagues facial recognition today. "Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that." Facial recognition technology’s difficulty detecting darker skin tones is a well-established problem. In February, MIT Media Lab’s Joy Buolamwini and Microsoft’s Timnit Gebru published findings that facial recognition software from IBM, Microsoft, and Face++ have a much harder time identifying gender in people of color than in white people. In a June evaluation of Amazon Rekognition, Buolamwini and Inioluwa Raji of the Algorithmic Justice League found similar built-in bias. Rekognition managed to even get Oprah wrong. “Given what we know about the biased history and present of policing, the concerning performance metrics of facial analysis technology in real-world pilots, and Rekognition’s gender and skin-type accuracy differences,” Buolamwini wrote in a recent letter to Amazon CEO Jeff Bezos, “I join the chorus of dissent in calling Amazon to stop equipping law enforcement with facial analysis technology.”
GIST: "Amazon touts its Rekognition facial recognition system as “simple and easy to use,” encouraging customers to “detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases.” And yet, in a study released Thursday by the American Civil Liberties Union, the technology managed to confuse photos of 28 members of Congress with publicly available mug shots. Given that Amazon actively markets Rekognition to law enforcement agencies across the US, that’s simply not good enough. The ACLU study also illustrated the racial bias that plagues facial recognition today. "Nearly 40 percent of Rekognition’s false matches in our test were of people of color, even though they make up only 20 percent of Congress," wrote ACLU attorney Jacob Snow. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that." Facial recognition technology’s difficulty detecting darker skin tones is a well-established problem. In February, MIT Media Lab’s Joy Buolamwini and Microsoft’s Timnit Gebru published findings that facial recognition software from IBM, Microsoft, and Face++ have a much harder time identifying gender in people of color than in white people. In a June evaluation of Amazon Rekognition, Buolamwini and Inioluwa Raji of the Algorithmic Justice League found similar built-in bias. Rekognition managed to even get Oprah wrong. “Given what we know about the biased history and present of policing, the concerning performance metrics of facial analysis technology in real-world pilots, and Rekognition’s gender and skin-type accuracy differences,” Buolamwini wrote in a recent letter to Amazon CEO Jeff Bezos, “I join the chorus of dissent in calling Amazon to stop equipping law enforcement with facial analysis technology.”
'We wouldn’t find this acceptable in any other setting. Why should we find it acceptable here?'
Alvaro Bedoya, Center on Privacy and Technology;
Yet Amazon Rekognition is already in active use in Oregon’s Washington County. And the Orlando, Florida police department recently resumed a pilot program to test Rekognition’s efficacy, although the city says that for now, "no images of the public will be used for any testing—only images of Orlando police officers who have volunteered to participate in the test pilot will be used." Those are just the clients that are public; Amazon declined to comment on the full scope of law enforcement’s use of Rekognition.
For privacy advocates, though, any amount is too much, especially given the system’s demonstrated bias. “Imagine a speed camera that wrongly said that black drivers were speeding at higher rates than white drivers. Then imagine that law enforcement knows about this, and everyone else knows about this, and they just keep using it,” says Alvaro Bedoya, executive director of Georgetown University’s Center on Privacy and Technology. “We wouldn’t find this acceptable in any other setting. Why should we find it acceptable here?”
Amazon takes issue with the parameters of the study, noting that the ACLU used an 80 percent confidence threshold; that’s the likelihood that Rekognition found a match, which you can adjust according to your desired level of accuracy. “While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty,” the company said in a statement. “When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95 percent or higher.”
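[Publisher's technical aside: the threshold in dispute is simply a numeric parameter on Rekognition's face-matching API calls. The minimal Python sketch below - my own illustration, not Amazon's or the ACLU's code - shows the same comparison run at the 80 percent default and at the 95 percent level Amazon says it recommends for law enforcement; the image file names are hypothetical placeholders.]

```python
# Publisher's illustration only: the same face comparison run at Rekognition's
# 80 percent default threshold and at the 95 percent level Amazon says it
# recommends for law enforcement. File names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def load_bytes(path):
    with open(path, "rb") as f:
        return f.read()

portrait = load_bytes("congress_portrait.jpg")  # hypothetical source image
mugshot = load_bytes("public_mugshot.jpg")      # hypothetical target image

for threshold in (80, 95):
    resp = rekognition.compare_faces(
        SourceImage={"Bytes": portrait},
        TargetImage={"Bytes": mugshot},
        SimilarityThreshold=threshold,
    )
    # Candidate matches scoring below SimilarityThreshold are simply not
    # returned, so raising the threshold trades fewer false matches for the
    # risk of missing genuine ones.
    print(threshold, [round(m["Similarity"], 1) for m in resp["FaceMatches"]])
```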
While Amazon says it works closely with its partners, it’s unclear what form that guidance takes, or whether law enforcement follows it. Ultimately, the onus is on the customers—including law enforcement—to make the adjustment. An Orlando Police Department spokesperson did not know how the city had calibrated Rekognition for its pilot program.
The ACLU counters that 80 percent is Rekognition’s default setting. And UC Berkeley computer scientist Joshua Kroll, who independently verified the ACLU’s findings, notes that if anything, the professionally photographed, face-forward congressional portraits used in the study are a softball compared to what Rekognition would encounter in the real world. “As far as I can tell, this is the easiest possible case for this technology to work,” Kroll says. “While we haven’t tested it, I would naturally anticipate that it would perform worse in the field environment, where you’re not seeing people’s faces straight on, you might not have perfect lighting, you might have some occlusion, maybe people are wearing things or carrying things that get in the way of their faces.”
Amazon also downplays the potential implications of facial recognition errors. “In real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgement,” the company’s statement reads. But that elides the very real consequences that could be felt by those who are wrongly identified. “At a minimum, those people are going to be investigated. Point me to a person that likes to be investigated by law enforcement,” Bedoya says. “This idea that there’s no cost to misidentifications just defies logic.”
'What we’re trying to avoid here is mass surveillance.'
Jeramie Scott, EPIC;
So, too, does the notion that a human backstop provides an adequate check on the system. “Often with technology, people start to rely on it too much, as if it’s infallible,” says Jeramie Scott, director of the Electronic Privacy Information Center’s Domestic Surveillance Project. In 2009, for instance, San Francisco police handcuffed a woman and held her at gunpoint after a license-plate reader misidentified her car. All they had to do to avoid the confrontation was to look at the plate themselves, or notice that the make, model, and color didn’t match. Instead, they trusted the machine.
Even if facial recognition technology worked perfectly, putting it in the hands of law enforcement would still raise concerns. “Facial recognition destroys the ability to remain anonymous. It increases the ability of law enforcement to surveil individuals not suspected of crimes. It can chill First Amendment-protected rights and activities,” Scott says. “What we’re trying to avoid here is mass surveillance.”
While the ACLU study covers well-trod ground in terms of facial recognition’s faults, it may have a better chance at making real impact. “The most powerful aspect of this is that it makes it personal for members of Congress,” says Bedoya. Members of the Congressional Black Caucus had previously written a letter to Amazon expressing related concerns, but the ACLU appears to have gotten the attention of several additional lawmakers.
The trick, though, will be turning that concern into action. Privacy advocates say that at a minimum, law enforcement’s use of facial recognition technology should be heavily restricted until its racial bias has been corrected and its accuracy assured. And even then, they argue, its scope needs to be limited, and clearly defined. Until that happens, it’s time not to pump the brakes but to slam down on them with both feet. “A technology that’s proven to vary significantly across people based on the color of their skin is unacceptable in 21st-century policing,” says Bedoya.
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith.
Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html
Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.
---------------------------------------------------------------------
- The entire story can be read at the link below:
- https://www.wired.com/story/amazon-facial-recognition-congress-bias-law-enforcement?mbid=nl_072818_daily_list1_p3&CNDID=23031411
Harold Levy: Publisher; The Charles Smith Blog;
---------------------------------------------------------------------