PUBLISHER'S NOTE: Technology series: This Blog has been increasingly drawn into the world of technology in terms of its ability to cause wrongful prosecutions, to provide the police with excessive powers, and to impact on individual privacy and dignity. The series will also make clear that one should not evaluate technology such as artificial intelligence - and the algorithms it is based on - solely within the confines of the criminal justice system. Artificial intelligence is quietly and quickly spreading into many aspects of our lives. We must be aware of the total impact on us as individuals and on our society.
Harold Levy: Publisher; The Charles Smith Blog.
-----------------------------------------------------------
PASSAGE OF THE DAY: "Amazon publicly introduced Rekognition in November 2016. Marketers could use the image recognition software, developed by the company’s scientists to analyze billions of images and videos daily, to recognize celebrities in their videos, while owners of dating apps could use the program to identify unwanted suggestive or explicit content, according to the company’s website. But the vast expansion of facial recognition and other image-scanning technology has been highly controversial and given rise to privacy concerns. Facial recognition can allow strangers to identify people who don’t wish to be identified, such as shoppers in stores, individuals in a crowd, or people who appear in photos that get posted on social media. Privacy advocates point out that in an era in which everyone has a camera on their smartphone, cities have put cameras on traffic stops and other public infrastructure, and police are wearing body cameras, the opportunity to have one’s photograph taken, identified, analyzed, and stored in perpetuity has grown immensely. “Once powerful surveillance systems like these are built and deployed, the harm can’t be undone. We’re talking about a technology that will supercharge surveillance in our communities,” said Nicole Ozer, Technology and Civil Liberties Director for the ACLU of Northern California. She said the technology could be used “to track protesters, target immigrants, and spy on entire neighborhoods."
STORY: "Amazon is selling facial recognition to law enforcement — for a fistful of dollars," by Elizabeth Dwoskin, published by The Washington Post on May 22, 2018. (Elizabeth Dwoskin has been reporting from Silicon Valley since 2013. She was the Wall Street Journal's first full-time beat reporter covering big data and artificial intelligence. In 2016, she joined The Washington Post as Silicon Valley correspondent, becoming the paper's eyes and ears in the region and in the wider world of tech.)
GIST: "Amazon has been essentially giving away facial recognition tools to law enforcement agencies in Oregon and Orlando, according to documents obtained by American Civil Liberties Union of Northern California, paving the way for a rollout of technology that is causing concern among civil rights groups. Amazon is providing the technology, known as Rekognition, as well as consulting services, according to the documents, which the ACLU obtained through a Freedom of Information Act request. A coalition of civil rights groups, in a letter released Tuesday, called on Amazon to stop selling the program to law enforcement because it could lead to the expansion of surveillance of vulnerable communities. “We demand that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country,” the groups wrote in the letter. Amazon spokeswoman Nina Lindsey did not directly address the concerns of civil rights groups. “Amazon requires that customers comply with the law and be responsible when they use AWS services,” she said, referring to Amazon Web Services, the company’s cloud software division that houses the facial recognition program. “When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services.” She said that the technology has many useful purposes, including finding abducted people. Amusement parks have used it to locate lost children. During the royal wedding this past weekend, clients used Rekognition to identify wedding attendees, she said. (Amazon founder Jeffrey P. Bezos is the owner of The Washington Post.) The details about Amazon’s program illustrate the proliferation of cutting-edge technologies deep into American society — often without public vetting or debate. 
Axon, the maker of Taser electroshock weapons and the wearable body cameras for police, has voiced interest in pursuing face recognition for its body-worn cameras, prompting a similar backlash from civil rights groups. Hundreds of Google employees protested last month to demand that the company stop providing artificial intelligence to the Pentagon to help analyze drone footage.
The technology works through pattern recognition: Customers put known images – of child pornography or of celebrities, for example – into a database, and the software uses artificial intelligence to scan new images for a match with those already stored. The more images that are fed into the system, the more accurate the software becomes.

Amazon publicly introduced Rekognition in November 2016. Marketers could use the image recognition software, developed by the company’s scientists to analyze billions of images and videos daily, to recognize celebrities in their videos, while owners of dating apps could use the program to identify unwanted suggestive or explicit content, according to the company’s website. But the vast expansion of facial recognition and other image-scanning technology has been highly controversial and given rise to privacy concerns. Facial recognition can allow strangers to identify people who don’t wish to be identified, such as shoppers in stores, individuals in a crowd, or people who appear in photos that get posted on social media.
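The story does not describe Rekognition's internals, but the match-against-a-database idea it outlines can be sketched conceptually. The sketch below is illustrative only - the `FaceDatabase` class, the labels, and the toy vectors are invented for this example, and a real system would compute face "embeddings" with a trained neural network rather than use hand-written numbers:

```python
# Illustrative sketch of the pattern-matching idea described above:
# known images are reduced to numeric feature vectors, stored under a
# label, and a new image's vector is compared against every stored one.
# A match is reported when similarity exceeds a chosen threshold.
import math

def cosine_similarity(a, b):
    """Similarity of two feature vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class FaceDatabase:
    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = {}  # label -> stored feature vector

    def index(self, label, vector):
        """Store a known image's feature vector under a label (e.g. a mug-shot ID)."""
        self.entries[label] = vector

    def search(self, vector):
        """Return (label, score) pairs above the threshold, best match first."""
        matches = [(label, cosine_similarity(vector, stored))
                   for label, stored in self.entries.items()]
        return sorted([m for m in matches if m[1] >= self.threshold],
                      key=lambda m: m[1], reverse=True)

# Toy usage: these vectors stand in for embeddings a real model would compute.
db = FaceDatabase(threshold=0.9)
db.index("suspect-001", [0.9, 0.1, 0.3])
db.index("suspect-002", [0.1, 0.8, 0.5])
print(db.search([0.88, 0.12, 0.31]))  # best match should be suspect-001
```

This also shows why "the more images that are fed into the system, the more accurate the software becomes": a larger, more varied set of stored vectors gives the matcher more reference points, though - as the story notes later - matches are probabilistic and can be wrong.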
Privacy advocates point out that in an era in which everyone has a camera on their smartphone, cities have put cameras on traffic stops and other public infrastructure, and police are wearing body cameras, the opportunity to have one’s photograph taken, identified, analyzed, and stored in perpetuity has grown immensely. “Once powerful surveillance systems like these are built and deployed, the harm can’t be undone. We’re talking about a technology that will supercharge surveillance in our communities,” said Nicole Ozer, Technology and Civil Liberties Director for the ACLU of Northern California. She said the technology could be used “to track protesters, target immigrants, and spy on entire neighborhoods.”

The documents provide a detailed look at how Amazon is marketing Rekognition. It can identify up to 100 people in a crowd, the documents said. The sheriff’s office of Washington County, Ore., built a database of 300,000 mug shots of suspected criminals that officers could have Rekognition scan against footage of potential suspects in real time. The footage could come from police body cameras and public and private cameras. The county pays Amazon between $6 and $12 a month for the service, a county spokesman said. According to the documents, Amazon asked the county to tout its experience with Rekognition to other public sector customers, including a manufacturer of body cameras.

Deputy Jeff Talbot, public information officer for the Washington County Sheriff’s Office, said the program was not operating in the shadows and had been the subject of several local news stories. He pointed out that jail booking photos are already public and that the software simply allows officers to scan them instantaneously and in real time, and compare them against footage of actual suspects, which is a valuable contribution to public safety. “Our goal is to inform the public about the work we’re doing to solve crimes. It is not mass surveillance or untargeted surveillance.” He could not say how many crimes the program had helped solve and added that the software wasn’t always accurate. But he said officers were trained not to rely exclusively on the software to make decisions, and that it was just an additional tool in the officer’s tool kit. For the cheap price Amazon was offering, he said, it made sense to test out the service.

Zahra Billoo, executive director of the Council on American-Islamic Relations San Francisco Bay Area office, one of the groups that signed the letter, said many people who are booked into jail are never charged with a crime or are later proved innocent. She said she worried that people’s civil rights are violated when law enforcement keeps their images in a database even after they are proved innocent or were never charged. She said Amazon was contributing to these violations by making it easier to scan people’s faces, repeatedly subjecting them to surveillance. In addition to the ACLU, the coalition of about 40 groups included Color of Change, Human Rights Watch, Muslim Advocates and the Electronic Frontier Foundation.

Amazon is one of many companies selling artificial intelligence tools such as facial recognition and image-scanning to business clients. Microsoft offers a rival service, called Facial Recognition API. A crop of start-ups market the ability to scan the emotions on people’s faces as they walk in and out of stores. Such technology has been touted as a way to prevent shoplifting."
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog.
The entire story can be read at the link below: