PASSAGE OF THE DAY: "In a June meeting with Immigration and Customs Enforcement (ICE), Amazon Web Services pitched the tech as part of a system of mass surveillance that could identify and track unauthorized immigrants, their families, and their friends, according to records obtained by the Project on Government Oversight. Once ICE develops the infrastructure for video surveillance and real-time biometric monitoring, other agencies, such as the FBI, the Drug Enforcement Administration, and local police, will no doubt argue that they should be able to access mass surveillance technologies too. Amazon boasts the tool is already helping with everything from minimizing package theft to tracking down sex traffickers, and the company points to its terms of use, which prohibit illegal violations of privacy, to assuage fears."
STORY: "Can Algorithms Run Things Better Than Humans? Welcome to the rise of the Algocracy," by Ronald Bailey, published by Reason in the January 2019 issue. (Ronald Bailey is science correspondent for Reason magazine and the author, most recently, of The End of Doom (2015).)
GIST: Police in Orlando, Florida, are using a powerful new tool to identify and track folks in real time. Video streams from four cameras located at police headquarters, three in the city's downtown area, and one outside of a recreation center will be processed through Amazon's Rekognition technology, which has been developed through deep learning algorithms trained using millions of images to identify and sort faces. The tool is astoundingly cheap: Orlando Police spent only $30.99 to process 30,989 images, according to the American Civil Liberties Union (ACLU). For now the test involves only police officers who have volunteered for the trial. But the company has big plans for the program.

In a June meeting with Immigration and Customs Enforcement (ICE), Amazon Web Services pitched the tech as part of a system of mass surveillance that could identify and track unauthorized immigrants, their families, and their friends, according to records obtained by the Project on Government Oversight. Once ICE develops the infrastructure for video surveillance and real-time biometric monitoring, other agencies, such as the FBI, the Drug Enforcement Administration, and local police, will no doubt argue that they should be able to access mass surveillance technologies too. Amazon boasts the tool is already helping with everything from minimizing package theft to tracking down sex traffickers, and the company points to its terms of use, which prohibit illegal violations of privacy, to assuage fears.

As impressive as Rekognition is, it's not perfect. The same ACLU report found that a test of the technology erroneously matched 28 members of Congress with criminal mugshots. Being falsely identified as a suspect by facial recognition technology, prompting police to detain you on your stroll down a street while minding your own business, would annoy anybody. Being mistakenly identified as a felon who may be armed would put you in danger of aggressive, perhaps fatal, police intervention.

Are you willing to trust your life and liberty to emerging algorithmic governance technologies such as Rekognition? The activities and motives of a police officer or bureaucrat can be scrutinized and understood by citizens. But decisions made by ever-more-complex algorithms trained on vast data sets likely will become increasingly opaque and thus insulated from public oversight. Even if the outcomes seem fair and beneficial, will people really accept important decisions about their lives being made this way—and, as important, should they?

(These are just a few paragraphs of a lengthy article. The entire article (link below) is well worth the read. HL)
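(A technical aside for readers who want to see what the matching step actually looks like: below is a minimal, hypothetical sketch in Python, using Amazon's public boto3 SDK, of the Rekognition CompareFaces operation that underlies tests like the ACLU's. The file names are placeholders I have invented for illustration, and the 80 percent similarity threshold shown is Rekognition's documented default — reportedly the setting under which the ACLU matched 28 members of Congress to mugshots; Amazon has said law-enforcement uses should set it far higher. This is a sketch under those assumptions, not Orlando's or the ACLU's actual code. HL)

```python
# Hypothetical sketch of the AWS Rekognition "CompareFaces" call described
# above, using Amazon's boto3 SDK for Python (requires AWS credentials).
# File names are placeholders; the 80% SimilarityThreshold is Rekognition's
# documented default, reportedly the setting used in the ACLU's test.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

def compare_faces(source_path, target_path, threshold=80.0):
    """Return (similarity, bounding box) pairs for faces in the target
    image that match the face in the source image above `threshold`."""
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = client.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=threshold,
        )
    return [
        (match["Similarity"], match["Face"]["BoundingBox"])
        for match in response["FaceMatches"]
    ]

# At the default 80% threshold, loose matches like the ACLU's congressional
# false positives can slip through; raising the threshold trades missed
# matches for fewer false accusations.
for similarity, box in compare_faces("street_photo.jpg", "mugshot.jpg"):
    print(f"Possible match: {similarity:.1f}% similarity at {box}")
```

The single threshold parameter is the whole policy debate in miniature: set it low and innocent people are flagged; set it high and the tool misses real matches. Nothing in the API forces an agency to choose one value over another.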
The entire story can be read at:
https://reason.com/archives/2018/12/16/can-algorithms-run-things-bett
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher; The Charles Smith Blog;