Sunday, March 1, 2020

Technology: Super-powered facial recognition tool: From my own backyard: Toronto police "used it for months - before Chief Mark Saunders became aware of its use and ordered it stopped," the Toronto Star (reporter Kate Allen) reports. (But the Ontario Provincial Police and the RCMP, in their utter states of transparency, have refused to comment. Kinda makes one wonder! HL)... "Toronto police officers used a controversial facial recognition technology for months, according to a spokesperson, before chief Mark Saunders became aware of its use and ordered it stopped. Clearview AI, a U.S. company that provides artificial intelligence-powered facial recognition tools to law enforcement agencies, has been called “reckless,” “invasive,” and “dystopian” by critics. It identifies people by scanning for matches in its database of billions of images scraped from the open web, including social media sites, providing vastly greater search powers than other known facial recognition tools." (Of interest to this Blog - not just because of the privacy issues - but because of the risk of misidentification on a large scale, and the harmful consequences that might be posed to innocent people. HL.)

 

QUOTE OF THE DAY: "This company allegedly has developed its entire facial recognition system by illegitimately if not illegally scraping images from the public internet,” says Chris Parsons, a senior research associate at the University of Toronto’s Citizen Lab. If Toronto police or any other Canadian law enforcement agency did that directly, “it would be radically afoul of Canadian privacy legislation. Using services produced by companies predicated on violations of Canadian law seems like an inappropriate technology to adopt,” says Parsons."

 ----------------------------------------------------------------

PASSAGE OF THE DAY: "According to The New York Times, more than 600 law enforcement agencies use Clearview AI. Earlier this week, the Ontario Provincial Police and the Mounties both declined to answer the Star’s questions about whether they used Clearview AI. “The OPP has used facial recognition technology for various types of investigations,” OPP spokesperson Carolle Dionne said. “As its use is operational and specific to investigative technique we will not specify further.” “Generally, the RCMP does not comment on specific investigative tools or techniques,” said RCMP spokesperson Catherine Fortin. “However, we continue to monitor new and evolving technology.” In the last month, YouTube, Facebook, Twitter, and LinkedIn have all demanded that the company stop using data scraped from their websites, according to media reports."

--------------------------------------------------------------------

STORY: "Toronto police chief halts use of controversial facial recognition tool," by reporter Kate Allen, published on February 13, 2020. Kate Allen is a Toronto-based reporter covering science and technology.

GIST: "Toronto police officers used a controversial facial recognition technology for months, according to a spokesperson, before chief Mark Saunders became aware of its use and ordered it stopped. Clearview AI, a U.S. company that provides artificial intelligence-powered facial recognition tools to law enforcement agencies, has been called “reckless,” “invasive,” and “dystopian” by critics. It identifies people by scanning for matches in its database of billions of images scraped from the open web, including social media sites, providing vastly greater search powers than other known facial recognition tools. “Some members of the Toronto Police Service began using Clearview AI in October 2019 with the intent of informally testing this new and evolving technology,” said TPS spokesperson Meaghan Gray. “The chief directed that its use be halted immediately upon his awareness, and the order to cease using the product was given on Feb. 5, 2020.” Gray said the Toronto Police Service has requested that Ontario’s Information and Privacy Commissioner and the Crown Attorney’s Office work with the force to review the technology’s appropriateness as a tool for law enforcement, “given that it is also used by other law enforcement agencies in North America.” “Until a fulsome review of the product is completed, it will not be used by the Toronto Police Service,” she said Thursday. A front-page New York Times story published in January first drew scrutiny to the previously little-known company’s broad powers and impacts on privacy. The report detailed how the company claims to have a database of over 3 billion images scraped from Facebook, YouTube and millions of other websites. Law enforcement officials who use Clearview AI can run an image of a person against this massive database, pulling up matches collected from across the web. People who have asked to try the technology on themselves pulled up images they didn’t know were online or had never seen before. 
Last May, when the Star first revealed that Toronto police were using facial recognition technology, the force said their tool only searched for matches in its own internal database of lawfully acquired mugshots. At the time, Staff Insp. Stephen Harris of Forensic Identification Services said “there are no plans to expand the TPS’s use of facial recognition beyond our current mugshot database. We are not judicially authorized to do so.”
Toronto police did not respond to further questions Thursday, including whether officers were judicially authorized to use Clearview AI, whether it had been used in investigations or arrests, and how chief Saunders was not aware of its use. Brenda McPhail, director of the privacy, technology, and surveillance project at the Canadian Civil Liberties Association, called Toronto police’s use of Clearview AI “a remarkable violation of public trust.” “Clearview AI collects images of people without consent, in violation of the terms of service of the platforms people trust to protect their information — arguably, illegally — and no police force in Canada should be using technology whose lawfulness is open to question,” McPhail said. “This company allegedly has developed its entire facial recognition system by illegitimately if not illegally scraping images from the public internet,” says Chris Parsons, a senior research associate at the University of Toronto’s Citizen Lab. If Toronto police or any other Canadian law enforcement agency did that directly, “it would be radically afoul of Canadian privacy legislation. Using services produced by companies predicated on violations of Canadian law seems like an inappropriate technology to adopt,” says Parsons. According to The New York Times, more than 600 law enforcement agencies use Clearview AI. Earlier this week, the Ontario Provincial Police and the Mounties both declined to answer the Star’s questions about whether they used Clearview AI. “The OPP has used facial recognition technology for various types of investigations,” OPP spokesperson Carolle Dionne said. “As its use is operational and specific to investigative technique we will not specify further.” “Generally, the RCMP does not comment on specific investigative tools or techniques,” said RCMP spokesperson Catherine Fortin.
“However, we continue to monitor new and evolving technology.” In the last month, YouTube, Facebook, Twitter, and LinkedIn have all demanded that the company stop using data scraped from their websites, according to media reports."

The entire story can be read at:
https://www.thestar.com/news/gta/2020/02/13/toronto-police-used-clearview-ai-an-incredibly-controversial-facial-recognition-tool.html
 
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog;
-----------------------------------------------------------------
FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."
Lawyer Radha Natarajan:
Executive Director: New England Innocence Project;