PUBLISHER'S NOTE ONE: MEA CULPA: Oops. I always suspected it would happen. But not so quickly. Not so soon. There I was, sitting at my breakfast, dipping into my favourite newspaper, the Toronto Star, my home for many years, over a cup of Maxwell instant. (Forgot to re-order the Nespresso.) And there it is - on page one, above the fold. Big, black, bold heading: "Toronto cops using facial recognition technology: Police say it helps ID suspects more efficiently, but critics are wary of how tool can be used." For shame, Levy. This is happening in your own backyard. And you didn't warn your readers! Well, in my defence, I had no idea. This didn't start with a study, a report to the police board, or consultations with the Canadian Civil Liberties Association and other community groups. Nada. Just a story in the Toronto Star. (Did I say it was my favourite newspaper?) Long after the horse is out of the barn. (Just as Torontonians only discovered that the Toronto force had moved to sleek, furtive, superhero, futuristic police cars - the kind you don't notice until it's too late and you've been pulled over - when more and more people were pulled over by them! "Sorry," the chief said sheepishly. "Maybe I should have told you." Now they are all over the place.) And now this. Levy, you wrote about the hidden expansion of facial recognition and other applications - deployed without consideration of effectiveness, risk and invasions of privacy - in China, India, North America, South America, the Philippines and elsewhere, while governments sit by and do nothing except bemoan the fact and schedule legislative committee meetings where legislators can grandstand and display their ignorance, doing nothing as the technologies spread, unexamined, unchecked, like viruses. For shame, Levy. You have to do better. End of rant... Read the Story.
Harold Levy: Publisher: The Charles Smith Blog.
-----------------------------------------------------------
PUBLISHER'S NOTE TWO: In recent years, I have found myself publishing more and more posts on the application of artificial intelligence technology to policing, public safety, and the criminal justice process, not just in North America but in countries all over the world, including China. Although I accept that properly applied science can play a positive role in our society, I have learned over the years that technologies introduced for the so-called public good can eventually be used against the people they were supposed to benefit. As reporter Sieeka Khan writes in Science Times: "In 2017, researchers sent a letter to the secretary of the US Department of Homeland Security. The researchers expressed their concerns about a proposal to use AI to determine whether someone who is seeking refuge in the US would become a positive and contributing member of society or if they are likely to become a threat or a terrorist. Other government uses of AI are also being questioned, such as attempts at setting bail amounts and sentences for criminals, predictive policing and hiring government workers. All of these attempts have been shown to be prone to technical issues, and limits on the data can bias their decisions on the basis of gender, race or cultural background. Other AI technologies like automated surveillance, facial recognition and mass data collection are raising concerns about privacy, security, accuracy and fairness in a democratic society. As Trump's executive order demonstrates, there is a massive interest in harnessing AI for its full, positive potential. But the dangers of misuse, bias and abuse, whether intentional or not, have the chance to work against the principles of international democracies. As the use of artificial intelligence grows, the potential for misuse, bias and abuse grows as well."
The purpose of this 'technology' series is to highlight the dangers of artificial intelligence - and to help readers make their own assessments as to whether these innovations will do more harm than good.
----------------------------------------------------------
QUOTE OF THE DAY: "This technology is being put in place without any legislative oversight and we need to hit the pause button," NDP MP Charlie Angus told the Star on Monday. The technology is "moving extremely fast," said Angus, who is examining the ethics of artificial intelligence as part of a House of Commons Standing Committee on Access to Information, Privacy and Ethics. In San Francisco, city officials banned the use of the technology by police and other agencies earlier this month, citing concerns about potential misuse by government authorities. The city's police department had previously been using the tool, but stopped in 2017. One of the city legislators told reporters that the move was about having security without becoming a security state.
----------------------------------------------------------
STORY: "Toronto police have been using facial recognition technology for more than a year," by Kate Allen, Science and Technology Reporter, and Wendy Gillis, Crime Reporter, published by The Toronto Star on May 28, 2019.
GIST: "Toronto police have been using facial recognition technology for more than a year — a tool police say increases the speed and efficiency of criminal investigations and has led to arrests in major crimes including homicides. But the emerging technology — which relies on artificial intelligence — has generated enough privacy and civil liberty concerns that the city of San Francisco, worried about police and state overreach, recently became the first U.S. city to ban the tool.

Toronto police say that facial recognition technology is being used to compare images of potential suspects captured on public or private cameras to its internal database of approximately 1.5 million mugshots. According to a report submitted by Chief Mark Saunders to the Toronto police services board, the technology is generating leads in investigations, particularly as a growing number of crimes are being captured on video through surveillance cameras. Since the system was purchased in March 2018 — at a cost of $451,718 plus annual maintenance and support fees — officers have conducted 2,591 facial recognition searches. The report was submitted in advance of Thursday's board meeting.

The goal of purchasing the system was to identify suspects more efficiently and quickly, including violent offenders. It will also help police conclude major investigations with fewer resources and help tackle unsolved crimes, Saunders said. Funding for the system was provided through a provincial policing modernization grant.

But critics are wary of facial recognition technology for reasons including its potential to be misused by police or other government agencies as technological advancements outpace oversight. "This technology is being put in place without any legislative oversight and we need to hit the pause button," NDP MP Charlie Angus told the Star on Monday. The technology is "moving extremely fast," said Angus, who is examining the ethics of artificial intelligence as part of a House of Commons Standing Committee on Access to Information, Privacy and Ethics.

In San Francisco, city officials banned the use of the technology by police and other agencies earlier this month, citing concerns about potential misuse by government authorities. The city's police department had previously been using the tool, but stopped in 2017. One of the city legislators told reporters that the move was about having security without becoming a security state.

According to Saunders' report, Toronto police ran 1,516 facial recognition searches between March and December last year, using approximately 5,000 still and video images. The system was able to generate potential mugshot matches for about 60 per cent of those images. About 80 per cent of those matches led to the identification of criminal offenders. The total number of arrests the technology has generated is undetermined, the report states, because unlike fingerprint matches, the facial recognition tool only identifies potential candidates and arrests are made only after further investigation produces more evidence. "Many investigations were successfully concluded due to the information provided to investigators, including four homicides, multiple sexual assaults, a large number of armed robberies and numerous shooting and gang related crimes," Saunders wrote.

Other jurisdictions have come under fire for using facial recognition on crowds in real time, such as scanning attendees at major sports events to identify the subjects of outstanding warrants and arrest them on the spot. In emailed responses to questions by the Star, Staff Inspector Stephen Harris, Forensic Identification Services, said Toronto police have no plans to extend the use of facial recognition technology beyond comparisons to its pre-existing mugshot database. Harris and Saunders both emphasized that Toronto police does not use real-time facial recognition and has no legal authorization to do so.

Last year, Toronto police used the technology during their investigation into serial killer Bruce McArthur. Investigators located what they believed was a post-mortem image of an unknown man on McArthur's computer. Hoping to identify him, they used the software and found, within their police database, a mugshot image of Dean Lisowick. In documents filed with the courts during the McArthur probe, an investigator notes that there were "undeniable physical similarities" between the two images, including distinctive moles. A relative of Lisowick's later confirmed the match, and police charged McArthur in Lisowick's death three days later.

Canadians need to have a discussion about what are legitimate uses of the technology — and what aren't, said Angus. Police using the technology to identify someone caught committing a crime on surveillance footage is reasonable, Angus said, but measures need to be put in place to stop what is determined to be unacceptable use, such as real-time monitoring at a rally.

Research has shown that facial recognition technology has racialized false positive rates: some systems are more likely to produce an inaccurate match for Black women than white men, for example. "This strikes me as particularly important, given all the concerns around carding and other kinds of ethnic and racialized surveillance that have taken place by TPS in the past," said Chris Parsons, research associate at the University of Toronto's Citizen Lab. Asked by the Star about its false positive rate overall and for different racial and ethnic groups, Harris said that Toronto police "does not use facial recognition to make a positive ID. Suspect identifications are only made after further investigation and evidence leads us to that conclusion."

Civil liberties advocates also appreciated that Toronto police were disclosing details about their facial recognition technology, but wondered why it took so long. The force began a year-long pilot project for the technology in September 2014, and sent four forensic officers for training at an FBI division in West Virginia before that. "The fact that there has been very little — virtually no — public conversation about the fact that this is happening, despite the fact that they've been looking into it for at least the past five years ... raises questions for me," says Brenda McPhail, director of the privacy, technology and surveillance project at the Canadian Civil Liberties Association. "Being open and accountable and transparent about the ways that new surveillance technologies are being integrated into municipal policing is essential to maintaining public trust, and to enable the kinds of conversations that can help Toronto police understand the concerns of city residents."

Saunders' report also says that the force conducted a Privacy Impact Assessment for the technology in 2017. The system is only used in criminal investigations, and the only officers with access to it are six FBI-trained personnel. No other databases besides lawfully obtained mugshots are used.

Calgary Police Service was the first Canadian force to begin using the technology. In 2014, it signed a contract for its facial recognition software and said the technology would be used as an "investigative tool" to compare photos and videos from video surveillance against the service's roughly 300,000 mugshot images.

Asked if images captured by Toronto police's body-worn cameras could be used with the facial recognition system, Harris said investigators could only do so if a suspect was on camera committing a criminal offence. In that case, investigators would still have to get the court's permission to use the facial technology during the probe.

The Toronto police board is scheduled to hear discussion of Saunders' report Thursday."
The entire story can be read at:
https://www.thestar.com/news/gta/2019/05/28/toronto-police-chief-releases-report-on-use-of-facial-recognition-technology.html
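A note on the numbers: the story's hit-rate percentages are easier to weigh in absolute terms. The sketch below is my own back-of-the-envelope arithmetic, not figures from Saunders' report, and it assumes (as the Star's wording suggests) that the 60 and 80 per cent figures apply to the roughly 5,000 images submitted between March and December 2018:

```python
# Rough arithmetic implied by the Star's summary of Saunders' report.
# Assumption (mine, not the report's): the percentages apply to the
# ~5,000 still and video images, per the article's phrasing.
images_submitted = 5000   # images used in 1,516 searches, March-Dec 2018
match_rate = 0.60         # share of images with a potential mugshot match
ident_rate = 0.80         # share of matched images leading to an identification

images_with_matches = round(images_submitted * match_rate)
identifications = round(images_with_matches * ident_rate)

print(images_with_matches)  # ~3,000 images with candidate matches
print(identifications)      # ~2,400 identifications of criminal offenders
```

If the percentages instead apply to the 1,516 searches, the implied counts shrink accordingly; the report itself, notably, does not say how many arrests resulted.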
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy: Publisher; The Charles Smith Blog.