Tuesday, December 16, 2025

Technology: Question of the day: Edmonton, Alberta, police have released an AI-generated image to assist in identifying a deceased female. Ethical issues? By Reporter Stephanie Swensrude, who notes that: "The National Council of Canadian Muslims criticized the resulting image EPS released as racist because it created a “vague portrait” and a “generic image of a Black male,” a statement at the time read. “It is hard to overstate the absurdity of releasing a hypothetical, racialized portrait of a suspect to the public, while hoping such a tactic might lead to overall vigilance and perhaps an arrest,” the statement said. “In effect, the public is asked to ‘watch out’ for a person of a particular race, with some other physical traits thrown in as ranges (eg. height). It is racial profiling backed up by incomplete science.” EPS (Edmonton Police Service) has since apologized for releasing the 2022 photo."


PASSAGE OF THE DAY: "(Police spokesperson) Voordenhout said that when police have exhausted traditional investigative methods, a technologically enhanced image of a deceased person may be used to portray the individual. EPS consulted with the Office of the Chief Medical Examiner and with a forensic anthropologist before facial recognition experts in the digital forensics section created the AI-generated image. Voordenhout said neither the original photo of the deceased woman nor the police sketch was uploaded into the AI image generator; instead, the image was produced entirely through repeated prompts until it resembled the woman as accurately as possible."


STORY: "Police use of AI to create photo of dead woman ‘complicated’: Ethicist," by Reporter Stephanie Swensrude, published by Taproot on November 27, 2025. (Stephanie Swensrude attended NAIT's radio and television program and has worked at CBC, CFJC in Kamloops, and 630 CHED.)


GIST: "An artificial intelligence and data ethicist said she feels the Edmonton Police Service’s recent use of AI to create a photo of an unidentified deceased woman raises several questions."

“I think it’s complicated, and I’m trying to think about it in a very holistic way,” said Katrina Ingram, CEO of Ethically Aligned AI, an Edmonton company that helps organizations and individuals build and deploy ethical AI solutions.

As the use of AI becomes more common across all industries and sectors, including policing, the ethics of how these tools are applied is a growing conversation.

Edmonton police said a woman’s body was found in a waste bin in downtown Edmonton in late December 2024. Officers released sketches of the woman and her tattoo, as well as stock images of her jacket and boots, in an attempt to confirm her identity in March 2025. In November, EPS then used AI to create an “image that is an approximate likeness of the deceased female” in hopes that it would generate tips about her identity.

Ingram said there are ethical questions about how AI tools are built and function, due to the data that they acquire to generate images and other content. “Should we use these tools? And there are even some questions about the lawfulness of the data that was acquired to build these tools, which really starts to raise questions for law enforcement agencies, because as a law enforcement agency, you shouldn’t use a tool that was unlawfully made,” Ingram said.

Police spokesperson Cheryl Voordenhout told Taproot that EPS doesn’t release operational details like the specific software that police use. She said the digital forensics team used an AI model that was ingested into a secure EPS platform where no data is transmitted externally, and that the source code is publicly accessible, fully transparent, and legal to use.

Ingram said she thinks this particular case is ethical because the police were motivated to identify a deceased person when a traditional sketch had produced no answers. “They turned to AI, hoping that the more realistic version might be helpful,” she said. “That context matters in terms of what they did.” Additionally, Ingram said, EPS members could confirm whether the AI-generated photo actually resembled the deceased person.

But Ingram contrasted the ethics in this case with an instance in 2022, when EPS used DNA found at a crime scene to produce an approximate image of a suspect in an unsolved sexual assault. Ingram has used this example when giving talks related to AI and racial bias.

The National Council of Canadian Muslims criticized the resulting image EPS released as racist because it created a “vague portrait” and a “generic image of a Black male,” a statement at the time read. “It is hard to overstate the absurdity of releasing a hypothetical, racialized portrait of a suspect to the public, while hoping such a tactic might lead to overall vigilance and perhaps an arrest,” the statement said. “In effect, the public is asked to ‘watch out’ for a person of a particular race, with some other physical traits thrown in as ranges (eg. height). It is racial profiling backed up by incomplete science.” EPS has since apologized for releasing the 2022 photo.

Voordenhout said that when police have exhausted traditional investigative methods, a technologically enhanced image of a deceased person may be used to portray the individual. EPS consulted with the Office of the Chief Medical Examiner and with a forensic anthropologist before facial recognition experts in the digital forensics section created the AI-generated image. Voordenhout said neither the original photo of the deceased woman nor the police sketch was uploaded into the AI image generator; instead, the image was produced entirely through repeated prompts until it resembled the woman as accurately as possible.
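For readers curious what a locally hosted, prompt-only workflow of the kind Voordenhout describes might look like, here is a minimal illustrative sketch. It assumes an open-source text-to-image model (Stable Diffusion, loaded through the Hugging Face diffusers library) and invented prompt wording; EPS has not disclosed its software, so every name, model identifier, and parameter below is an assumption for illustration only, not a description of the EPS system.

import torch
from diffusers import StableDiffusionPipeline

# Load an open-source text-to-image checkpoint onto local hardware.
# The model identifier is an example; any locally downloaded open model would do.
pipe = StableDiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Each prompt is a revision written after reviewers compare the previous output
# against what is known about the deceased. No photograph or sketch is uploaded;
# the only input to the model is text. (Prompt wording here is invented.)
prompts = [
    "photorealistic portrait of a woman in her thirties, shoulder-length dark hair",
    "photorealistic portrait of a woman in her thirties, shoulder-length dark hair, rounder face, fuller cheeks",
]

generator = torch.Generator(device="cuda").manual_seed(42)  # fixed seed for repeatable runs
for i, prompt in enumerate(prompts):
    image = pipe(prompt, generator=generator, num_inference_steps=30).images[0]
    image.save(f"candidate_{i}.png")  # saved and reviewed offline; nothing leaves the machine

The point of the sketch is simply that such a workflow can run entirely on local hardware with text as its only input, which is consistent with the two claims police made: no data transmitted externally, and no uploading of the original photo or the sketch.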

Ingram said another layer of complexity is added by the potential shock or trauma that the image might cause to the woman’s loved ones.

“I’m trying to imagine the relatives and friends of this person seeing that particular image and how they might feel about that. Because it is trying to be in service of identifying a deceased person in order to bring some closure to loved ones, there might be a sense of relief (because they) can recognize that person, but there might also be a sense of trauma in looking at the person in that way,” Ingram said. “It’s very complicated, and it’s hard to know how they will actually react.”

Other North American police services are using AI to generate photos of unidentified people. Earlier this year, the Calgary Police Service used AI to create an image of a man who was found deceased by the Bow River.

Meanwhile, a police service in Arizona is using AI in an attempt to identify crime suspects. To create the images, a victim describes the suspect to a sketch artist as usual, but then the sketch is put into an AI image generator. The artist works with the victim to tweak the AI image to match what the victim remembers.
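Mechanically, that kind of workflow resembles an image-to-image step: the artist's sketch is supplied as a starting image, and a text prompt plus a "strength" setting control how far the model departs from it. The sketch below is illustrative only, assuming Stable Diffusion's img2img pipeline from the Hugging Face diffusers library; it is not the software any particular police service uses, and the file name, model identifier, and prompt are invented.

import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

# Load an open-source img2img pipeline locally (model identifier is an example).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Start from the artist's hand-drawn sketch (hypothetical file name).
sketch = Image.open("artist_sketch.png").convert("RGB").resize((512, 512))

# 'strength' controls how much the model is allowed to repaint the sketch;
# the artist and victim would adjust the prompt and strength between runs.
result = pipe(
    prompt="photorealistic portrait matching the sketch, short dark hair, narrow jaw",
    image=sketch,
    strength=0.6,
    guidance_scale=7.5,
).images[0]
result.save("ai_refined_sketch.png")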

Developers have created a platform to help create forensic sketches from scratch. But an ethicist told Vice in 2023, when the software was created, that using AI in police forensics can be dangerous, as it can reinforce existing racial and gender biases.

“The problem with traditional forensic sketches is not that they take time to produce (which seems to be the only problem that this AI forensic sketch program is trying to solve). The problem is that any forensic sketch is already subject to human biases and the frailty of human memory,” Jennifer Lynch, of the Electronic Frontier Foundation, said. “AI can’t fix those human problems, and this particular program will likely make them worse through its very design.”

The entire story can be read at:

https://edmonton.taproot.news/news/2025/11/27/police-use-of-ai-to-create-photo-of-dead-woman-complicated-ethicist

PUBLISHER'S NOTE:  I am monitoring this case/issue/resource. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic"  section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher: The Charles Smith Blog.

SEE BREAKDOWN OF  SOME OF THE ON-GOING INTERNATIONAL CASES (OUTSIDE OF THE CONTINENTAL USA) THAT I AM FOLLOWING ON THIS BLOG,  AT THE LINK BELOW:  HL:

https://www.blogger.com/blog/post/edit/120008354894645705/4704913685758792985

———————————————————————————————

FINAL WORD:  (Applicable to all of our wrongful conviction cases):  "Whenever there is a wrongful conviction, it exposes errors in our criminal legal system, and we hope that this case — and lessons from it — can prevent future injustices."

Lawyer Radha Natarajan:

Executive Director: New England Innocence Project;

—————————————————————————————————

FINAL, FINAL WORD: "Since its inception, the Innocence Project has pushed the criminal legal system to confront and correct the laws and policies that cause and contribute to wrongful convictions.   They never shied away from the hard cases — the ones involving eyewitness identifications, confessions, and bite marks. Instead, in the course of presenting scientific evidence of innocence, they've exposed the unreliability of evidence that was, for centuries, deemed untouchable." So true!


Christina Swarns: Executive Director: The Innocence Project;

-------------------------------------------------------------------