REVIEW: "Is it truth? Is it fiction? Hot Docs films find Hollywood counterparts," by Toronto Star film reviewed Peter Howell, published on April 30, 2017.
GIST: "Pre-Crime, one of the buzzed-about documentaries at Hot Docs 2017, opens with a reference to Steven Spielberg’s Minority Report, a fictional 2002 movie about invasive police methods. And well it might. Pre-Crime, by Germany’s Matthias Heeder and Monika Hielscher, takes its title and theme from the sinister mind-reading technique called “Precrime” in Spielberg’s film. Cops use it to arrest people before they break the law, privacy rights and due process be damned. The scary thing is that while Spielberg’s movie — adapted from a 1956 Phillip K. Dick short story — was science fiction set in years to come, Pre-Crime is science fact set in the here and now. Police worldwide are now busily using computer algorithms to stalk not just potential criminals but also past victims of crime, with outcomes both good and bad. The divide between documentary fact and dramatic fiction has never seemed flimsier, especially at this year’s Hot Docs festival, now on. Parallels between real life and classic Hollywood narratives can be drawn in multiple instances, sometimes worrisomely so: The future is now in Pre-Crime’s awesomely alarming canvass of police agencies around the globe that use advanced satellite surveillance, closed-circuit cameras and computer algorithms to calculate potential crime zones and the people associated with them. Chicago has its “Heat List,” Hamburg has a software program called “Beware” and London goes full sci-fi with its “Matrix” system. We also learn how the FBI and other covert agencies count “likes” on social media to suss out prejudices. All are designed to spot trouble before it happens, and it seems on the surface to be entirely altruistic — who wouldn’t want to stop a budding terrorist? But privacy and civil right advocates warn of abuse, which is already occurring: Pre-Crime interviews two men, one in the U.S. and one in England, whose lives have been made hell by police surveillance. There’s a huge potential for computers misidentifying people and misreading intentions because, as one expert puts it, “code doesn’t have a conscience.” In Spielberg’s neo-noir thriller Minority Report, set in the Washington, D.C. of 2054, Tom Cruise heads up a police Precrime unit that uses psychics (called “precogs”) to predict future criminals, who are then apprehended with the aid of advanced computer and 3D mapping technology. But when Cruise’s character is named by the precogs as a future murderer, possibly due to system tampering, he’s forced to flee and fight to prove his innocence of a crime that hasn’t even happened."
https://www.thestar.com/entertainment/movies/2017/04/28/is-it-truth-is-it-fiction-hot-docs-films-find-hollywood-counterparts.html
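The review describes systems like Chicago's "Heat List" only at a high level. As a purely hypothetical illustration of what a person-level risk score can look like, here is a toy Python sketch. Every feature, weight, and threshold below is invented; the point is the reviewer's own, that "code doesn't have a conscience": the arithmetic mechanically applies whatever values humans chose.

```python
# Hypothetical, deliberately simplified sketch of the kind of person-level
# "risk score" the review describes. Real systems (Chicago's "Heat List",
# Hamburg's "Beware", London's "Matrix") are proprietary and far more
# complex; every feature, weight, and threshold here is invented.

from dataclasses import dataclass

@dataclass
class PersonRecord:
    prior_arrests: int
    flagged_associates: int      # associates already on the list
    recent_victimizations: int   # the film notes past victims are scored too

# The weights encode human choices; this is where bias enters the system.
WEIGHTS = {
    "prior_arrests": 2.0,
    "flagged_associates": 1.5,
    "recent_victimizations": 1.0,
}
THRESHOLD = 5.0  # another human choice

def risk_score(p: PersonRecord) -> float:
    return (WEIGHTS["prior_arrests"] * p.prior_arrests
            + WEIGHTS["flagged_associates"] * p.flagged_associates
            + WEIGHTS["recent_victimizations"] * p.recent_victimizations)

def on_heat_list(p: PersonRecord) -> bool:
    # The threshold is applied mechanically: the code has no conscience,
    # no context, and no notion of due process.
    return risk_score(p) >= THRESHOLD

# A person with no arrests at all can still be flagged, purely through
# association and through having been a victim of crime.
bystander = PersonRecord(prior_arrests=0, flagged_associates=2,
                         recent_victimizations=2)
print(risk_score(bystander), on_heat_list(bystander))  # 5.0 True
```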
STORY: "Taser will use police body camera videos to anticipate criminal activity, by Ava Kofman, published by The Intercept on April 30, 2017.
GIST: When civil liberties advocates discuss the dangers of new policing technologies, they often point to sci-fi films like “RoboCop” and “Minority Report” as cautionary tales. In “RoboCop,” a massive corporation purchases Detroit’s entire police department. After one of its officers gets fatally shot on duty, the company sees an opportunity to save on labor costs by reanimating the officer’s body with sleek weapons, predictive analytics, facial recognition, and the ability to record and transmit live video.

Although intended as a grim allegory of the pitfalls of relying on untested, proprietary algorithms to make lethal force decisions, “RoboCop” has long been taken by corporations as a roadmap. And no company has been better poised than Taser International, the world’s largest police body camera vendor, to turn the film’s ironic vision into an earnest reality.
In 2010, Taser’s longtime vice president Steve Tuttle “proudly predicted” to GQ that once police can search a crowd for outstanding warrants using real-time face recognition, “every cop will be RoboCop.” Now Taser has announced that it will provide any police department in the nation with free body cameras, along with a year of free “data storage, training, and support.” The company’s goal is not just to corner the camera market, but to dramatically increase the video streaming into its servers.

With an estimated one-third of departments using body cameras, police officers have been generating millions of hours of video footage. Taser stores terabytes of such video on Evidence.com, in private servers operated by Microsoft, to which police agencies must continuously subscribe for a monthly fee. Data from these recordings is rarely analyzed for investigative purposes, though, and Taser — which recently rebranded itself as a technology company and renamed itself “Axon” — is hoping to change that.

Taser has started to get into the business of making sense of its enormous archive of video footage by building an in-house “AI team.” In February, the company acquired a computer vision startup called Dextro and a computer vision team from Fossil Group Inc. Taser says the companies will allow agencies to automatically redact faces to protect privacy, extract important information, and detect emotions and objects — all without human intervention. This will free officers from the grunt work of manually writing reports and tagging videos, a Taser spokesperson wrote in an email. “Our prediction for the next few years is that the process of doing paperwork by hand will begin to disappear from the world of law enforcement, along with many other tedious manual tasks.” Analytics will also allow departments to observe historical patterns in behavior for officer training, the spokesperson added. “Police departments are now sitting on a vast trove of body-worn footage that gives them insight for the first time into which interactions with the public have been positive versus negative, and how individuals’ actions led to it.”

But looking to the past is just the beginning: Taser is betting that its artificial intelligence tools might be useful not just to determine what happened, but to anticipate what might happen in the future. “We’ve got all of this law enforcement information with these videos, which is one of the richest treasure troves you could imagine for machine learning,” Taser CEO Rick Smith told PoliceOne in an interview about the company’s AI acquisitions. “Imagine having one person in your agency who would watch every single one of your videos — and remember everything they saw — and then be able to process that and give you the insight into what crimes you could solve, what problems you could deal with. Now, that’s obviously a little further out, but based on what we’re seeing in the artificial intelligence space, that could be within five to seven years.”

As video analytics and machine vision have made rapid gains in recent years, the future long dreaded by privacy experts and celebrated by technology companies is quickly approaching. No longer is the question whether artificial intelligence will transform the legal and lethal limits of policing, but how and for whose profits.
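To make the automatic face redaction Taser describes concrete, here is a minimal sketch of the generic technique (face detection followed by blurring), using OpenCV's stock Haar-cascade detector. This is not Axon's actual pipeline: every parameter below is an illustrative choice, and detectors of this kind routinely miss faces, one reason fully automated redaction remains contested.

```python
# Minimal, illustrative sketch of automated face redaction of the kind
# described above. NOT Axon's pipeline; it relies on OpenCV's bundled
# Haar-cascade face detector, and all parameters are arbitrary choices.

import cv2

# Load the bundled frontal-face detector once.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def redact_faces(frame):
    """Blur every detected face in a BGR frame, in place."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in FACE_CASCADE.detectMultiScale(gray, 1.1, 5):
        roi = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur makes the region unrecognizable.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 30)
    return frame

def redact_video(src_path, dst_path):
    """Run redaction over every frame of a clip, e.g. body-cam footage."""
    cap = cv2.VideoCapture(src_path)
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"),
                          cap.get(cv2.CAP_PROP_FPS) or 30.0, size)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(redact_faces(frame))
    cap.release()
    out.release()
```

Note that any face the detector misses stays unblurred, so redaction "without human intervention" quietly shifts the privacy guarantee onto the detector's error rate.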
…Taser isn’t the only company selling agencies on its powers of speculation. A spokesperson for the Russian company Ntechlab told me that its high-performing facial recognition algorithm is able to detect “abnormal and suspicious behavior of people in certain areas.” Several major face recognition companies have already been teaching their systems to detect anomalous behaviors in crowds. Earlier this year, IBM, which has spent over $14 billion on predictive policing, advertised that its Deep Learning Engine could pinpoint the location and identity of suspects in real time. And for the last several years, researchers funded by the Defense Advanced Research Projects Agency have been developing “automated suspicion algorithms” to predict and analyze behavior from videos, text, and online images.

But because Taser is the market leader for video recording hardware, with relationships with an estimated 17,000 of the country’s 18,000 police departments, its research investments have an outsized influence on law enforcement tactics. In an interview in Taser’s future of policing report, a senior data architect at Microsoft envisions a future in which officers receive alerts when “an individual has a known criminal record, or propensity to violence. Even if [the suspect] has not yet adopted a threatening posture, it heightens the overall threshold of awareness.” Taser CEO Rick Smith discussed a similar vision in a recent FastCompany profile, explaining that real-time artificial intelligence technology could have aided the officer who killed Philando Castile, the 32-year-old African-American man driving with his girlfriend and her 4-year-old daughter, by alerting him to the fact that Castile had a gun license and no violent criminal record.

Legal experts and surveillance watchdogs caution, however, that any company that automates recommendations about threat assessments and suspicions may transform policing tactics for the worse. Hamid Khan, lead organizer for the Stop LAPD Spying Coalition, contends that feeding police information in real time about an individual’s prior records may only encourage more aggressive conduct with suspects. “We don’t have to go very far into deep learning,” he said, for evidence of this phenomenon. “We just have to look at the numbers that already exist for suspicious activity reporting, which doesn’t even require [advanced] analytics.” He noted that when the LAPD’s Suspicious Activity Reporting program, which relied on analog human tips, was audited by the city’s inspector general, it determined that black women residents were being disproportionately flagged.

The problem with any suspicious activity reporting, automated or not, is that suspicion always lies in the eye of the beholder. As The Intercept reported in February, the Transportation Security Administration’s own research showed that the agency’s program to detect suspicious behavior in travelers was unscientific, unreliable, and dependent on racial stereotypes. Christoph Musik, an expert in computer vision from the University of Vienna, has written extensively about the human assumptions built into such systems. Hunches are always subjective, he points out, unlike evaluating the proposition of whether or not an object is a cat. “It is extremely difficult to formulate universal laws of behavior or suspicious behavior, especially if we focus on everyday behavior on a micro level,” Musik wrote in an email. “‘Smart’ or ‘intelligent’ systems claiming to recognize suspicious behavior are not as objective or neutral as they [seem].”
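For a concrete sense of what "detecting anomalous behaviors" can mean in practice, here is a small hypothetical sketch using scikit-learn's IsolationForest, under the (large) assumption that behavior has already been reduced to numeric features. It is not any vendor's or DARPA's actual system; it simply illustrates Musik's point that "suspicious" often reduces to "statistically rare in the training data".

```python
# Hypothetical sketch of behavioral "anomaly detection". Assumes video
# analysis has already turned each person into a feature vector; the
# features here (seconds stationary, walking speed in m/s) are invented.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Simulated crowd: most people pause ~30 s and walk ~1.4 m/s.
crowd = rng.normal(loc=[30.0, 1.4], scale=[10.0, 0.2], size=(500, 2))

# A street musician: stationary for 10 minutes, barely moving. Rare,
# but entirely innocent behavior.
street_musician = np.array([[600.0, 0.1]])

model = IsolationForest(contamination=0.01, random_state=0).fit(crowd)

# predict() returns -1 for "anomalous". The model flags the musician
# simply because the behavior is statistically rare in the training
# data; rarity is the only notion of "suspicion" the code has, and
# everything beyond that is human interpretation layered on top.
print(model.predict(street_musician))  # [-1]
```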
Predictions aside, the mere ability to trawl for evidence from body-worn camera footage also widens the range of “potentially suspicious persons” who can be contacted by law enforcement, according to Elizabeth Joh, a legal scholar of policing. “It’s a pretty radical expansion of the kind of discretion law enforcement has.” At such an indiscriminate scale, all kinds of insights and individuals get swept into an automated investigation process. “Once you’ve created a giant video database, it’s possible to search and re-search it, it’s not clear that there are any legal limits,” she said, since the Fourth Amendment focuses on the point of collection. “Generally speaking, there aren’t too many rules on what the police can do after they collect the information.”

Private Predictions:

Despite prominent civil rights groups highlighting the need for comprehensive policies, state- and local-level legislation has lagged in regulating who can access body-worn camera footage, how long it is stored, and who gets to see it. But the biggest impediment to making sure body-worn camera footage remains accountable might be the manufacturers themselves. Nondisclosure agreements allow private companies like Taser to defend their proprietary computing systems from public scrutiny, Joh explained. “Typically we think we have oversight into what police can do,” said Joh. “Now we have [a] third-party intermediary, they have a kind of privacy shield, they’re not subject to state public record laws, and they have departments sign contracts that they are going to keep this secret.”

As privately owned policing tactics become increasingly black-boxed, citizens will have no recourse to uncover how they ended up on their city’s list of suspicious persons or the logic guiding an algorithm’s decisions. In “RoboCop,” for instance, a secret rule prohibits the robot from arresting any of the owner-corporation’s board members. Or take the case of the criminal justice consulting firm Northpointe. A ProPublica investigation found that Northpointe’s algorithm for calculating the risk of recidivism was twice as likely to incorrectly decide that black defendants were at a higher risk of committing future crimes. But while reporters were able to analyze the questionnaires used by the company, which disputed ProPublica’s findings, they were unable to analyze Northpointe’s proprietary software.

Because the algorithms for these systems are often not disclosed, a judge would have no way of evaluating the likelihood of a false match when presented with investigative evidence about a suspect’s crime. Civil liberties experts find this especially disconcerting given that machine learning systems make probabilistic, rather than binary, judgments. Amazon mistakenly predicting that you desire more toilet paper has vastly different implications for individual liberty than a private technology company’s cloud mistakenly telling an officer, with indefinite certainty, to react lethally to a seemingly aggressive suspect.

“Body cameras are really just a story about private influence on public policing,” Joh said. “Whoever captures the audience first wins. And Taser is capturing the entire market. They get to shape the language that we use, they get to set the agenda, they get to say ‘this is possible’ and therefore the police can do it.”
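The probabilistic-versus-binary point above is easy to see in code. Below is a toy sketch (synthetic data, invented features, no relation to any vendor's model) showing how a classifier's probability becomes a hard yes/no alert only once someone picks a threshold, discarding the uncertainty at exactly the step where it matters most.

```python
# Toy sketch of the probabilistic-versus-binary distinction. All data
# and features are synthetic; this reflects no vendor's actual model.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# 200 made-up "behavior" vectors with noisy labels: even the ground
# truth used for training is uncertain.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + rng.normal(scale=1.5, size=200) > 0).astype(int)

clf = LogisticRegression().fit(X, y)

# The model's honest output is a probability, not a verdict.
suspect = rng.normal(size=(1, 3))
p = clf.predict_proba(suspect)[0, 1]
print(f"model output: {p:.0%} probability of 'threat'")

# A hard threshold turns 55% and 99% into the same alert. The choice
# of threshold is a human policy decision, and the uncertainty is
# discarded at the moment the alert reaches the officer.
ALERT_THRESHOLD = 0.5
print("ALERT" if p >= ALERT_THRESHOLD else "no alert")
```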
The entire story can be found at:
https://theintercept.com/2017/04/30/taser-will-use-police-body-camera-videos-to-anticipate-criminal-activity/
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog.