Monday, December 17, 2018

Back in action: On-going: "Computerized facial recognition" (Part One)... Author David Owen: The New Yorker: "Should We Be Worried About Computerized Facial Recognition?"..."People who have grown up with smartphones and social media may think that the very concept of personal privacy has become quaintly irrelevant, but there are reasons for even habitual oversharers to be alarmed. Faces, unlike fingerprints or iris patterns, can easily be recorded without the knowledge of the people they belong to, and that means that facial recognition can be used for remote surveillance. “We would be horrified if law-enforcement agents were to walk through a protest demanding that everybody show their identification,” Garvie said. “Yet that’s what face recognition enables.” Computer-vision systems potentially allow cops and employers to track behaviors and activities that are none of their business, such as where you hang out after work, which fund-raisers you attend, and what that slight tremor in your hand (recorded by the camera in the elevator that you ride to your office every morning) portends about the size of your future medical claims. In October, Tim Cook, the C.E.O. of Apple, while speaking at a privacy conference in Brussels, said, “Our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.”"


PASSAGE OF THE DAY..."Garvie doesn’t doubt that facial recognition has legitimate uses in law enforcement, just as wiretaps and personal searches do. But misuse is inevitable. “Right now, quite literally, there’s no such thing as face-recognition abuse, in one sense, because there are really no laws governing its use by police,” she said. If your face appears in an accessible database, as it probably does, you’re effectively a suspect every time it’s searched. And you don’t have to be a cop to have access to the millions of photos on social-media sites—many of which are labelled automatically. (This is less of a threat to happen in Canada and Europe, where comprehensive privacy laws have prevented social-media sites from even offering automated photo-tagging.) Garvie and her colleagues have written a fourteen-page model bill intended to regulate the use of facial-recognition technology in law enforcement. Among many other things, it would require the custodians of arrest-photo databases to regularly purge images of people who are not later convicted of whatever act it was that prompted their arrest. Their first version of the bill was published in 2016; no legislature has adopted it. People who have grown up with smartphones and social media may think that the very concept of personal privacy has become quaintly irrelevant, but there are reasons for even habitual oversharers to be alarmed. Faces, unlike fingerprints or iris patterns, can easily be recorded without the knowledge of the people they belong to, and that means that facial recognition can be used for remote surveillance. “We would be horrified if law-enforcement agents were to walk through a protest demanding that everybody show their identification,” Garvie said. “Yet that’s what face recognition enables.”"

--------------------------------------------------------------------

PUBLISHER'S NOTE:

Two things got me thinking about rapidly evolving facial recognition technology - and its implications for criminal justice systems - earlier today. Thing one: a brief news report that singer Taylor Swift used facial recognition at a Rose Bowl concert to help identify stalkers. Thing two: a very informative New Yorker article which asks whether we should be worried about computerized facial recognition. A theme of the article is that the technology could revolutionize policing, medicine, even agriculture - but its applications can easily be weaponized. One thought: if Taylor Swift can secretly scan concertgoers' faces and then shoot those pictures back to a "command post" in Nashville, Tennessee, how likely is it that courthouse security personnel in North America and elsewhere will begin secretly scanning the faces of all people entering courthouses - or even do so overtly? This could well discourage people from entering courthouses (where they have a choice), knowing their image will be scanned and compared against massive databases (with all the risk of error and bias that entails) - under the watchful eyes of the police. Oooops. I don't want to give them ideas. My worry is that they don't need them.

Harold Levy: Publisher; The Charles Smith Blog.

----------------------------------------------------------------------

THING ONE: 

STORY: "Taylor Swift Used Facial Recognition Technology at Rose Bowl Concert to Help ID Stalkers," by reporter Jenn Gidman, published by Newser  on December 13, 2018.

 GIST: "She knew you were trouble when you walked in—and so she installed facial recognition technology to flag you. Per Rolling Stone, Taylor Swift made use in May of a special kiosk at her California Rose Bowl show that secretly scanned concertgoers' faces, then shot those pictures back to a "command post" in Nashville, Tenn. There, the faces were compared with those of "hundreds" of Swift's known stalkers. Intel on the surreptitious spying kiosk, which was showing clips of the pop star rehearsing to lure spectators so their faces could then be photographed, comes via a security expert who says he was invited to watch a demo of the system at the Rose Bowl show by the kiosk manufacturer; Swift's own reps are keeping tight-lipped about it. Mashable calls it a "creepy" yet "understandable" use of the technology, considering the trouble Swift has had with alleged stalkers in the past. Still, it raises a few questions, including whether the photos will be kept, for how long, and what else they might be used for; per Quartz, it's unclear whether any stalkers were identified at the Swift event. The Verge notes that concerts are typically private events, meaning organizers can legally set up practically any kind of surveillance. Swift's fans aren't the only ones who may be undergoing Terminator-like scanning: Rolling Stone notes facial recognition technology is increasingly being used at arenas and stadiums, not only for safety, but also for efficiency—by keeping tabs on how quickly guests move through a venue, event organizers can set up systems to keep the flow streamlined. "It holds a lot of promise," says a rep for Ticketmaster, which plans on tapping the tech. "We're just being very careful about where and how we implement it." (One person who may not be attending a Swift concert anytime soon: President Trump.)"


The entire story can be read at:
http://www.newser.com/story/268521/taylor-swift-scanned-fans-faces-at-recent-show.html

THING TWO:


STORY:"Should we be worried about computerized facial recognition," by David Owen, published in The New Yorker. " (David Owen is a staff writer and the author of “Where the Water Goes: Life and Death Along the Colorado River,” based on his article “Where the River Runs Dry,” which appeared in the May 25, 2015, issue of the magazine.)

GIST: Here are several excerpts from this lengthy, insightful article. But these are just a taste. It is worth reading in full. (HL): "In 2016, Joy Buolamwini, a researcher at the M.I.T. Media Lab, gave a TEDx talk in Boston. Buolamwini is black. In her presentation, she played a video showing that a common facial-recognition algorithm didn’t even recognize her face as a face—until she covered it with a featureless white plastic mask. (The same thing happened to her in college, when she had to “borrow” a white roommate to complete an artificial-intelligence assignment.) Computer-vision algorithms, she explained, inevitably recapitulate the conscious and unconscious biases of the people who create and train them, a defect that she calls the “coded gaze.” I spoke with her recently. “There’s an assumption of machine neutrality,” she said. “And there’s actually a hope that the technology we create will be less biased than we are. But we don’t equip these systems to overcome our prejudices.” Gender Shades, a project she directed at M.I.T., showed that dark-skinned females are far more likely than light-skinned males to be misidentified by commercial facial-analysis systems. She has founded the Algorithmic Justice League, which employs multiple approaches to identifying and eliminating biases in artificial intelligence, and, with a grant from the Ford Foundation, she created “A.I., Ain’t I a Woman?,” a poetic multimedia presentation. In 2012, the New York Police Department implemented what it calls the Domain Awareness System, which it developed in partnership with Microsoft (and from which it earns a royalty when other cities adopt it). The system uses thousands of public-facing surveillance cameras, including many owned by private businesses. One afternoon in September, I sat on a bench in front of One Police Plaza, the N.Y.P.D.’s headquarters, with Clare Garvie, who is a senior associate at the Center on Privacy and Technology, at Georgetown Law School, in Washington. From where we were sitting, I could see two cops in a brick security booth. Like most bored people nowadays, they were staring at their phones, but their inattention didn’t matter, because the plaza was being watched by a dozen or so building-mounted cameras, most of which looked like larger versions of the ones that Cainthus uses on cows: dark domes that resembled light fixtures. I asked Garvie what the police were doing with whatever the cameras were recording, and she said there was no way to know. “The N.Y.P.D. has resisted our efforts to get any information about their technology,” she said. It was only after the center sued the department that it began to receive documents that it had initially requested more than two years earlier. By contrast, San Diego publishes reports on the facial-recognition system used by its police and holds public meetings about it. Last year, the Seattle City Council passed a comprehensive ordinance requiring disclosure of the city’s surveillance technologies; this year, it voted to physically dismantle a network of video cameras and cell-phone trackers, installed in 2013, that was like a smaller version of the Domain Awareness System. But most big cities don’t reveal much about what they’re up to, and no federal law requires them to do so. Chicago and Los Angeles are as secretive as New York, and have put off attempts by Garvie’s group, the American Civil Liberties Union, and other organizations to learn more. Garvie is thirty-one.
She majored in political science and human rights at Barnard, earned a law degree at Georgetown, and stayed on, after graduation, as a law fellow. In 2016, she was the lead author of “The Perpetual Line-Up: Unregulated Police Face Recognition in America,” a study whose title refers to the fact that many states allow police departments to search their databases of mug shots and driver’s-license photos. Garvie doesn’t doubt that facial recognition has legitimate uses in law enforcement, just as wiretaps and personal searches do. But misuse is inevitable. “Right now, quite literally, there’s no such thing as face-recognition abuse, in one sense, because there are really no laws governing its use by police,” she said. If your face appears in an accessible database, as it probably does, you’re effectively a suspect every time it’s searched. And you don’t have to be a cop to have access to the millions of photos on social-media sites—many of which are labelled automatically. (This is less of a threat to happen in Canada and Europe, where comprehensive privacy laws have prevented social-media sites from even offering automated photo-tagging.) Garvie and her colleagues have written a fourteen-page model bill intended to regulate the use of facial-recognition technology in law enforcement. Among many other things, it would require the custodians of arrest-photo databases to regularly purge images of people who are not later convicted of whatever act it was that prompted their arrest. Their first version of the bill was published in 2016; no legislature has adopted it. People who have grown up with smartphones and social media may think that the very concept of personal privacy has become quaintly irrelevant, but there are reasons for even habitual oversharers to be alarmed. Faces, unlike fingerprints or iris patterns, can easily be recorded without the knowledge of the people they belong to, and that means that facial recognition can be used for remote surveillance. “We would be horrified if law-enforcement agents were to walk through a protest demanding that everybody show their identification,” Garvie said. “Yet that’s what face recognition enables.” Computer-vision systems potentially allow cops and employers to track behaviors and activities that are none of their business, such as where you hang out after work, which fund-raisers you attend, and what that slight tremor in your hand (recorded by the camera in the elevator that you ride to your office every morning) portends about the size of your future medical claims. In October, Tim Cook, the C.E.O. of Apple, while speaking at a privacy conference in Brussels, said, “Our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.”"
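
A technical footnote on the Gender Shades finding quoted above: measuring that kind of disparity requires no exotic machinery. An audit runs the classifier over a benchmark of face images labeled with both ground truth and demographic group, then reports an error rate per group instead of a single overall number. The Python sketch below uses a few hypothetical records purely for illustration; it is not the actual Gender Shades code or data.

from collections import defaultdict

# Minimal sketch of a disaggregated error audit in the spirit of Gender Shades.
# Each record: (demographic group, ground-truth label, classifier's prediction).
# These records are hypothetical placeholders, not real benchmark data.
records = [
    ("darker-skinned female", "female", "male"),
    ("darker-skinned female", "female", "female"),
    ("lighter-skinned male",  "male",   "male"),
    ("lighter-skinned male",  "male",   "male"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, truth, predicted in records:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

for group, n in totals.items():
    print(f"{group}: {errors[group] / n:.0%} error rate over {n} samples")

# A single overall accuracy figure would average these groups together and
# hide exactly the gap that per-group reporting exposes.
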
PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.

Harold Levy: Publisher; The Charles Smith Blog.