Wednesday, February 7, 2018

Technology: Image manipulation: (Coming soon to your neighbourhood courthouse? HL): New York Times op-ed paints a scary picture of a "hackable political future" in which unscrupulous activists have "the power to create “video” framing real people for things they’ve never actually done." ... "One harrowing potential eventuality: Fake video and audio may become so convincing that it can’t be distinguished from real recordings, rendering audio and video evidence inadmissible in court."


QUOTE OF THE DAY: "One harrowing potential eventuality: Fake video and audio may become so convincing that it can’t be distinguished from real recordings, rendering audio and video evidence inadmissible in court."

PASSAGE OF THE DAY: "A program called Face2Face, developed at Stanford, films one person speaking, then manipulates that person’s image to resemble someone else’s. Throw in voice manipulation technology, and you can literally make anyone say anything — or at least seem to. The technology isn’t quite there; Princess Leia was a little wooden, if you looked carefully. But it’s closer than you might think."

COMMENTARY: "Our Hackable Political Future," by Henry J. Farrell and Rick Perlstein, published by The New York Times on February 4, 2018. (Henry J. Farrell is a professor of political science and international affairs at the George Washington University. Rick Perlstein is the author, most recently, of “The Invisible Bridge: The Fall of Nixon and the Rise of Reagan.”)

GIST: "Imagine it is the spring of 2019. A bottom-feeding website, perhaps tied to Russia, “surfaces” video of a sex scene starring an 18-year-old Kirsten Gillibrand. It is soon debunked as a fake, the product of a user-friendly video application that employs generative adversarial network technology to convincingly swap out one face for another. It is the summer of 2019, and the story, predictably, has stuck around — part talk-show joke, part right-wing talking point. “It’s news,” political journalists say in their own defense. “People are talking about it. How can we not?” Then it is fall. The junior senator from New York State announces her campaign for the presidency. At a diner in New Hampshire, one “low information” voter asks another: “Kirsten What’s-her-name? She’s running for president? Didn’t she have something to do with pornography?” Welcome to the shape of things to come. In 2016 Gareth Edwards, the director of the Star Wars film “Rogue One,” was able to create a scene featuring a young Princess Leia by manipulating images of Carrie Fisher as she looked in 1977. Mr. Edwards had the best hardware and software a $200 million Hollywood budget could buy. Less than two years later, images of similar quality can be created with software available for free download on Reddit. That was how a faked video supposedly of the actress Emma Watson in a shower with another woman ended up on the website Celeb Jihad. Programs like these have many legitimate applications. They can help computer-security experts probe for weaknesses in their defenses and help self-driving cars learn how to navigate unusual weather conditions. But as the novelist William Gibson once said, “The street finds its own uses for things.” So do rogue political actors. The implications for democracy are eye-opening. The conservative political activist James O’Keefe has created a cottage industry manipulating political perceptions by editing footage in misleading ways. In 2018, low-tech editing like Mr. O’Keefe’s is already an anachronism: Imagine what even less scrupulous activists could do with the power to create “video” framing real people for things they’ve never actually done. One harrowing potential eventuality: Fake video and audio may become so convincing that it can’t be distinguished from real recordings, rendering audio and video evidence inadmissible in court. A program called Face2Face, developed at Stanford, films one person speaking, then manipulates that person’s image to resemble someone else’s. Throw in voice manipulation technology, and you can literally make anyone say anything — or at least seem to. The technology isn’t quite there; Princess Leia was a little wooden, if you looked carefully. But it’s closer than you might think. And even when fake video isn’t perfect, it can convince people who want to be convinced, especially when it reinforces offensive gender or racial stereotypes. Another harrowing potential is the ability to trick the algorithms behind self-driving cars to not recognize traffic signs. Computer scientists have shown that nearly invisible changes to a stop sign can fool algorithms into thinking it says yield instead. Imagine if one of these cars contained a dissident challenging a dictator. In 2007, Barack Obama’s political opponents insisted that footage existed of Michelle Obama ranting against “whitey.” In the future, they may not have to worry about whether it actually existed. 
If someone called their bluff, they may simply be able to invent it, using data from stock photos and pre-existing footage. The next step would be one we are already familiar with: the exploitation of the algorithms used by social media sites like Twitter and Facebook to spread stories virally to those most inclined to show interest in them, even if those stories are fake. It might be impossible to stop the advance of this kind of technology. But the relevant algorithms here aren’t only the ones that run on computer hardware. They are also the ones that undergird our too easily hacked media system, where garbage acquires the perfumed scent of legitimacy with all too much ease."
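TECHNICAL SIDEBAR (HL): The "generative adversarial network technology" the authors mention pits two neural networks against each other: a generator that fabricates samples and a discriminator that tries to tell the fabrications from real data; training drives the fakes toward realism. What follows is only a minimal sketch of that adversarial training loop on toy two-dimensional data, not a face-swapping system; the network sizes, learning rates, and the stand-in "real data" distribution are all assumptions made for the example.

# Minimal GAN training loop on toy 2-D data (PyTorch).
# Illustrative sketch only: deepfake tools combine this adversarial
# objective (or related autoencoder training) with face detection,
# much larger networks, and image blending.
import torch
import torch.nn as nn

# Generator: maps 8-D random noise to fake 2-D samples.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: scores whether a 2-D sample looks real.
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):
    # Stand-in for real data: points clustered around (2, 2).
    return torch.randn(n, 2) * 0.3 + 2.0

for step in range(2000):
    # Discriminator step: push real samples toward 1, fakes toward 0.
    real = real_batch()
    fake = G(torch.randn(64, 8)).detach()  # don't backprop into G here
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator step: make the discriminator label fakes as real.
    fake = G(torch.randn(64, 8))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

After enough steps the generator's outputs become hard for the discriminator to reject; applied to images of faces rather than toy points, the same tug-of-war is what makes the swapped-in face look plausible.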
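TECHNICAL SIDEBAR (HL): The stop-sign attack described above relies on what researchers call "adversarial examples." One standard construction is the fast gradient sign method (FGSM) of Goodfellow and colleagues: nudge every pixel a tiny step in the direction that increases the classifier's loss. The sketch below uses an untrained toy classifier and a random stand-in image purely to show the mechanics; the published stop-sign attacks use trained sign classifiers and perturbations made robust to viewpoint and lighting.

# Fast gradient sign method (FGSM) sketch in PyTorch.
# Toy, untrained classifier: this only demonstrates how a nearly
# invisible, loss-increasing perturbation is computed.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in "traffic sign" classifier (say, class 0 = stop, 1 = yield).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
)
model.eval()

x = torch.rand(1, 3, 32, 32)  # stand-in image of a sign
y = torch.tensor([0])         # its true label ("stop")

# Gradient of the loss with respect to the *input* pixels.
x.requires_grad_(True)
loss = F.cross_entropy(model(x), y)
loss.backward()

# FGSM: move each pixel a small step in the loss-increasing direction.
epsilon = 0.03  # small enough to be near-invisible
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1).detach()

print("clean prediction:", model(x).argmax(dim=1).item())
print("adversarial prediction:", model(x_adv).argmax(dim=1).item())

Against a trained sign classifier, the perturbed image is typically misclassified even though a per-pixel change of 0.03 is imperceptible to a human; the physical-world stop-sign attacks harden such perturbations so they survive changes in distance, angle and lighting.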
The entire story can be found at:
https://www.nytimes.com/2018/02/04/opinion/hacking-politics-future.html

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award" - and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html. Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com. Harold Levy; Publisher; The Charles Smith Blog.