Thursday, February 21, 2019

Technology: Deepfakes: Seeing is believing? This New Yorker article by Joshua Rothman should make us very nervous about the potentially dangerous impact of artificial intelligence on the judicial process. The article is headed: "In the Age of A.I., Is Seeing Still Believing? Advances in digital imagery could deepen the fake-news crisis—or help us get out of it."


STORY: "In the Age of A.I., Is Seeing Still Believing?," by Joshua Rothman, published by The New Yorker on November 12, 2018. (Joshua Rothman is The New Yorker’s archive editor. He is also a frequent contributor to newyorker.com, where he writes about books and ideas.)

PHOTO CAPTION: "As synthetic media spreads, even real images will invite skepticism."


GIST: (It's a very lengthy article - well worth the read, word by word. Here is but a taste. HL):
"In the emerging world of “synthetic media,” the work of digital-image creation—once the domain of highly skilled programmers and Hollywood special-effects artists—could be automated by expert systems capable of producing realism on a vast scale. In a media environment saturated with fake news, such technology has disturbing implications. Last fall, an anonymous Redditor with the username Deepfakes released a software tool kit that allows anyone to make synthetic videos in which a neural network substitutes one person’s face for another’s, while keeping their expressions consistent. Along with the kit, the user posted pornographic videos, now known as “deepfakes,” that appear to feature various Hollywood actresses. (The software is complex but comprehensible: “Let’s say for example we’re perving on some innocent girl named Jessica,” one tutorial reads. “The folders you create would be: ‘jessica; jessica_faces; porn; porn_faces; model; output.’ ”) Around the same time, “Synthesizing Obama,” a paper published by a research group at the University of Washington, showed that a neural network could create believable videos in which the former President appeared to be saying words that were really spoken by someone else. In a video voiced by Jordan Peele, Obama seems to say that “President Trump is a total and complete dipshit,” and warns that “how we move forward in the age of information” will determine “whether we become some kind of fucked-up dystopia.”

Not all synthetic media is dystopian. Recent top-grossing movies (“Black Panther,” “Jurassic World”) are saturated with synthesized images that, not long ago, would have been dramatically harder to produce; audiences were delighted by “Star Wars: The Last Jedi” and “Blade Runner 2049,” which featured synthetic versions of Carrie Fisher and Sean Young, respectively.
Today’s smartphones digitally manipulate even ordinary snapshots, often using neural networks: the iPhone’s “portrait mode” simulates what a photograph would have looked like if it had been taken by a more expensive camera."
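The "portrait mode" effect Rothman mentions works, at its core, by deciding which pixels belong to the subject and softening everything else. The toy sketch below is purely illustrative (it is not Apple's algorithm, and the function name and the tiny one-row "image" are invented for this example): given pixel values and a foreground mask, it leaves the subject sharp and applies a simple box blur to the background.

```python
# Toy "portrait mode" sketch (an illustration of the idea only, not
# Apple's method): blur background pixels, keep the subject sharp.
def fake_portrait_mode(pixels, foreground):
    """pixels: list of brightness values for one row of an image.
    foreground: list of bools, True where the subject is.
    Background pixels are replaced by a 3-tap box blur over the
    original row; foreground pixels pass through untouched."""
    out = []
    for i, (p, fg) in enumerate(zip(pixels, foreground)):
        if fg:
            out.append(p)  # subject stays sharp
        else:
            window = pixels[max(0, i - 1):i + 2]  # up to 3 neighbours
            out.append(sum(window) // len(window))  # average = blur
    return out

row  = [10, 10, 200, 200, 10, 10]                      # bright subject in the middle
mask = [False, False, True, True, False, False]        # which pixels are "subject"
blurred = fake_portrait_mode(row, mask)                # edges of the background soften
```

A real implementation estimates the mask (and a full depth map) with a neural network and varies blur strength with distance, but the separation-then-blur structure is the same.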

"In 2016, the Defense Advanced Research Projects Agency (DARPA) launched a program in Media Forensics, or MediFor, focussed on the threat that synthetic media poses to national security. Matt Turek, the program’s manager, ticked off possible manipulations when we spoke: “Objects that are cut and pasted into images. The removal of objects from a scene. Faces that might be swapped. Audio that is inconsistent with the video. Images that appear to be taken at a certain time and place but weren’t.” He went on, “What I think we’ll see, in a couple of years, is the synthesis of events that didn’t happen. Multiple images and videos taken from different perspectives will be constructed in such a way that they look like they come from different cameras. It could be something nation-state driven, trying to sway political or military action. It could come from a small, low-resource group. Potentially, it could come from an individual."
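One of the manipulations Turek lists, objects "cut and pasted into images," leaves a telltale trace: identical pixel patches appearing in two places. The sketch below is a minimal, invented illustration of that idea (it is not MediFor's technique, and the function name and tiny test "image" are assumptions for this example): it hashes every small block of a grayscale image and flags any block that occurs more than once.

```python
# Toy copy-move forgery detector (an illustrative sketch, not DARPA's
# MediFor system): duplicated pixel patches are one signature of
# cut-and-paste manipulation within a single image.
from collections import defaultdict

def find_duplicate_blocks(image, block=2):
    """image: 2D list of grayscale values.
    Returns groups of (row, col) coordinates whose block x block
    patches are pixel-for-pixel identical."""
    seen = defaultdict(list)
    h, w = len(image), len(image[0])
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            patch = tuple(tuple(image[y + dy][x + dx] for dx in range(block))
                          for dy in range(block))
            seen[patch].append((y, x))  # record every location of this patch
    return [coords for coords in seen.values() if len(coords) > 1]

# A 4x6 "image" where the 2x2 patch at (0, 0) has been pasted again at (2, 4)
img = [
    [ 9,  9, 10, 11, 12, 13],
    [ 9,  9, 14, 15, 16, 17],
    [20, 21, 22, 23,  9,  9],
    [24, 25, 26, 27,  9,  9],
]
dupes = find_duplicate_blocks(img)  # flags the two copies of the pasted patch
```

Real forensic tools have to tolerate recompression, rescaling, and noise, so they compare robust features rather than exact pixels, but the search-for-self-similarity structure is the same.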

"We took a small table by the window. “What’s really interesting about these technologies is how quickly they went from ‘Whoa, this is really cool’ to ‘Holy crap, this is subverting democracy,’ ” Farid said, over a seaweed salad. “I think it’s video,” Efros said. “When it was images, nobody cared.” “Trump is part of the equation, too, right?” Farid asked. “He’s creating an atmosphere where you shouldn’t believe what you read.” “But Putin—my dear Putin!—his relationship with truth is amazing,” Efros said. “Oliver Stone did a documentary with him, and Putin showed Stone a video of Russian troops attacking ISIS in Syria. Later, it turned out to be footage of Americans in Iraq.” He grimaced, reaching for some sushi. “A lot of it is not faking data—it’s misattribution. On Russian TV, they say, ‘Look, the Ukrainians are bombing Donetsk,’ but actually it’s footage from somewhere else. The pictures are fine. It’s the label that’s wrong.” Over dinner, Farid and Efros debated the deep roots of the fake-news phenomenon. “A huge part of the solution is dealing with perverse incentives on social media,” Farid said. “The entire business model of these trillion-dollar companies is attention engineering. It’s poison.”"

-----------------------------------------------------------------



The entire story can be read at:  

https://www.newyorker.com/magazine/2018/11/12/in-the-age-of-ai-is-seeing-still-believing

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/charlessmith. Information on "The Charles Smith Blog Award"- and its nomination process - can be found at: http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html Please send any comments or information on other cases and issues of interest to the readers of this blog to: hlevy15@gmail.com.  Harold Levy: Publisher; The Charles Smith Blog;