
PASSAGE ONE OF THE DAY: "As they proliferate, police body cameras have courted controversy because of the contentious nature of the footage they capture and questions about how accessible those recordings should be. But when it comes to the devices themselves, the most crucial function they need to perform—beyond recording footage in the first place—is protecting the integrity of that footage so it can be trusted as a record of events. At the DefCon security conference in Las Vegas on Saturday, though, one researcher will present findings that many body cameras on the market today are vulnerable to remote digital attacks, including some that could result in the manipulation of footage."
---------------------------------------------------------------
PASSAGE TWO OF THE DAY: "These videos can be as powerful as something like DNA evidence, but if they’re not properly protected there’s the potential that the footage could be modified or replaced," Mitchell says. "I can connect to the cameras, log in, view media, modify media, make changes to the file structures. Those are big issues." Mitchell even realized that because he can remotely access device storage on models like the Fire Cam OnCall, an attacker could potentially plant malware on some of the cameras. Then, when the camera connects to a PC for syncing, it could deliver all sorts of malicious code: a Windows exploit that could ultimately allow an attacker to gain remote access to the police network, ransomware to spread across the network and lock everything down, a worm that infiltrates the department's evidence servers and deletes everything, or even cryptojacking software to mine cryptocurrency using police computing resources. Even a body camera with no Wi-Fi connection, like the CeeSc, can be compromised if a hacker gets physical access. "You know not to trust thumb drives, but these things have the same ability," Mitchell says. "The fact that some law enforcement evidence-collecting devices can be hacked evokes some true nightmare scenarios," says Jay Stanley, senior policy analyst at the American Civil Liberties Union. "If there aren't reliable ways of ensuring that such equipment meets strong security standards, then something is deeply broken. No police equipment should be deployed that doesn't meet such standards."
--------------------------------------------------------------------
QUOTE OF THE DAY: "The fact that some law enforcement evidence-collecting devices can be hacked evokes some true nightmare scenarios," says Jay Stanley, senior policy analyst at the American Civil Liberties Union. "If there aren't reliable ways of ensuring that such equipment meets strong security standards, then something is deeply broken. No police equipment should be deployed that doesn't meet such standards."
---------------------------------------------------------------------
PUBLISHER'S NOTE: Artificial intelligence, once the stuff of science fiction, has become all too real in our modern society - especially in the American criminal justice system. As the ACLU's Lee Rowland puts it: "Today, artificial intelligence. It's everywhere — in our homes, in our cars, our offices, and of course online. So maybe it should come as no surprise that government decisions are also being outsourced to computer code. In one Pennsylvania county, for example, child and family services uses digital tools to assess the likelihood that a child is at risk of abuse. Los Angeles contracts with the data giant Palantir to engage in predictive policing, in which algorithms identify residents who might commit future crimes. Local police departments are buying Amazon's facial recognition tool, which can automatically identify people as they go about their lives in public." The algorithm is finding its way deeper and deeper into the nation's courtrooms, taking on what used to be decisions exclusive to judges, such as bail and even the sentence to be imposed. I am pleased to see that a dialogue has begun on the effect that the increasing use of these algorithms in our criminal justice systems is having on our society and on the quality of decision-making inside courtrooms. As Lee Rowland asks about this brave new world, "What does all this mean for our civil liberties and how do we exercise oversight of an algorithm?" In view of the importance of these issues - and the increasing use of artificial intelligence by countries for surveillance of their citizens - it's time for yet another technology series on The Charles Smith Blog focusing on the impact of science on society and criminal justice. Up to now I have been identifying the appearance of these technologies. Now at last I can report on the realization that some of them may be two-edged swords - and on the growing pushback.
Harold Levy: Publisher; The Charles Smith Blog;
------------------------------------------------------------
STORY: "Police Bodycams Can Be Hacked to Doctor Footage," by Lily Hay Newman, published by "Wired" on August 11, 2018. (Lily Hay Newman is Wired's staff security writer.)
GIST: "As they proliferate, police body cameras have courted controversy because of the contentious nature of the footage they capture and questions about how accessible those recordings should be. But when it comes to the devices themselves, the most crucial function they need to perform—beyond recording footage in the first place—is protecting the integrity of that footage so it can be trusted as a record of events. At the DefCon security conference in Las Vegas on Saturday, though, one researcher will present findings that many body cameras on the market today are vulnerable to remote digital attacks, including some that could result in the manipulation of footage.

Josh Mitchell, a consultant at the security firm Nuix, analyzed five body camera models from five different companies: Vievu, Patrol Eyes, Fire Cam, Digital Ally, and CeeSc. The companies all market their devices to law enforcement groups around the US. Mitchell's presentation does not include market leader Axon—although the company did acquire Vievu in May.

In all but the Digital Ally device, the vulnerabilities would allow an attacker to download footage off a camera, edit things out or potentially make more intricate modifications, and then re-upload it, leaving no indication of the change. Or an attacker could simply delete footage they don't want law enforcement to have.

Mitchell found that all of the devices he tested had security issues that could allow an attacker to track their location or manipulate the software they run. He also found problems with the ecosystem of mobile apps, desktop software, and cloud platforms that these cameras interact with. Additionally, Mitchell says that some of the more sophisticated models, which contain radios for Bluetooth or cellular data connectivity, also have vulnerabilities that can be exploited to remotely stream live footage off the cameras, or to modify, add, and delete the footage stored on the devices. "With some of these vulnerabilities—it’s just appalling," Mitchell says. "I approached this research by trying to find industry trends that are prevalent across multiple devices. There are issues for each of the five devices I looked at that are specific to that device, but there are also trends in general across all of them. They are missing many modern mitigations and defenses."

Four of the five body cameras Mitchell
tested have a Wi-Fi radio—the CeeSc WV-8 does not—and all of those broadcast identifying information about the device. Sensitive gadgets like smartphones have started randomizing these IDs, known as MAC addresses, to mask them. But the body cameras Mitchell looked at use predictable formats that give away too much information, like make and model plus a code for each device. That means an attacker could use a long-range antenna to track cops. And as Mitchell points out, body cameras are often only activated when police carry out certain operations, or anticipate particular interactions. Noticing that 10 body cameras all activated at once, in a localized area, could foreshadow a raid, for instance. Mitchell fears that the exposure could pose a safety risk to law enforcement.

Mitchell says that all of the devices also have shortcomings in validating the code they run and the data they store. He found that none of the models he tested uses cryptographic signing to confirm the integrity of firmware updates, a common Internet of Things lapse. Without it, an attacker might develop malicious software that could be delivered to different devices in different ways based on their other vulnerabilities—through exposed desktop software or remote programming, for example. Once introduced, the devices will run any firmware without question. More specifically problematic: The bodycams don't have a cryptographic mechanism to confirm the validity of the video files they record either. As a result, when the devices sync with a cloud server or station PC, there's no way to guarantee that the footage coming off the camera is intact. "I haven’t seen a single video file that’s digitally signed," Mitchell says.

In addition to connecting to Wi-Fi networks, higher-end body cameras like the Vievu LE-5 Lite and the Patrol Eyes SC-DV10 also have the ability to generate a Wi-Fi access point of their own. That allows other devices to connect to the camera's private network, but Mitchell found that these features had inadequate or missing authentication in the models he tested, so anyone could connect to a camera from a regular consumer device and access its data. Mitchell says that the cameras all had some features that were missing key access controls, or relied on default credentials that were easy to determine. A proactive police department could update the defaults to something stronger, but even those could be undermined on certain devices. Many of the desktop platforms and mobile apps used with the cameras also had issues with access control.
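Mitchell's observation that no video file he saw is digitally signed points at what a fix could look like. The sketch below is purely illustrative, not any vendor's implementation: it uses an HMAC from Python's standard library as a self-contained stand-in for a true digital signature (a real deployment would use asymmetric keys, so only the camera can sign and the evidence server merely verifies), but it demonstrates the same tamper-evidence property, in which edited footage no longer verifies.

```python
import hashlib
import hmac
import os

def sign_footage(key: bytes, footage: bytes) -> bytes:
    """Compute a tag over the raw video bytes at record time."""
    return hmac.new(key, footage, hashlib.sha256).digest()

def verify_footage(key: bytes, footage: bytes, tag: bytes) -> bool:
    """Reject footage whose bytes no longer match the recorded tag."""
    expected = hmac.new(key, footage, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, tag)

key = os.urandom(32)              # secret held by camera and verifier
clip = b"example video bytes"     # stand-in for a recorded clip
tag = sign_footage(key, clip)

assert verify_footage(key, clip, tag)             # intact footage passes
assert not verify_footage(key, clip + b"x", tag)  # edited footage fails
```

With a scheme like this applied per file, a station PC or cloud server syncing footage could detect any modification made after recording, which is exactly the guarantee Mitchell says is missing.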
"These videos can be as powerful as something like DNA evidence, but if they’re not properly protected there’s the potential that the footage could be modified or replaced," Mitchell says. "I can connect to the cameras, log in, view media, modify media, make changes to the file structures. Those are big issues." Mitchell even realized that because he can remotely access device storage on models like the Fire Cam OnCall, an attacker could potentially plant malware on some of the cameras. Then, when the camera connects to a PC for syncing, it could deliver all sorts of malicious code: a Windows exploit that could ultimately allow an attacker to gain remote access to the police network, ransomware to spread across the network and lock everything down, a worm that infiltrates the department's evidence servers and deletes everything, or even cryptojacking software to mine cryptocurrency using police computing resources. Even a body camera with no Wi-Fi connection, like the CeeSc, can be compromised if a hacker gets physical access. "You know not to trust thumb drives, but these things have the same ability," Mitchell says.

"The fact that some law enforcement evidence-collecting devices can be hacked evokes some true nightmare scenarios," says Jay Stanley, senior policy analyst at the American Civil Liberties Union. "If there aren't reliable ways of ensuring that such equipment meets strong security standards, then something is deeply broken. No police equipment should be deployed that doesn't meet such standards."

Mitchell disclosed his findings to the five vendors and has been working with them to fix the issues. Axon says it is in the process of patching the Vievu vulnerabilities. "We are pushing a fix out to all Vievu customers early next week to resolve the issue that impacts users who have not reset their default Wi-Fi password," Axon spokesperson Steve Tuttle told WIRED. "As part of our regular release cycle, we are pushing several security updates next quarter, which include items identified by the security researcher. We have invested heavily in a dedicated information security team that works to ensure all Axon products are designed and built with security in mind." A Patrol Eyes spokesperson told WIRED that the company is aware of Mitchell's findings and is evaluating them. Fire Cam president Rob Schield says the company discontinued the OnCall device two years ago and no longer supports it. Third parties continue to sell it, though. CeeSc, which is owned by Chinese manufacturer Advanced Plus Group, did not return WIRED's requests for comment. Digital Ally also did not respond to inquiries.

Mitchell hopes that the companies fix the bugs he found, but his larger goal is to call attention to the shortcomings of a whole class of device—one that happens to play a vital role in public safety and social justice. "It's a complex ecosystem and there are a lot of devices out there with a lot of problems," Mitchell says. "These are full-feature computers walking around on your chest, and they have all of the issues that go along with that."
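Mitchell contrasts the cameras' predictable broadcast IDs with the MAC-address randomization that smartphones have adopted. As a rough illustration of that mitigation (not any vendor's code), a randomized MAC is simply a fresh 48-bit value with the locally-administered bit set and the multicast bit cleared, so broadcasts carry no stable make-and-model prefix to track:

```python
import secrets

def random_mac() -> str:
    """Generate a randomized, locally administered unicast MAC address."""
    octets = bytearray(secrets.token_bytes(6))
    # First octet: clear bit 0 (unicast) and set bit 1 (locally administered),
    # so the address never matches a real vendor's assigned prefix.
    octets[0] = (octets[0] & 0b11111110) | 0b00000010
    return ":".join(f"{b:02x}" for b in octets)

print(random_mac())
```

Rotating such an address per session is what stops the long-range tracking Mitchell describes, since an observer sees a different, meaningless identifier each time rather than a persistent per-device code.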
The entire story can be read at:

PUBLISHER'S NOTE: I am monitoring this case/issue. Keep your eye on the Charles Smith Blog for reports on developments. The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at: http://www.thestar.com/topic/
Harold Levy: Publisher; The Charles Smith Blog;
---------------------------------------------------------------------
