Thursday, October 13, 2011

AMANDA KNOX: ARS TECHNICA SCIENCE EDITOR JOHN TIMMER EXPLAINS HOW WEAK DNA EVIDENCE RAILROADED - AND THEN RESCUED AMANDA KNOX;


"But Kobilinsky said that DNA only tells part of the story. "We don't know when the DNA got deposited on the substrate," he said, "and we don't know how it got deposited, either through direct or indirect contact." In other words, interpretation and context matter. The lack of a larger picture proved especially problematic in the Knox case, where it wasn't even clear whether the knife from which DNA was obtained served as the murder weapon.

None of this is to say that a well-handled, high-signal piece of DNA evidence can't be decisive. But in the end, Kobilinsky said, that evidence works best when it's part of a larger picture and not the sole factor linking a suspect to a crime."

JOHN TIMMER: SCIENCE EDITOR AND OBSERVATORY MONITOR; ARS TECHNICA;

Ars Technica informs us that: "John got a Bachelor of Arts in Biochemistry (yes, that's possible) from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. He's done over a decade's worth of research in genetics and developmental biology at places like Cornell Medical College and the Memorial Sloan-Kettering Cancer Center. In addition to being Ars' science content wrangler, John still teaches at Cornell and does freelance writing, editing, and programming, often with a scientific focus..."

WIKIPEDIA informs us that Ars Technica is: "a technology news and information website created by Ken Fisher and Jon Stokes in 1998. It publishes news, reviews and guides on issues such as computer hardware and software, science, technology policy, and video games. Ars Technica is known for its features, long articles that go into specific detail on their subjects. Many of the site's writers are postgraduates, and some work for research institutions. Articles on the website are often written in an opinionated tone, as opposed to a journal."

------------------------------------------------------------

"If you watch crime dramas, you'll be forgiven for the impression that DNA evidence makes an airtight case," the ArsTechnica article by John Timmer published on Ocotber 12, 2011 under the heading, "How weak DNA evidence railroaded - and then rescued Amanda Knox," begins.

"And if you do have that impression, you might be confused about the internationally famous case of American Amanda Knox, convicted of murdering her British roommate in Perugia, Italy in 2007," the article continues.

"After all, the prosecution's case was based on DNA evidence; Knox's genetic fingerprints were found by Italian police on the handle of a kitchen knife, which also had the victim's DNA on the blade.

But not all DNA evidence is created equal—and Knox walked free last week from an Italian jail after scientists savaged the forensic evidence against her as being wholly unreliable. How did DNA analysis go so wrong?

To understand the problems with the Knox case, we drew on the extensive real-world genetics experience of the Ars science staff and spoke with Dr. Lawrence Kobilinsky of the John Jay College of Criminal Justice in New York. Kobilinsky has seen the DNA test results from the Knox case and helped walk us through the reasons that DNA evidence isn't always as airtight as it sometimes looks on TV.

DNA analysis amplifies a tiny bit of DNA into millions of copies, but this amplification process can lead to problems if it's not carefully managed. The results of this process don't speak for themselves—interpretation is always required—and the interpretation of DNA analysis became a decisive problem for Amanda Knox. In the end, terrible crime scene management and an unjustified certainty about DNA evidence on the supposed murder weapon led to a murder conviction that collapsed on appeal.

The Knox case

Amanda Knox was a 20-year-old American citizen living in Perugia, Italy, sharing an apartment with several other women. One of them, Briton Meredith Kercher, was murdered on November 1, 2007, her body discovered nude inside her locked bedroom, with a fatal knife wound to the neck. Knox claimed to have spent the night with her boyfriend in a different building and only returned in time to help discover Kercher's body.

Although Perugia resident Rudy Guede was charged with rape and murder, Knox and her boyfriend, Raffaele Sollecito, were eventually charged in the case as well. A witness claimed that the pair had been near the apartment the night of the murder, and some DNA evidence (on a knife belonging to Sollecito and on Kercher's bra) allegedly linked them to the crime. Amidst a swarm of media attention, Knox and her boyfriend were eventually convicted of murder.

Then came the appeal. The witness who had allegedly seen the duo turned out to be a heroin addict who gave inconsistent accounts. That shifted the focus away from witness testimony and onto the DNA evidence, which was finally evaluated by two experts from the Universita di Roma.

The experts were not kind to the evidence. The bra clasp, it turned out, had sat on the floor for more than six weeks after the murder before being secured and processed; photographs show that it had been moved between the murder and its eventual collection. The clasp was the only DNA evidence placing Sollecito at the scene of the crime; no DNA put Knox on the scene at all.

The supposed murder weapon, a long kitchen knife, was found in the home of Sollecito, in his kitchen knife drawer. The knife held little DNA and, according to the experts, the local authorities had not handled the testing properly to compensate for the small amount.

In short, there were problems with all the DNA evidence used in the trial. Without a witness or reliable DNA evidence, Knox's conviction was overturned on October 3, and she was freed, returning immediately to the US.

Obtaining DNA evidence

To understand what went wrong with the DNA evidence here, we need to look at the techniques that help generate that evidence. (The discussion gets a bit technical, but it's important to understand the reasons why this evidence has been rejected.)

The modern use of forensic DNA relies on a technique called the polymerase chain reaction (PCR), which won inventor Kary Mullis half of the 1993 Nobel Prize for chemistry. PCR repeatedly amplifies specific pieces of DNA. Scientists begin by designing two short pieces of DNA called "primers" that flank a particular genetic sequence of interest. These primers then enable a protein called a polymerase to copy the intervening DNA sequence, creating two identical copies from a single source. A cycle of temperature changes can reset the system, and each cycle doubles the number of identical molecules present. The result: rapid, exponential copying of a single DNA molecule. (To learn more, read our previous in-depth account of PCR.)

The PCR cycle allows primers to trigger the amplification of the DNA sequence that they flank.

This exponential growth theoretically allows a single molecule of DNA to be amplified into an entire population of identical molecules, making it trivial to detect. In practice, Kobilinsky said that PCR has allowed definitive identification of the source of DNA samples from less than 100 picograms of DNA (a picogram is 10⁻¹² of a gram). (That's the weight of about 100 bacteria.)
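
To put that sensitivity in perspective, here is a rough back-of-the-envelope calculation (a minimal Python sketch, not from the article; the ~3.3 picogram mass of a haploid human genome is a commonly cited approximation):

```python
# Rough estimate of how much template DNA a ~100 picogram forensic sample contains.
# Assumption (not from the article): one haploid human genome weighs ~3.3 pg,
# so a diploid cell carries roughly 6.6 pg of nuclear DNA.

HAPLOID_GENOME_PG = 3.3   # approximate mass of one haploid human genome, in picograms
sample_pg = 100.0         # the detection threshold Kobilinsky cites

genome_copies = sample_pg / HAPLOID_GENOME_PG           # copies of each genetic locus
cell_equivalents = sample_pg / (2 * HAPLOID_GENOME_PG)  # diploid cells' worth of DNA

print(f"~{genome_copies:.0f} copies of each locus")   # ~30
print(f"~{cell_equivalents:.0f} cell equivalents")    # ~15
```

In other words, a sample at that limit holds the DNA of only about fifteen cells, which is why both the amplification and the handling have to be nearly flawless.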

This extreme sensitivity, however, creates its own problems. "You have to be extra careful not to contaminate the sample or equipment," Kobilinsky said, since just a tiny bit of contaminating DNA is enough to generate a false positive from a sample that otherwise lacks the relevant DNA sequence. That was a danger here: the DNA from the bra clasp, ultimately used to place Sollecito (and, by extension, Knox) at the scene, sat around for weeks in an apartment that Knox had occupied and Sollecito visited.

PCR also has a propensity to generate artifacts. Although the primers are highly specific to a given DNA sequence, there's a large population of primers in every reaction. This raises the likelihood of a rare event like the amplification of a mismatched DNA sequence. If something odd happens early enough in the amplification process, it's even possible for an artifact to become the primary product of a PCR reaction, causing confusing results.

A typical thermocycler, which automates the heating and cooling of samples (samples held at top).

The more times you cycle a reaction, the more likely you are to amplify something spurious. Kobilinsky laid out strict rules for how many cycles are performed in a forensic PCR reaction: 28 cycles under standard conditions, and 31 cycles for "high sensitivity" tests, used when the available quantities of DNA are very small.
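
To see why the cycle count matters, consider the idealized arithmetic (a sketch assuming perfect doubling every cycle, which real reactions never achieve):

```python
# Idealized PCR yield: each cycle at most doubles the number of target molecules,
# so the theoretical amplification after n cycles is 2**n.

def max_fold_amplification(cycles: int) -> int:
    return 2 ** cycles

standard = max_fold_amplification(28)          # ~2.7e8-fold
high_sensitivity = max_fold_amplification(31)  # ~2.1e9-fold

print(f"28 cycles: up to {standard:.1e}-fold")
print(f"31 cycles: up to {high_sensitivity:.1e}-fold")
print(f"3 extra cycles = {high_sensitivity // standard}x more product, "
      "and proportionally more opportunity to amplify something spurious")
```

Three extra cycles buy roughly eight times more product, which is why the "high sensitivity" protocol is reserved for samples too small to analyze any other way.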

There are ways to control for many of these problems—doing reactions without any of the DNA sample in order to test for contamination, using known positive samples, etc. All of these increase the reliability of the evidence by identifying the testing that can't be trusted. But these controls emphasize the point: DNA evidence alone isn't as decisive as it's often perceived to be. And other problems came into play when the knife was tested.

Detecting and interpreting DNA

PCR allows us to take tiny samples of DNA and amplify specific sequences until there's enough material to work with. But how do we associate those with specific individuals? By matching as many small sequences as possible.

Many areas in the human genome (as well as in other organisms) contain a set of short repeated sequences. For example, the sequence called D8S1179 simply repeats the DNA bases TCTA. What makes this repeated sequence useful for identification is that the number of repeats varies by individuals, ranging from a low of seven to a high of 20. (In other words, the sequence can be as short as 28 base pairs or as long as 80 base pairs.)

We can design primers that flank things like the D8S1179 sequence. When the PCR reaction runs, it is likely to produce two different products, since a person's two chromosome sets (one from mom, one from dad) can each carry a different number of repeats. For the same reason, one person's DNA analysis is unlikely to match another's. The probability of a chance match (that is, a mistake) over any single sequence is too high for confident identification—say, one in 250—but as you add more and more of these sequences, the probability of a chance match grows remote.
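
A small sketch of that arithmetic (the one-in-250 figure is the ballpark number used above, not a published allele frequency; real match statistics are computed from population databases):

```python
# Two illustrations: the length of the D8S1179 repeat region for a given
# number of repeats, and how a per-marker chance-match probability shrinks
# as independent markers are combined.

REPEAT_UNIT = "TCTA"   # the repeated sequence at D8S1179, as described above

def repeat_region_length(n_repeats: int) -> int:
    return n_repeats * len(REPEAT_UNIT)

print(repeat_region_length(7), repeat_region_length(20))  # 28 and 80 base pairs

per_marker_chance_match = 1 / 250   # ballpark figure from the text
for n_markers in (1, 5, 13):        # 13 is the size of the traditional US core panel
    combined = per_marker_chance_match ** n_markers
    print(f"{n_markers:>2} markers: about 1 in {1 / combined:,.0f}")
```

With five markers, the odds of a coincidental match are already around one in a trillion under these simplified assumptions.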

There are some caveats here—rare variants in some ethnic groups may be quite common in others, for instance. But with enough of these markers, it's possible to make definitive identifications using DNA.

The various PCR marker segments are therefore essential to an identification. Fortunately, there's a relatively simple way of separating the sequences: we tag them. Each one of the primer molecules comes tagged with a fluorescent chemical. Five distinct colors are commonly available, allowing a single reaction to contain five sets of primers that each amplify a distinct sequence. Even a tiny DNA sample can be used to test for five different genetic markers.

Separating the amplified segments by size is also relatively easy. In solution, DNA has a negative charge and will move towards a positive electrode. Putting a gel between the DNA and that electrode will slow the DNA down, with larger molecules slowed more than smaller ones. Do this with a long enough gel, and each distinct population of repeat sequence will produce a distinct band or peak within the gel. At that point, all that's left is to read the bands and see if they match up to another sample.
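
To make the separation step concrete, here is a toy model (illustrative only and not from the article; real migration follows a roughly log-linear relationship with fragment size):

```python
# Toy model of gel electrophoresis: smaller DNA fragments migrate farther
# through the gel, so fragments of different lengths resolve into separate bands.
import math

fragment_lengths_bp = [28, 48, 80]   # e.g. repeat regions of 7, 12, and 20 repeats

def relative_migration(length_bp: int) -> float:
    # Larger fragments are retarded more by the gel matrix (simplified model).
    return 1.0 / math.log10(length_bp)

for length in fragment_lengths_bp:
    print(f"{length:>3} bp -> relative migration {relative_migration(length):.2f}")
```

The exact numbers are meaningless; the point is simply that distinct lengths end up at distinct positions, ready to be read out.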

Reading a gel

Running the gel and reading the fluorescent intensity of the DNA molecules is done by automated systems supplied by commercial vendors. Each machine goes through a standardized validation process that helps the people running it understand how well it distinguishes signal from noise. Noise can result from a variety of things: leftover fluorescent molecules, stray photons in the light sensor, etc. It's possible to assign a value, called a Relative Fluorescence Unit (RFU), to each point on a gel. The RFU represents the difference between the actual signal on a given part of the gel and the typical background signal. "It's the height of a peak [of signal]," Kobilinsky said.

An electropherogram of PCR results, showing the two different repeat lengths present in most human samples. Note that the peaks vary in intensity, and that small peaks that aren't the product of actual genotypes are also present. These are a mixture of contamination and background noise.

The validation process helps identify how many RFUs are needed before a signal is considered sufficiently distinct from the background to represent PCR-amplified DNA rather than noise. For the current generation of machines, that's around 50 RFUs; older hardware was typically above 75 RFUs, and the FBI, which Kobilinsky called "very conservative," required values over 120 on some of the older machines.
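
As a rough illustration, the thresholds Kobilinsky describes could be captured in a small lookup like this (a sketch; the labels and function name are ours, and real labs set their cutoffs during instrument validation):

```python
# Minimum peak heights, in RFUs, below which a peak is treated as noise,
# using the figures quoted above. The dictionary keys are illustrative labels.
RFU_THRESHOLDS = {
    "current_generation": 50,
    "older_hardware": 75,
    "fbi_older_machines": 120,   # the "very conservative" FBI figure
}

def peak_is_reportable(peak_rfu: float, instrument: str = "current_generation") -> bool:
    """Return True if a peak clears the analytical threshold for that instrument."""
    return peak_rfu >= RFU_THRESHOLDS[instrument]

print(peak_is_reportable(62))                        # True on a current machine
print(peak_is_reportable(62, "fbi_older_machines"))  # False under the stricter cutoff
```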

It's important to note that these standards are the consensus view of the forensics community, but it's still possible to get a nice, clean-looking peak that stands out from the background noise without reaching 50 RFUs. Typically, that would represent a real amplification of DNA that just didn't work well enough; if you did it again, chances are good you'd have a positive signal. The chances of an error—some combination of unusually high background or a spurious amplification—are considered too high, however, for such sub-50 RFU results to be admitted as evidence in the courtroom.

In a US courtroom, that is.

DNA in the real world

And it was precisely these kinds of uncertainties that the expert report, prepared for Knox's appeal, focused on. In the absence of a reliable witness placing her at the crime scene, and with no obvious motive, only the DNA evidence linked Knox to the crime. According to the expert report, the samples used had either a high risk of contamination (the bra clasp) or very low signal (the knife). For the knife samples, the peaks reached RFU levels as low as 15 and 21, with the stronger readings only hitting 41.
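
Restating the 50 RFU cutoff from above makes the problem plain (a minimal sketch using the peak heights reported in the expert review):

```python
# Peak heights reported for the knife sample, checked against the
# 50 RFU reporting threshold discussed earlier.
REPORTING_THRESHOLD_RFU = 50

knife_peaks_rfu = [15, 21, 41]

for peak in knife_peaks_rfu:
    verdict = "reportable" if peak >= REPORTING_THRESHOLD_RFU else "below threshold"
    print(f"{peak:>3} RFU: {verdict}")   # none of the three reaches 50 RFUs
```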

Kobilinsky had the chance to see the results of the DNA testing, and he agreed that, while there were peaks present, they fell well short of the 50 RFUs that serve as the standard of evidence in the US court system. "In this country, they wouldn't call them real genes," Kobilinsky said.

(Note that he's using a fairly broad definition of "gene." The repeat sequences here are inherited just like any regular gene, but they don't typically encode a protein or functional RNA.)

These results might have represented real signals, but the only way to tell would be to repeat the PCR reaction. The DNA obtained from the knife, however, was present in such small amounts that all of it went into the initial reactions; nothing was left to retest. It wasn't standard practice to perform "high sensitivity" testing in Italy, either.

In the US, the issues with DNA testing described above are now generally understood by prosecutors and defense lawyers alike. Any problems with contamination or poorly-controlled work would be called out in the courtroom by any well-prepared attorney. Still, US juries do suffer a bit from what Kobilinsky called the "CSI effect"—they expect most cases to have some form of scientifically validated evidence, and they give deference to DNA evidence.

But Kobilinsky said that DNA only tells part of the story. "We don't know when the DNA got deposited on the substrate," he said, "and we don't know how it got deposited, either through direct or indirect contact." In other words, interpretation and context matter. The lack of a larger picture proved especially problematic in the Knox case, where it wasn't even clear whether the knife from which DNA was obtained served as the murder weapon.

None of this is to say that a well-handled, high-signal piece of DNA evidence can't be decisive. But in the end, Kobilinsky said, that evidence works best when it's part of a larger picture and not the sole factor linking a suspect to a crime.

"It's an important piece of evidence," he said, "but a verdict should be based on the sum of evidence.""

The article can be found at:

http://arstechnica.com/science/news/2011/10/how-weak-dna-evidence-railroadedand-then-rescued-amanda-knox.ars?utm_source=rss&utm_medium=rss&utm_campaign=rss

PUBLISHER'S NOTE: The Toronto Star, my previous employer for more than twenty incredible years, has put considerable effort into exposing the harm caused by Dr. Charles Smith and his protectors - and into pushing for reform of Ontario's forensic pediatric pathology system. The Star has a "topic" section which focuses on recent stories related to Dr. Charles Smith. It can be found at:

http://www.thestar.com/topic/charlessmith

Information on "The Charles Smith Blog Award"- and its nomination process - can be found at:

http://smithforensic.blogspot.com/2011/05/charles-smith-blog-award-nominations.html

Harold Levy: Publisher; The Charles Smith Blog; hlevy15@gmail.com;