The Claremont Killer

I found an old blog post of mine which mentioned both the Leiterman case and Profile N in New Zealand.

I'll comment on this part:

The main issue in this case is a disagreement among experts over the correct statistical calculation to use in cold hit cases. Another key element is that the case relied on a partial, not full, DNA profile. In the United States a full DNA profile has thirteen separate markers (loci), and a profile must have at least seven markers to be searched against California's database. Only five and a half markers were clearly recovered in the Sylvester murder; this is why the odds of a random person matching the DNA found at the crime scene were only 1.1 million to one, and not substantially higher. That is the figure the jury was told. Yet when the odds were calculated under a different set of statistical assumptions, using a model that takes into account that 338,000 profiles in the database were searched, the odds of a match were only one in three, a statistic the jury was barred from hearing. The question of which model is better is a difficult one, yet it is odd that California courts have taken it upon themselves to decide which model is more appropriate. And the difference between one in 1.1 million and one in three is huge; it would probably have been enough to move at least one juror from finding guilt beyond a reasonable doubt to voting not guilty.

The way you calculate the odds of a random person matching is by looking at the frequency of each allele in a given population. Each allele at a locus is reported as a simple number, ranging from something like 4 all the way up to 17 and sometimes more: it is the count of short, four-nucleotide repeats in non-coding DNA at a specific place in the genome. You count the number of repeats the person carries on each allele (everyone has two alleles per locus, one from each parent).

Anyway, if you have five loci to work with, you'll have something like:
D3S1358 14, 17
vWA 14, 14 (homozygote)
D16S539 10, 11
D2S1338 19, 25
D8S1179 14, 15

This is a realistic example, based on a kit that was in use in 2002 (see page 94 of the manual):
https://assets.thermofisher.com/TFS-Assets/LSG/manuals/cms_041049.pdf

You then look up the frequency of each allele, either in population-specific (racial) tables or for the population as a whole, multiply the genotype frequencies across loci, and arrive at a figure such as 1 in 1.1 million people having a profile like that. That is the approximate random match probability for the made-up sample above.
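
To make the arithmetic concrete, here is a minimal sketch of the product rule. The allele frequencies are invented and tuned so the result lands near the 1-in-1.1-million figure above; real casework uses published population tables and subpopulation corrections:

[code]
# Product-rule RMP for the made-up five-locus profile above.
# Frequencies are invented for illustration only.
freqs = {
    "D3S1358": (0.15, 0.20),  # alleles 14, 17
    "vWA":     (0.25, 0.25),  # allele 14 twice (homozygote)
    "D16S539": (0.10, 0.30),  # alleles 10, 11
    "D2S1338": (0.20, 0.16),  # alleles 19, 25
    "D8S1179": (0.14, 0.22),  # alleles 14, 15
}

rmp = 1.0
for locus, (p, q) in freqs.items():
    if p == q:
        rmp *= p * p      # homozygote: genotype frequency p^2
    else:
        rmp *= 2 * p * q  # heterozygote: genotype frequency 2pq

print(f"RMP ~ 1 in {1 / rmp:,.0f}")  # ~1 in 1.1 million
[/code]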

So the 1:3 is not the match probability; it comes from the database being 338,000 people strong. It's just the odds of finding at least one person with that DNA profile somewhere in the database. The DNA could still have been deposited by only one person in an entire city's worth of people. If you had a database of 33 million people you'd expect about 30 possible donors, and the police could check those people one by one to exclude them as suspects. Most would have rock-solid alibis (innocent people usually do) and the rest could be excluded by other basic measures, such as not matching the description or whatever other evidence is used to convict. DNA evidence alone is useless.
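
A quick sketch of where both database figures come from (a minimal calculation, assuming independent profiles):

[code]
rmp = 1 / 1.1e6  # random match probability from the post above

# Expected number of coincidental hits when the whole database is
# searched (this is where the "one in three" comes from).
print(338_000 * rmp)             # ~0.31, roughly 1 in 3

# Nearly the same thing, expressed as a probability of at least one hit:
print(1 - (1 - rmp) ** 338_000)  # ~0.26

# In a 33-million-profile database you'd expect ~30 coincidental hits:
print(33_000_000 * rmp)          # ~30
[/code]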

McHrozni
 
random match probability versus source probability

My lone paragraph about the Puckett case could not do justice to the interest that academics have shown in it. I will try to return to it over the next few weeks, because I think that some of the issues in the Puckett case illuminate the present one. For example, I was not questioning the calculation of the random match probability (RMP) in the former; instead, I was implying that giving only one statistic to the jury, particularly without context, might have been misleading.

Recently I found an article that dealt with the Puckett case and also addressed an issue that applies to a number of cases. Among other things, it distinguishes the RMP from the source probability. The article is "Safety in Numbers? Deciding When DNA Alone Is Enough to Convict," by Andrea Roth link. Here is a paragraph that may whet one's appetite for the whole article:

"In sum, there appear to be three critical shortcomings in how courts-both in the United States and in the United Kingdom-have dealt with the issue of legal sufficiency in pure cold hit cases. First, while courts appear to understand that the sufficiency of a DNA pro- file match is a function of how high the source probability is, courts do not understand how to calculate the source probability, typically con- flating it with the RMP and failing to consider the size of the suspect population. Second, although courts appear to have found some match statistics sufficient and others insufficient, no court has explained how it made this determination-that is, whether there is some numerical threshold that the source probability must meet to render the DNA match alone legally sufficient evidence of guilt. Finally, neither courts nor litigants appear to have explored the more fundamental question of whether numerical evidence alone, even a very high source probability, should ever be deemed sufficient evidence of guilt by itself."
 
My lone paragraph about the Puckett case could not do justice to the interest that academics have shown in it. I will try to return to it over the next few weeks, because I think that some of the issues in the Puckett case illuminate the present one. For example, I was not questioning the calculation of the random match probability (RMP) in the former; instead, I was implying that giving only one statistic to the jury, particularly without context, might have been misleading.

Recently I found an article that dealt with the Puckett case and also addressed an issue that applies to a number of cases. Among other things, it distinguishes the RMP from the source probability. The article is "Safety in Numbers? Deciding When DNA Alone Is Enough to Convict," by Andrea Roth link. Here is a paragraph that may whet one's appetite for the whole article:

"In sum, there appear to be three critical shortcomings in how courts-both in the United States and in the United Kingdom-have dealt with the issue of legal sufficiency in pure cold hit cases. First, while courts appear to understand that the sufficiency of a DNA pro- file match is a function of how high the source probability is, courts do not understand how to calculate the source probability, typically con- flating it with the RMP and failing to consider the size of the suspect population. Second, although courts appear to have found some match statistics sufficient and others insufficient, no court has explained how it made this determination-that is, whether there is some numerical threshold that the source probability must meet to render the DNA match alone legally sufficient evidence of guilt. Finally, neither courts nor litigants appear to have explored the more fundamental question of whether numerical evidence alone, even a very high source probability, should ever be deemed sufficient evidence of guilt by itself."

Ooh, good one. I can answer why no universal standard for a DNA match was ever made: it's impossible to make one.

Suppose you have 11 people stranded on an island, 10 males and one female. Just before the rescue, the female is raped but can't identify which of the 10 males it was. A rape kit is done and you get a partial DNA profile, with only one locus.

The male profile is a very common variant, shared by over 30% of the population, so the match is a measly 1 in 3. Yet, by coincidence, none of the other nine males (nor the female) has that variant at the locus, so no one else on the island could be the donor of the DNA in her vagina.

The DNA sample in this case is good enough to be the key piece of evidence (alongside corroborating pieces) to secure a conviction. Yet the match is a mere 1:3, which would be basically useless in a less extreme scenario.

Most cases are like that. Yes, the match is a mere 1 in 100 million, but there were only about 500 thousand people who could possibly have been anywhere near the crime scene; the rest of the world's population was too far away to drive there and back unnoticed. With numbers like those, the odds are that not one of the other possible suspects shares the profile.
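
As a back-of-the-envelope sketch of that reasoning, using the figures from this post and a flat prior over the suspect population (both simplifying assumptions, not a real casework calculation):

[code]
rmp = 1 / 100e6        # 1 in 100 million match probability
suspect_pop = 500_000  # everyone who could plausibly have been near the scene

# Expected number of coincidental matches among the other possible suspects.
expected_others = suspect_pop * rmp   # 0.005

# Rough source probability: the true donor always matches, competing with
# on average expected_others coincidental matches.
source_prob = 1 / (1 + expected_others)
print(f"{source_prob:.4f}")  # ~0.995: almost certainly the donor
[/code]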

You can't get a universal, transparent standard for when the match is "good enough" that would be better than an expert opinion on each individual case. If there are many possible donors you need a far stronger match, especially if they're related to one another or even inbred. Roma (Gypsy) populations can be a major problem for DNA profiling for precisely that reason.

McHrozni
 
My lone paragraph about the Puckett case could not do justice to the interest that academics have shown in it. I will try to return to it over the next few weeks, because I think that some of the issues in the Puckett case illuminate the present one. For example, I was not questioning the calculation of the random match probability (RMP) in the former; instead, I was implying that giving only one statistic to the jury, particularly without context, might have been misleading.

Recently I found an article that dealt with the Puckett case and also addressed an issue that applies to a number of cases. Among other things, it distinguishes the RMP from the source probability. The article is "Safety in Numbers? Deciding When DNA Alone Is Enough to Convict," by Andrea Roth link. Here is a paragraph that may whet one's appetite for the whole article:

"In sum, there appear to be three critical shortcomings in how courts-both in the United States and in the United Kingdom-have dealt with the issue of legal sufficiency in pure cold hit cases. First, while courts appear to understand that the sufficiency of a DNA pro- file match is a function of how high the source probability is, courts do not understand how to calculate the source probability, typically con- flating it with the RMP and failing to consider the size of the suspect population. Second, although courts appear to have found some match statistics sufficient and others insufficient, no court has explained how it made this determination-that is, whether there is some numerical threshold that the source probability must meet to render the DNA match alone legally sufficient evidence of guilt. Finally, neither courts nor litigants appear to have explored the more fundamental question of whether numerical evidence alone, even a very high source probability, should ever be deemed sufficient evidence of guilt by itself."


If I understand this correctly, the conflation of RMP with source probability, described in the article as due to the 'fallacy of the transposed conditional', is the same as 'confusion of the inverse', where the probability of a match, given innocence, is conflated with the probability of innocence, given a match. I find this interesting, as it seems to be a ubiquitous error (the same as conflating the probability of getting an outcome when the null is true with the probability that the null is true, which seems to explain so much confusion over p values). I'm not sure why the RMP would be given to a jury at all, given that it is likely to be misleading. I can see that a likelihood ratio alone is somewhat imprecise, since the 'size of the suspect population' is not the actual number of plausible suspects in reality, but the LR is easier to understand and less likely to bias perception than a tiny RMP.
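
A toy numeric illustration of that inversion, with invented numbers and a uniform prior over the suspect pool (both assumptions for the sake of the example):

[code]
# P(match | innocent) is not P(innocent | match).
rmp = 1e-6             # P(match | innocent person)
innocents = 1_000_000  # plausible suspects other than the true source

expected_innocent_matches = innocents * rmp  # = 1.0
# The true source always matches; on average one innocent also matches.
p_innocent_given_match = expected_innocent_matches / (1 + expected_innocent_matches)
print(p_innocent_given_match)  # 0.5, despite a one-in-a-million RMP
[/code]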

I am personally very interested in this issue of how probability is understood by juries. I have seen some interesting research on the effect of different ways of presenting probabilistic evidence. It's probably getting away from the Claremont case, since there is no jury and there is other evidence. In cases where there is nothing but the source probability of DNA evidence, it seems the main difficulty is having no reliable way to estimate the possible impact of other unknown factors on the probability of guilt. In the Claremont case, the difficulty would be how much weight to give the other factors.
 
I'll comment on this part:



So the 1:3 is not the match probability; it comes from the database being 338,000 people strong. It's just the odds of finding at least one person with that DNA profile somewhere in the database. The DNA could still have been deposited by only one person in an entire city's worth of people. If you had a database of 33 million people you'd expect about 30 possible donors, and the police could check those people one by one to exclude them as suspects. Most would have rock-solid alibis (innocent people usually do) and the rest could be excluded by other basic measures, such as not matching the description or whatever other evidence is used to convict. DNA evidence alone is useless.

McHrozni

I'm assuming the 1:3 was referring to the distinction between the RMP and a database match (where the sample is compared against every entry in the database, so any chance match is bound to be detected). Although obviously that still does not give the probability of guilt, which also depends on other factors, such as the likelihood that the culprit is or isn't in the database at all.

You have mentioned DNA evidence alone being useless in the context of excluding suspects. I think the question is what happens when somebody has no alibi and can't be excluded, but the DNA is the only evidence against them (or other evidence is weak and only discovered as a result of the DNA match). It may not happen often but has happened in some of the cases discussed.
 
Some background information on negative controls

I have collected a few quotes to explain what negative controls are, although more could be said on this subject. All three articles are worth reading in their entirety.

“Identifying and Preventing DNA Contamination in a DNA-Typing Laboratory,” by Terri Sundquist and Joseph Bessetti (Promega Corporation) link
“Appropriate control reactions are helpful in determining whether DNA contamination has occurred. A “reagent blank” control consists of all reagents used during sample processing but contains no sample. This control is used to detect DNA contamination of the analytical reagents used to prepare the sample for analysis. In a separate negative control reaction, water is used instead of extracted sample or reagent blank. This negative control reaction is often referred to as the “no-template” control and allows identification of contamination in the amplification reagents themselves.”

“DNA Testing: An Introduction for Non-Scientists,” by Dr. Donald E. Riley link
“Good PCR technique is no guarantee that contamination didn't influence the results. Steps must be taken to try and detect contamination. Negative controls are blank PCRs that have all the components of the evidentiary PCRs but have no other DNA added intentionally. Fortunately, there are often two negative controls used, one when the DNA is extracted, and another when the PCR is set up. Any PCR signal in the negative control would warn that contamination has occurred. Unfortunately, the negative controls are virtually the only warning of PCR contamination. Negative controls may alert the analyst to general contamination occurring within the lab or the lab reagents. These controls don't offer protection against contamination occurring before the samples arrived at the PCR lab. Negative controls also can't rule out contamination of individual samples. The individual samples lack individual signs of contamination if it occurs. Unlike a human patient, a PCR is incapable of showing signs of infection (contamination) such as fever or undue pain. PCRs also have no immune system to ward off contaminants.”

"Tarnish on the Gold Standard,” by Professor William Thompson (University of California at Irvine) link to a number of articles
“While most of the problems are due to inadvertent mistakes, a number of cases involving dishonesty have also come to light…In all of these cases, the analysts were caught faking the results of control samples designed to detect instances in which cross-contamination of DNA samples has occurred.”

“In most instances, these errors produced unexpected results that flagged the problem, such as positive results in a control sample that was supposed to contain no DNA or a second DNA profile in a sample that was supposed to be from a single person. Upon noticing such problems, labs typically throw out the results of that test and start over.”

“Given the unexpectedly high frequency of contamination in DNA testing we have just discussed, it is interesting, and not at all surprising, that the major form of fakery discovered to date involves control samples known as extraction blanks that are designed to detect contamination. These samples are supposed to contain no DNA. When they produce positive results, it indicates there was a problem — DNA somehow ended up in a sample where it did not belong. If that happened to a control sample, it could also have happened to other samples, so the analyst must throw out the whole test and start over.”
 
You have mentioned DNA evidence alone being useless in the context of excluding suspects.

Well, no. DNA evidence alone is useless for conviction; you need to at least show the person didn't have an alibi and is a possible culprit. It can, however, be used to exclude a suspect. In a trivial example, you'd be dealing with a rape case with semen available (a very good source of donor DNA, for obvious reasons), but the suspect you're investigating doesn't have the same DNA profile as the semen. The victim survived and identified a possible culprit from a photo line-up; that's the only reason you investigated him as a suspect.

Any sensible investigator would conclude this man is not the culprit and that you're dealing with a case of mistaken identity; the culprit probably looks like him. If you want to be extra careful you'll ask for a semen sample to exclude chimerism (where one person is actually an assimilated, absorbed twin), but that's extraordinarily rare, with about six known cases in 70 years.

I think the question is what happens when somebody has no alibi and can't be excluded, but the DNA is the only evidence against them (or other evidence is weak and only discovered as a result of the DNA match). It may not happen often but has happened in some of the cases discussed.

Wait, evidence discovered as a result of a DNA match can be used against the suspect without issue, provided the match was obtained lawfully (that's ... not always the case).

DNA alone can't be used to convict. DNA plus a lack of alibi plus weak other evidence, even if the DNA match led to it, can.

McHrozni
 
I have collected a few quotes to explain what negative controls are, although more could be said on this subject. All three articles are worth reading in their entirety.

A reagent blank is a sample to which you do not add the compound to be detected. It works the same whether you're working with DNA or looking for pesticides in drinking water. It's used to detect ongoing, persistent contamination of the apparatus. There's also a sample blank, which is the sample without reagents, but that's not useful for DNA work.

I'm pointing this out because it's a good thing to keep in mind when reading the articles. It's a ubiquitous concept throughout the relevant sciences; DNA is a tad special because a PCR-based method will readily detect a single molecule of contaminant. The articles are well worth the read, but do keep in mind this is not something unique that only the DNA people need to worry about.

Again, you do not prevent contamination via a negative control; you detect ongoing contamination with it. What prevents contamination with a known sample is competent handling of the samples: separate rooms for known and unknown samples, handling them in separate batches, making sure their paths do not cross, having different people handle them, daily or more frequent decontamination, and so on.

These protocols are multiply redundant; you need to make several major errors even in theory to introduce contamination. Even if all of them fail, even if you deliberately pipette the crime scene sample right next to a known sample from a suspect under investigation (that's a "you're fired!" on the spot), you're still unlikely to produce contamination, because you won't pipette both with the same tip. If you do, the transfer of DNA can be considered deliberate; changing the pipette tip after each action (and in between, just in case) is a reflex ingrained in everyone. This is not limited to DNA, but it is most critical when working with small DNA samples.

That's why it's my opinion that contamination with a known sample in a laboratory is only possible if it was deliberate, or if the laboratory has a history of such results. Deliberate contamination cannot be excluded, but a single case of contamination with a known sample occurring by chance or incompetence can be. Unless the laboratory has had several instances of false positive DNA results, incidental cross-contamination is excluded as a possibility.

McHrozni
 
John Puckett, Raymond Easton, and Stephen Myers

Offhand I can think of at least three ways in which an innocent person might be seemingly inculpated by DNA evidence: a coincidental match of a partial profile, contamination in the narrow sense of a laboratory event, and sample mix-ups. All three have happened. Chapter 7 of Erin Murphy's indispensable book Inside the Cell deals with the dangers of the database, and she gives several illustrative examples, going into the greatest depth on the John Puckett case (which may deserve its own thread). She also discusses the Arizona database search that surprised some. The following are quick summaries.

Raymond Easton was identified and arrested on the basis of a six-locus match. He had late-stage Parkinson's and could not have committed the crime (I mentioned this case at my blog). He was eventually exonerated by further testing. Along with the Easton case, the Stephen Myers case also illustrates the problem that investigators are slow to conclude that DNA has pointed to the wrong person. The Newark, Ohio burglar was short, stout, and balding, whereas Mr. Myers was tall, slender, and fifteen years old at the time of the offense. From what I can gather, he lived in a neighboring state, West Virginia. Professor Murphy wrote, "Investigators waved away the physical discrepancy by noting that appearances change."

In the John Puckett case, there was a drop of blood from another suspect, but when his attorney asked that it be tested, it had gone missing from the case file. On the basis of the available evidence, Mr. Puckett was not obviously innocent, but neither was he guilty beyond a shadow of a doubt. The Puckett case has attracted the interest of at least one other academician besides Professor Murphy, but that will have to wait for another time.

With respect to laboratory contamination, the Farah Jama and Jaidyn Leskie cases are good examples that it does occur. IIRC Mr. Jama spent time in pretrial detention, although I am certain that he was cleared eventually. IIUC, Ms. P (the unknown person whose DNA ended up on items of evidence in the Leskie case and herself a possible crime victim) was quickly cleared of involvement. The previously mentioned Rex and Pad murder investigations in New Zealand were also instances of laboratory contamination.

Whenever there is a cold case hit, the question should arise as to whether or not the police investigation will clear an innocent person. This quick survey suggests a mixed record.
 
false cold hits and cross-contamination

In a book chapter called "Forensic DNA Evidence," William Thompson wrote, "Cross-contamination is also known to have caused a number of false cold hits. For example, while the Washington State Patrol Crime Laboratory was conducting a “cold-case” investigation of a long-unsolved rape, it found a DNA match to a reference sample in an offender database, but it was a sample from a juvenile offender who would have been a toddler at the time the rape occurred. This prompted an internal investigation at the laboratory that concluded that DNA from the offender’s sample, which had been used in the laboratory for training purposes, had accidentally contaminated samples from the rape case, producing a false match.[12] Similar errors leading to false database matches have been reported at forensic DNA laboratories in California, Florida, and New Jersey, as well as in New Zealand and Australia.[13] Three separate cases have come to light in which cross-contamination of samples at the Victoria Police Forensic Services Centre in Melbourne caused false cold hits. Two of those cases led to false convictions.[14]"

In "The Potential for Error in Forensic DNA Testing (and How That Complicates the Use of DNA Databases for Criminal Identification)," Professor Thompson wrote, "He [a forensic scientist who had previously worked in a public forensic laboratory] told The Australian newspaper that it was not uncommon for the lab to mix up DNA samples from different cases.[62] For example, he said that analysts’ own DNA, from blood samples used as analytical controls, often was mixed up with (or found its way into) casework samples, creating false matches: “[Q]uite often my (colleague) would walk down the aisle and say, ‘I’ve just committed another rape on the Gold Coast.’”[62]" This paper was written in 2008 as part of a conference, Forensic DNA Databases and Race: Issues, Abuses and Actions held June 19-20, 2008, at New York University.
 
Offhand I can think of at least three ways in which an innocent person might be seemingly inculpated by DNA evidence: a coincidental match of a partial profile, contamination in the narrow sense of a laboratory event, and sample mix-ups. All three have happened.

"Have happened" and "do happen nowadays" are two different things. Planes have fallen out of the sky due to metal fatigue, that doesn't mean it's a plausible explanation for what happened to MH17.

With respect to laboratory contamination, the Farah Jama and Jaidyn Leskie cases are good examples that it does occur. IIRC Mr. Jama spent time in pretrial detention, although I am certain that he was cleared eventually. IIUC, Ms. P (the unknown person whose DNA ended up on items of evidence in the Leskie case and herself a possible crime victim) was quickly cleared of involvement. The previously mentioned Rex and Pad murder investigations in New Zealand were also instances of laboratory contamination.

The two events where DNA contamination within the laboratory is suspected (though not proven) happened 20 and 24 years ago, respectively. It is entirely plausible that extraction and amplification protocols were inadequate and contamination did occur. Contamination is not hard to produce, IF the protocols leave enough to be desired.

No later than 2010, and probably years prior, adequate protocols were in place for accidental laboratory contamination to be a non-issue. Deliberate contamination is possible, crime scene accidental contamination (e.g. the Knox case) is possible, but accidental laboratory contamination is not plausible.

McHrozni
 
In a book chapter called "Forensic DNA Evidence," William Thompson wrote, "Cross-contamination is also known to have caused a number of false cold hits. For example, while the Washington State Patrol Crime Laboratory was conducting a “cold-case” investigation of a long-unsolved rape, it found a DNA match to a reference sample in an offender database, but it was a sample from a juvenile offender who would have been a toddler at the time the rape occurred. This prompted an internal investigation at the laboratory that concluded that DNA from the offender’s sample, which had been used in the laboratory for training purposes, had accidentally contaminated samples from the rape case, producing a false match.[12] Similar errors leading to false database matches have been reported at forensic DNA laboratories in California, Florida, and New Jersey, as well as in New Zealand and Australia.[13] Three separate cases have come to light in which cross-contamination of samples at the Victoria Police Forensic Services Centre in Melbourne caused false cold hits. Two of those cases led to false convictions.[14]"

It would be helpful if you could link to these cases. The cross-contamination did happen before the problem was sufficiently understood. It also shows what the pattern looks like: a laboratory either has contamination in several random cases (cold hits), or else none at all.

In "The Potential for Error in Forensic DNA Testing (and How That Complicates the Use of DNA Databases for Criminal Identification)," Professor Thompson wrote, "He [a forensic scientist who had previously worked in a public forensic laboratory] told The Australian newspaper that it was not uncommon for the lab to mix up DNA samples from different cases.[62] For example, he said that analysts’ own DNA, from blood samples used as analytical controls, often was mixed up with (or found its way into) casework samples, creating false matches: “[Q]uite often my (colleague) would walk down the aisle and say, ‘I’ve just committed another rape on the Gold Coast.’”[62]" This paper was written in 2008 as part of a conference, Forensic DNA Databases and Race: Issues, Abuses and Actions held June 19-20, 2008, at New York University.

This is another question entirely; sample contamination by a laboratory worker or a crime scene technician is a common problem that does happen from time to time. Laboratories keep DNA profiles of all employees at hand for that eventuality.

McHrozni
 
some links to Professor Thompson's works

William Thompson's 2008 article "The Potential for Error in DNA testing..." can be found here. It can also be obtained through ResearchGate.

Professor Thompson's 2012 chapter "Forensic DNA Evidence: The Myth of Infallibility" is available at ResearchGate and through SSRN.

A number of articles by Professor Thompson and others are available at the Forensic Bioinformatics Resources page.
 
William Thompson's 2008 article "The Potential for Error in DNA testing..." can be found here. It can also be obtained through ResearchGate.

This ... isn't a scientific article. I don't know what it is, but given the content it's a general-interest article that wants to give the impression of being a scientific one. It leaves a whole lot out; laboratories don't just feed the kit into a machine and read out the numbers, yet that's the impression I got within the first few pages. You get the electropherogram and are expected to figure out whether the sample is degraded (NOT difficult) and whether you should run the sample again with a kit specialized for degraded samples (yeah, those exist ... and were common already in 2008).

The next issue is mixtures. Yes, mixtures can be a problem, but he leaves out the two ways used to solve them. The first is that the quantities of the two contributors' DNA often differ, making it possible to separate the two profiles entirely: if the samples are present in, say, a 10:1 ratio, you can separate the taller peaks from the smaller ones with ease. The second is that we know how often certain alleles appear in the population, so even a 1:1 mixture to which one person could be a donor is still attributable to that person at something like one-in-a-million odds, because the number of people who could produce that exact mixture is so low. That's why we use 13 (now often 16) loci in the first place.
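
A minimal sketch of the first approach (the peak data and the 4:1 height threshold are invented; real mixture interpretation also has to deal with stutter, shared alleles and degradation):

[code]
# Split a clearly unbalanced two-person mixture at one locus by peak height.
peaks = [(14, 2100), (17, 1950), (10, 230), (12, 210)]  # (allele, height)

def split_mixture(peaks, ratio=4.0):
    tallest = max(h for _, h in peaks)
    major = [a for a, h in peaks if h >= tallest / ratio]
    minor = [a for a, h in peaks if h < tallest / ratio]
    return major, minor

print(split_mixture(peaks))  # ([14, 17], [10, 12])
[/code]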

These solutions to the problems were already routine by 2008, when the article was produced.

Next up is the probability of two people sharing the same 13-locus profile, which the author considers worth mentioning because there are so many people to compare against. I think the greatest known match is about 7-8 loci, and even that was in an inbred population of European Roma (Gypsies). I use "inbred" as a description of their marriage patterns, not as some form of insult; they marry within small groups, and it's a problem for forensics. The birthday paradox is mentioned, but that's the odds of ANY two people in a room sharing the same birthday. What matters here is the odds of THIS person sharing a profile with someone else, a completely separate statistical question. This is neither explained nor discussed in any depth.
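
The difference between the two questions is easy to see numerically (both figures below are invented for illustration):

[code]
import math

p = 1e-13       # illustrative frequency of one specific multi-locus profile
n = 10_000_000  # profiles being compared against each other

# Birthday-paradox question: does ANY pair among n profiles collide?
pairs = n * (n - 1) / 2
p_any_pair = 1 - math.exp(-pairs * p)  # ~0.99: almost certain

# Separate question: does THIS person match any of the n others?
p_this_person = 1 - (1 - p) ** n       # ~1e-6: one in a million
print(p_any_pair, p_this_person)
[/code]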

Furthermore, the author claims it is possible for two siblings to have the same STR profile. This is true; the odds of a match to a sibling are significantly greater than those for a random member of the population. However, this was a solved issue long before 2008: you don't convict based on DNA alone. At a minimum you have to show the alleged culprit had no alibi, plus some other ancillary evidence such as motive (it depends a bit on the crime; a burglary comes with a built-in motive, a murder does not). The author mentions several occasions when such methods worked; they resulted in an unnecessary arrest at worst. That is annoying, but there's usually a rather good reason why your DNA profile is in the database anyway.

Or this one:
"Clothing the person wore, a cigarette the person smoked, a glass the person drank from could all, if placed at a crime scene, create a false DNA link between an innocent person and a crime."

It was routine in 2008 for criminals to claim this exact thing had happened to them, and the police had to figure it out. That's one of the reasons why DNA isn't a miracle silver bullet, merely a tool in the toolbox. The usual approach is to have the suspect exclude the possibility that their DNA was there innocently in the first place, then present them with evidence that they're lying. The sought-after result is a confession and a plea deal, not a jury trial.

McHrozni
 
Cases of Gregory Turner and Kevin Brown

This is another question entirely; sample contamination by a laboratory worker or a crime scene technician is a common problem that does happen from time to time. Laboratories keep DNA profiles of all employees at hand for that eventuality.

McHrozni
It would be a mistake to draw a bright line between laboratory workers contaminating a sample with their own DNA and contaminating it with a suspect's DNA. For one thing, consider the Gregory Turner case. The victim's DNA was found on his wedding ring. However,

"A lucky hunch by Mr. Kennedy - now Newfoundland's Minister of Health - saved Mr. Turner from a life behind bars. He sought the name and DNA profile of every technician who had worked at the RCMP lab. It turned out that the technician who had tested the ring had also been working on the victim's fingernails a few inches away, creating a strong possibility of contamination.
The technician conceded at Mr. Turner's 2001 trial that she had also contaminated evidence in two previous cases. In another disturbing twist, it emerged that she had mistakenly contaminated Mr. Turner's ring with her own DNA, causing police to waste considerable time on a futile search for a presumed accomplice.

Mr. Turner still has nightmares. "I remember the judge saying that he was denying me bail based on the likelihood I'd be convicted based on a DNA match," he said. "I think DNA can be good, but its only as good as the people who perform it. I spent 27 months in jail for a crime I didn't do."
In just 20 years, DNA has become a staple of crime-lab analysis, capturing the imagination of scriptwriters and anchoring thousands of criminal convictions. Its record of accuracy is superb - at least, when samples are collected and analyzed under reliable conditions by experts."

For another, consider the sad case of laboratory worker Kevin Brown, who probably contaminated a sample and yet became a suspect. IIUC he later committed suicide.

Why the defense in this case did not call an expert to walk the jury through instances of laboratory contamination like Mr. Turner's is puzzling.
 
Russell Gesah, Farah Jama, and Gary Leiterman

The two events where DNA contamination within the laboratory is suspected (though not proven) happened 20 and 24 years ago, respectively. It is entirely plausible that extraction and amplification protocols were inadequate and contamination did occur. Contamination is not hard to produce, IF the protocols leave enough to be desired.

No later than 2010, and probably years prior, adequate protocols were in place for accidental laboratory contamination to be a non-issue. Deliberate contamination is possible, crime scene accidental contamination (e.g. the Knox case) is possible, but accidental laboratory contamination is not plausible.

McHrozni

Some labs have instituted "more stringent cleaning processes," according to the Sydney Morning Herald. However, beyond that I see no reason for optimism. If anything, low-template DNA work makes laboratory contamination more likely than it would otherwise be. In the Knox/Sollecito case, the bra clasp may have been contaminated during its collection, but the kitchen knife was almost certainly an instance of laboratory contamination, of which there have been many examples. In the Leskie case, work by Professor Thompson's associate, Professor Dan Krane, was one of the things that made laboratory contamination a more plausible explanation than a coincidental match. Reports on the Leskie case by both professors are worth reading and can be found at a link previously given. Even the coroner conceded that this was laboratory contamination.

The case of Russell Gesah is also instructive. "Weeks after crediting DNA technology with a breakthrough in the 24-year-old case, police yesterday announced they had withdrawn the charges against Russell John Gesah because the evidence they were relying on may have been contaminated." link

The case of Farah Jama, which I also covered at my blog, is another instance of laboratory contamination. "This chapter is concerned with an emerging third strand of the narrative: DNA’s potential to conceal truths and create falsehoods. Recently this dark side of DNA has become prominent in the Australian state of Victoria, where it is now associated with one name. Farah Jama was convicted solely on the basis of a matching DNA profile, with tragic repercussions."

I have written about the Gary Leiterman case here and elsewhere on several occasions. Let me focus on one issue right now. John Ruelas was four years old and did not live in Ann Arbor, MI. His DNA could not have appeared in the Jane Mixer case in any reasonable way except laboratory contamination.

These cases illustrate that in a cold case the suspect usually does not have an alibi, that DNA alone can be enough to convict in practice, and that juries will convict in spite of evidence of laboratory contamination; these points may be relevant to the present case.
 
Some labs have instituted "more stringent cleaning processes," according to the Sydney Morning Herald. However, beyond that I see no reason for optimism. If anything, low-template DNA work makes laboratory contamination more likely than it would otherwise be. In the Knox/Sollecito case, the bra clasp may have been contaminated during its collection, but the kitchen knife was almost certainly an instance of laboratory contamination, of which there have been many examples. In the Leskie case, work by Professor Thompson's associate, Professor Dan Krane, was one of the things that made laboratory contamination a more plausible explanation than a coincidental match. Reports on the Leskie case by both professors are worth reading and can be found at a link previously given. Even the coroner conceded that this was laboratory contamination.

The case of Russell Gesah is also instructive. "Weeks after crediting DNA technology with a breakthrough in the 24-year-old case, police yesterday announced they had withdrawn the charges against Russell John Gesah because the evidence they were relying on may have been contaminated." link

The case of Farah Jama, which I also covered at my blog, is another instance of laboratory contamination. "This chapter is concerned with an emerging third strand of the narrative: DNA’s potential to conceal truths and create falsehoods. Recently this dark side of DNA has become prominent in the Australian state of Victoria, where it is now associated with one name. Farah Jama was convicted solely on the basis of a matching DNA profile, with tragic repercussions."

These two cases illustrate, respectively, that in a cold case one usually does not have an alibi and that DNA alone is enough to convict; the former may be relevant to the present case.

I thought in the Farah Jama case there was even exculpatory evidence (including alibi evidence) that was ignored.

In the current case, I was wondering how the use of an LCN method could contribute to the risk of a match being the result of contamination. An argument was that it is unlikely that contamination of just one critical sample would occur (assuming all cases of contamination are in fact recorded). Is it possible that contamination of other samples could go undetected if they weren't analysed with the same methods?
 
Farah Jama; low template samples and the increased risk of contamination

I thought in the Farah Jama case there was even exculpatory evidence (including alibi evidence) that was ignored.

In the current case, I was wondering how the use of an LCN method could contribute to the risk of a match being the result of contamination. An argument was that it is unlikely that contamination of just one critical sample would occur (assuming all cases of contamination are in fact recorded). Is it possible that contamination of other samples could go undetected if they weren't analysed with the same methods?
Justice Frank Vincent wrote a report on the Farah Jama case that deserves to be re-read and quoted frequently. There were various kinds of exculpatory evidence: IIUC he was praying with his father at the time of the incident. However, his father was very sick and may have passed away not long after. I discussed his case here and here. The following two paragraphs come from the second link:

The discussion of low copy number (LCN) testing from the Crown Prosecution Service noted, “This increased sensitivity means ultra-clean laboratories are needed for the testing to minimise contamination of the sample by DNA from any other source.” The New Zealand Institute of Environmental Science and Research has spent $1 million building anticontamination areas for low copy number (LCN) DNA forensics. The New Zealand Herald wrote, “The bogey is contamination. The very sensitivity of the technique which enables it to extract a DNA profile from the tiniest sample also makes it extremely vulnerable to contamination. Stringent measures are needed to minimise that risk… We live in a ‘soup’ of DNA, explains ESR forensic programme manager Keith Bedford. ‘If I were to shed dandruff, massive amounts of DNA could fall ... hair could carry DNA. The way I am speaking at the moment, we could probably detect DNA on this pad in front of me.’”

Sara Gino testified for the defense in the trial of the first instance, and some of what she had to say is pertinent to this issue. From the Massei report (p. 258, English translation): “She reaffirmed that [the risk of] contamination exists, and emphasised that in minimal quantities of DNA there is not necessarily a greater risk of contamination but it was easier to notice the effects of the contamination and be misled (‘...It's not that the risk of contamination is greater; but it is easier to see the contamination...’ page 92).” In response to a question on this subject, Professor Dan Krane responded, “There is absolutely no question but that contamination is a much greater problem in LCN cases than conventional DNA testing. The reasons that it is a greater problem are both because it is easier to detect contaminants ([Sara] Gino's point) and because it is easier to transfer (and to transfer without knowing) smaller amounts of DNA than larger amounts of DNA.”

As for the size of a sample, low-template profiling works on a dozen cells or less. Let me offer a simple analogy. If I go to the beach on a windy day, I don't expect to be hit with pebbles, but I do expect to feel sand hit me.
 
Who watches the technicians?

The next issue is mixtures. Yes, mixtures can be a problem, but he leaves out the two ways used to solve them. The first is that the quantities of the two contributors' DNA often differ, making it possible to separate the two profiles entirely: if the samples are present in, say, a 10:1 ratio, you can separate the taller peaks from the smaller ones with ease. The second is that we know how often certain alleles appear in the population, so even a 1:1 mixture to which one person could be a donor is still attributable to that person at something like one-in-a-million odds, because the number of people who could produce that exact mixture is so low. That's why we use 13 (now often 16) loci in the first place.

These solutions to the problems were already routine by 2008, when the article was produced.

McHrozni
Let me begin with mixtures, then move on. Once you get to a 10:1 mixture or so, it becomes more difficult to differentiate stutter peaks from real peaks. More generally, analysis of mixtures is prone to subjectivity and bias. A good example of a suspect-centered analysis is the work that Stefanoni did with respect to the bra clasp profile in the Knox/Sollecito case. She found Sollecito's profile, yet called other peaks stutter, even though some would have had to be backward stutter peaks (which are quite uncommon).

That Professor Thompson knows about these sorts of problems is suggested by the fact that he is a coauthor on a paper (see this link for a page of downloads) that advocates sequential unmasking as a way to minimize bias in the interpretation of mixtures, a well-known problem. With respect to Professor Thompson's qualifications more generally, see this link. Until his retirement, my understanding is that his area of specialty was probability and decision making. He was not a practicing DNA forensic technician, as his report in the Leskie case made clear. However, his expertise complemented Professor Krane's.

Now I would like to address more generally the need to have lawyers oversee forensics. IIRC Professor Thompson and Professor Paul Giannelli were members of the committee that drafted the ABA's model rules on DNA evidence, a thoughtfully written statement from which I often quote. Professor Giannelli's book chapter on the Duke lacrosse case provides a pedagogically clear description of DNA forensics, one that I consulted as I was preparing lectures on the subject. A draft of this chapter was available on the web some time ago. Professor Erin Murphy's book, which I previously called indispensable, goes into many issues relevant to the present case, as well as others.

Why does forensics need oversight from lawyers such as Professor Thompson and others? First, because it blends law and science. Such a statement may seem banal, but this overlap is more complex than it first appears. In addition, the technicians do not always come out looking good; see for example the article "Painting the target around the matching profile: The Texas sharpshooter fallacy," by William Thompson. Second, because some labs have shown themselves to be error-prone or dishonest, and because reports from the lab or testimony from lab personnel can be incomplete or false.

Let me provide just a few examples of the second reason. As Professor Thompson documented in "Tarnish on the Gold Standard," negative controls have been faked (many of Professor Thompson's articles are available through the Forensic Bioinformatics resources page, for which I have previously given the link). Others have documented that forensic reports in the Duke lacrosse and Steven Avery cases were incomplete and highly misleading. The Gary Leiterman case shows that lab personnel may claim no contamination occurred even when audits of their documents show otherwise. The Theodore Kessis report on this case might be consulted for more information.

These behaviors are object lessons in how not to do forensic science, or any other science. The defense in the present case should have asked Professor Murphy or any of the other people whom I have mentioned to testify to such problems.
 