The Unscientific Nature of Modern Forensic Sciences

by Alexander Goldstein, Citations Editor, University of Cincinnati Law Review Vol. 91

 

I. Introduction

Due to advancements in modern forensics, many Americans believe that the evidence in a criminal trial does not lie. When an analyst says that they have conclusively matched the suspect’s fingerprints to the prints on the murder weapon, people believe that science has proven guilt beyond a reasonable doubt. The gatekeepers for the reliability of these modern sciences are trial judges. This modern, ostensibly unbiased approach to evidence has been hailed as science realizing modern justice. Ironically, the chief opponent of this modern scientific approach is the scientific community itself. This article explores the unscientific nature of modern forensic sciences. Section II explores how the definition of scientific validity differs wildly between the legal and scientific communities. Section III discusses how the discrepancy between what the legal system accepts as science and the standards of the scientific community creates a due process violation whenever forensic evidence is admitted at trial. Section IV concludes.

II. Background

A. The Legal Community’s Definition of Scientifically Valid

The admissibility of scientific evidence at trial is controlled by the Supreme Court case Daubert v. Merrell Dow Pharmaceuticals.[1] In Daubert, the plaintiffs claimed that the drug Bendectin caused their birth defects. Prior to Daubert, the admissibility of scientific evidence was governed by the General Acceptance test. Utilizing that standard, the trial court ruled that, despite the plaintiffs supplying several experts who would testify that Bendectin could cause birth defects, such testimony had not gained general acceptance in its field.[2] On appeal, the Supreme Court noted that the introduction of the Federal Rules of Evidence (“FRE”), especially FRE 702, mandated a standard for evidence that deviated from prior case law.[3] Under Daubert, the new standard would require a judge to screen incoming evidence for admissibility.[4] This screening utilizes a non-exhaustive list of considerations, including whether the theory or technique can or has been tested, has been subject to peer review, has a known error rate, or has garnered widespread acceptance.[5] Although widespread acceptance might seem identical to the General Acceptance test, the two have key differences.[6] Daubert is currently the majority controlling test for the admissibility of scientific testimony.

The Court expanded upon this admissibility standard in a later ruling. In Kumho Tire Co. v. Carmichael, a plaintiff sued a tire manufacturer and distributor, alleging the product was defective.[7] The defendants moved to exclude the testimony of the plaintiff’s expert on the grounds that his expertise was technical, not scientific, and thus failed to satisfy FRE 702.[8] The district court ruled that it would apply Daubert to the expert even though his testimony was more technical than scientific.[9] It also ruled that this particular expert’s methods satisfied none of the four factors outlined in Daubert and granted the motion to exclude.[10] The appellate court reversed, and the Supreme Court granted certiorari. In its decision, the Court reiterated the gatekeeper standard outlined by Daubert.[11] It also noted that the list of considerations for admissibility outlined in Daubert was permissive and subject to the judge’s discretion: judges could utilize any, even none, of the Daubert factors[12] or, as the district court did, utilize other factors.[13] The Court stated:

[W]e can neither rule out, nor rule in, for all cases and for all time the applicability of the factors mentioned in Daubert . . . . [Daubert] made clear that its list of factors was meant to be helpful, not definitive. Indeed, those factors do not all necessarily apply even in every instance in which the reliability of scientific testimony is challenged.[14]

In a concurrence, Justice Scalia noted that “the Daubert factors are not holy writ, in a particular case the failure to apply one or another of them may be unreasonable, and hence an abuse of discretion.”[15] The Court ruled that Daubert applied to “scientific, technical and other specialized knowledge”[16] and that its factors were permissive, subject to the trial court’s discretion, reversing the appellate court.[17]

B. The Scientific Community’s Definition of Science

Overall, Daubert makes clear that judges have immense discretion over what scientific evidence will be allowed in the courtroom. Through Daubert, fingerprint analysis, DNA evidence, and tool-mark impressions have all been deemed scientifically valid by the legal community and have been used to convict criminals. Daubert is flexible, and all scientific evidence is subjected to the standards of the individual judge considering its admissibility. This flexibility and subjectivity are not permitted in the scientific community.

In a 2016 report, the President’s Council of Advisors on Science and Technology (“PCAST”) found methodological problems with several of the major forensic science disciplines: DNA, hair, latent fingerprints, firearms and spent ammunition, toolmarks and bitemarks, shoeprints and tire tracks, and handwriting.[18] Analyzing the data it collected, PCAST found that what judges admitted as scientifically valid was incongruous with the scientific community’s definition of scientific validity. The report noted that some scientific experts testified that their methods were either without error or that the error was negligible.[19] While a judge could find such claims have merit under the Daubert standard,[20] outside of the courtroom such statements are commonly deemed “not scientifically defensible.”[21] This is because all tests have a non-zero error rate, and the phrase “to a reasonable degree of scientific certainty” has no common meaning in the field of science.[22] This criticism highlights the difference between what judges deem scientifically valid in a legal context and what scientists consider scientifically valid.

1. Cognitive Bias

The PCAST report highlighted several areas where this disparity exists. Three particular areas where the Daubert standard falls short of the scientific community’s standard of validity are cognitive bias, unknown error rates, and inconsistency across examiners.[23]

Cognitive bias refers to a phenomenon where individual perceptions and judgments can be altered by external facts or factors related to the analysis.[24] Although cognitive bias can take many forms, three are primary within forensic science: contextual bias, confirmation bias, and avoidance of cognitive dissonance. Contextual bias is produced by the introduction of extraneous background information.[25] Confirmation bias occurs when one’s analysis is skewed by a pre-existing belief or assumption.[26] Finally, avoidance of cognitive dissonance refers to a person’s disbelief toward new information that challenges their prior conclusion.[27] Cognitive bias can pollute an expert’s reasoning and prejudice their interpretation of data. Because cognitive bias is subconscious, conscious efforts to control it are ineffective. However, scientists utilize methods, such as double-blind testing, to effectively minimize this subconscious bias.[28] Unlike the scientific community, the forensic sciences do not stress the use of such methods.
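To make the blinding idea concrete, the following is a minimal sketch, in Python, of how a laboratory case manager might strip biasing context from an examiner’s work packet. The function names and data are hypothetical illustrations, not any laboratory’s actual protocol.

```python
import random

# Hypothetical illustration of context-blinding: the examiner receives only
# the prints to be compared, never the background facts that could bias judgment.

def intake(case_id: str, prints: tuple, context: str) -> dict:
    """Case manager records the context separately; it never reaches the examiner."""
    return {"case_id": case_id, "prints": prints, "context": context}

def blind_packet(case: dict) -> dict:
    """Strip everything except the evidence itself before assignment."""
    return {
        "packet_id": f"P-{random.randint(1000, 9999)}",  # label unlinkable to the case
        "prints": case["prints"],  # no case narrative, no suspect information
    }

case = intake("2024-0117", ("latent_A.png", "exemplar_B.png"),
              "Suspect confessed; detectives expect a match.")  # biasing context
print(blind_packet(case))  # the examiner sees only the two prints
```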

The scientific community has researched the effect of cognitive bias in the forensic sciences and published findings on how it affects scientific validity.[29] Professor Itiel Dror has conducted multiple studies thoroughly documenting the effects of cognitive bias in the forensic science field. In 2005, Professors Dror, Péron, Hind, and Charlton conducted a follow-up study on the effect of emotional background stories and subliminal messages on fingerprint analysis. In this study, fingerprint comparisons were subjected to four incremental degrees of bias: a control group, low emotion, high emotion, and high emotion combined with subliminal top-down manipulations. The results were measured as the percentage of declared matches in each category, representing how the biasing factors affected the analysis. The study found that when the fingerprints being compared were non-ambiguous,[30] the identifications had very little variance (50%, 54%, 51%, and 46% across increasing levels of bias). However, when the fingerprints were ambiguous, the match rates became 46%, 49%, 58%, and 66%. The study concluded that biasing factors significantly altered the results.[31] In another study, Professor Dror found that a forensic pathologist’s determination of whether a child’s death was an accident or a homicide was affected by knowledge of the child’s race, exposing bias within a field thought to rely on objective feature comparison.[32]
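Laying the quoted percentages side by side makes the pattern plain. This short sketch, using only the match rates reported above, computes the spread in declared matches across the four bias conditions:

```python
# Declared-match percentages from the 2005 Dror et al. study, as quoted above,
# ordered from the control group to the highest-bias condition.
non_ambiguous = [50, 54, 51, 46]
ambiguous = [46, 49, 58, 66]

def spread(rates):
    """Difference between the highest and lowest declared-match rate."""
    return max(rates) - min(rates)

print(f"non-ambiguous spread: {spread(non_ambiguous)} points")  # 8 points
print(f"ambiguous spread: {spread(ambiguous)} points")          # 20 points
# Ambiguous prints climb steadily with bias (46% -> 66%), while non-ambiguous
# prints stay roughly flat: the bias operates where judgment is hardest.
```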

Furthermore, the FBI acknowledged the existence of bias in a high-profile case as far back as 2006.[33] In that case, the FBI declared that a fingerprint found at the site of a terrorist attack in Madrid, Spain matched a lawyer from Portland, Oregon.[34] However, the lawyer had never been to Spain,[35] and the Spanish National Police matched the fingerprint to another person.[36] The FBI conducted a review of its procedures and concluded that bias had partially caused the mistaken identification.[37]

Although the foundational validity of latent fingerprint analysis is apparently solid (albeit with a substantial false positive rate),[38] its validity as applied is seriously undermined by inadequate measures to protect against cognitive bias.[39]

2. Known Error Rate

As noted previously, the PCAST report documented common phrases used by forensic experts that are not scientifically valid, such as references to negligible or nonexistent error rates.[40] Such false statements are compounded by the overestimation of the probative value of forensic sciences.[41] It is important to reiterate that no test has a zero error rate.[42] An error rate is a measurement of how often a method reaches the incorrect result. In science, knowing how often a method errs is just as important as knowing how often it is correct. Without a known error rate for the forensic method used, any legal conclusion drawn from a feature comparison is scientifically meaningless. Simply put, the conclusion that two fingerprints match is not “probative” without a known error rate.[43]

As an example, consider bitemark identification and simple-mixture DNA analysis. If one expert testifies that a bitemark left on a victim matches the defendant’s bite, while a second expert testifies that the DNA from the saliva found on the victim does not match the defendant, the findings cannot both be true. But when expert witnesses do not accurately state known error rates, judges and juries cannot tell which is more accurate.

In addition to accounting for human error, the scientific community considers the false positive rates associated with forensic sciences.[44] Even well-established and accurate error rates would be meaningless without a known false positive rate.[45] “It does not even matter whether the chance of a coincidental match is zero (as implausible and unscientific as this value is) because in these situations, the false positive error rate limits and controls the probative value of the match report.”[46] Therefore, without both a known error rate and a known false positive rate, forensic testimony is not supported by scientific validity.
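Koehler’s point can be made concrete with a short Bayes’-rule calculation. The sketch below uses purely illustrative numbers, not figures from any study, and shows that a non-zero false positive rate caps the probative value of a reported match even if the method never misses a true match:

```python
def posterior_match(prior: float, sensitivity: float, false_positive: float) -> float:
    """P(same source | reported match), computed via Bayes' rule."""
    p_report = sensitivity * prior + false_positive * (1 - prior)
    return sensitivity * prior / p_report

# Illustrative assumptions: a 1-in-100 prior that the suspect left the print,
# perfect sensitivity, and a 5% false positive rate.
print(posterior_match(prior=0.01, sensitivity=1.0, false_positive=0.05))
# ~0.168: even after a reported "match," there remains roughly an 83% chance
# that the report is a coincidental (false positive) result.
```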

If the statement that a forensic science has an “essentially zero”[47] error rate is untrue, then what are the actual error rates?[48] Studies conducted to determine the error rates of forensic sciences stand in stark opposition to anything even approaching “essentially zero.” For example, a study as far back as 1975 identified an error rate of 84% in bite marks that were older than twenty-four hours.[49] Even fingerprint analysis, the “second best” forensic method, has shown high error rates. Previous studies have found that experts will wrongly declare that a set of fingerprints match anywhere between 2.9% and 25.9% of the time.[50] A 2016 study found that fingerprint analysis had an error rate of 9.6%, but the accuracy among individual experts varied widely,[51] demonstrating that some experts were more prone to errors than others. Overall, forensic sciences have anything but an “essentially zero” error rate.
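To see why examiner-level variation matters, consider the brief illustration below. The per-examiner error rates are invented, but they show how a lab-wide average close to the 9.6% figure cited above can conceal an examiner who errs far more often:

```python
# Invented per-comparison error rates for three hypothetical examiners.
examiners = {"A": 0.01, "B": 0.02, "C": 0.25}

mean_rate = sum(examiners.values()) / len(examiners)
print(f"lab-wide average: {mean_rate:.1%}")  # 9.3%, near the study's 9.6%

worst = max(examiners, key=examiners.get)
print(f"examiner {worst}: {examiners[worst]:.0%}")  # one in four comparisons wrong
# An aggregate error rate says little about the examiner who analyzed a given case.
```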

The significance of this discrepancy is that, under Daubert, forensic experts are able to claim that their methods have error rates that are both scientifically impossible and directly contradicted by scientifically valid studies.

3. Inconsistency Amongst Examiners

For a conclusion to be scientifically valid, it is essential that forensic experts who apply the same method reach similar results.[52] Valid science utilizes a process called proficiency testing to confirm both accurate conclusions and consistency in applying a method.[53] Proficiency testing itself must also be grounded in scientific validity rather than training or experience.[54] Only where a method is consistently applied can there be any determinable accuracy.[55] The PCAST report cited a study of bitemark analysis which concluded that the inconsistent identifications given by examiners showed “a fundamental flaw in the methodology of bitemark analysis and should lead to concerns regarding the reliability of any conclusions reached about matching such a bitemark to a dentition.”[56] A method that produces consistent results can be scientifically validated, while a method that produces a myriad of different results, like bitemark analysis, cannot.
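One simple way to quantify the consistency demanded here is pairwise agreement among examiners. The sketch below uses invented conclusions for a single bitemark to show how the measure works:

```python
from itertools import combinations

# Invented conclusions from five examiners analyzing the same bitemark.
conclusions = ["match", "match", "inconclusive", "exclusion", "match"]

pairs = list(combinations(conclusions, 2))
agreement = sum(a == b for a, b in pairs) / len(pairs)
print(f"pairwise agreement: {agreement:.0%}")  # 30%: far from reproducible
# A method applied consistently would push this figure toward 100%.
```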

III. Discussion

A forensic discipline is scientifically invalid if it fails to avoid cognitive bias, to establish known error rates, or to utilize proficiency testing. The PCAST report found that all six of the forensic disciplines it examined failed in at least one of these three areas.[57] Yet these six sciences are still admissible under Daubert. Given these findings, does admitting this scientifically invalid evidence violate due process?

Daubert may be generally permissive as to the judge’s discretion to admit evidence,[58] but this wide latitude is limited by two narrow instances in which the admission of evidence can violate due process. In Perry v. New Hampshire, the Supreme Court considered the due process implications of eyewitness identification.[59] In Perry, the defendant claimed that a witness identifying him as he stood next to an officer “amounted to a one-person show up,” which would guarantee an identification and violate due process.[60] At trial, the superior court acknowledged that the accuracy of the identification was questionable, but because the procedure was not “unnecessarily suggestive,” its reliability was for the jury to consider.[61] On review, the Supreme Court noted that “[t]he Constitution, our decisions indicate, protects a defendant against a conviction based on evidence of questionable reliability, not by prohibiting introduction of the evidence, but by affording the defendant means to persuade the jury that the evidence should be discounted as unworthy of credit.”[62] The Court noted that while statutes can govern the admissibility of evidence, the reliability of the evidence must be determined by the jury. The Court qualified this statement by declaring that an admission of evidence violates due process (1) where it is so “extremely unfair that its admission violate[s] fundamental conceptions of justice” or (2) where the state knowingly uses false evidence, since that would violate “any concept of ordered liberty.”[63] These are the two instances where admissibility can violate due process. The admission of scientifically invalid evidence implicates both.

A. Violating the Fundamental Concept of Justice

The Court has done little to clarify what constitutes a violation of the fundamental conception of justice in the context of evidence admission. As a result, lower courts have had to define the standard themselves, leading to a circuit split.

Case law within the First Circuit confirms that the Court left no clear definition of the fundamental conception of justice. In Torres v. Roden, the petitioner argued that Massachusetts evidence rules violated his right to due process under the fundamental justice standard.[64] The district court noted that a violation of the fundamental justice standard would require an “extreme malfunction” of the justice system.[65] It further cited case law upholding an admission of evidence that was “not so arbitrary or capricious” as to deny the defendant’s rights, regardless of whether the admission was correct.[66] A violation of the fundamental concept of justice was thus interpreted as more than an incorrect admission of evidence; the decision must appear devoid of any logical underpinning or rational thought. Within those constraints, as long as the judge’s admission of evidence is based on cognizable reasoning, there will be no violation of due process. Therefore, so long as the judge can offer a reasonable explanation for admitting the evidence, the admission of scientifically invalid forensic evidence is unlikely to constitute a due process violation within the First Circuit.

The Third Circuit offered a different rule that is more applicable to the admission of forensic testimony. Han Tak Lee was convicted of arson and first-degree murder, based on fire-science evidence, after his daughter died in a fire.[67] His petition for habeas corpus was denied, and he appealed. Lee argued that due process was violated when his conviction was based on “inaccurate and unreliable evidence.”[68] The court agreed and articulated that for the admission of evidence to “undermine fundamental fairness,” the prejudicial effect of the evidence must have greatly outweighed its probative value.[69] The court concluded that if Lee could prove that the fire science presented at his trial was fundamentally unreliable, the admission of that evidence would give rise to a valid due process claim.[70]

The circuits thus appear to disagree on whether admitting the scientifically invalid forensic practices outlined in the PCAST report violates a fundamental notion of justice: under the First Circuit’s deferential standard it likely does not, while under the Third Circuit’s balancing approach it may.

B. Admitting False Evidence

While there is a circuit split regarding the first prong of Perry, the second prong has a single interpretation. The second prong of Perry holds that it is a due process violation for the State to proffer evidence that it knows to be false.[71] In Perry, the Court cited a previous case, Napue v. Illinois.[72] In Napue, a co-defendant testified against the defendant and claimed that he had received nothing in exchange for his testimony. In fact, the Assistant State’s Attorney had promised him consideration and did not correct the testimony at trial.[73] In its decision, the Court noted that the State’s use of false evidence to obtain a conviction is a due process violation under the Fourteenth Amendment.[74] Similarly, a violation of due process also occurs where the State did not solicit the false evidence but failed to correct it when it appeared.

The scientific invalidity of forensic practices has only been reaffirmed since the publication of the 2016 PCAST report. Time and time again, prosecutors have either dismissed the fact that forensics do not meet the rigorous standards of real science or remained willfully ignorant of it. Under Daubert, a judge decides the admissibility of evidence and whether it meets the criteria of scientific validity. Yet judges today routinely admit evidence that fails to meet the validity standards set by the scientific community, implicating both due process concerns identified in Perry.

Perhaps the justification for refusing to accept the PCAST report is that accepting it would essentially stall all forensic science for months, if not years. Bringing fingerprint analysis alone up to scientific standards would require not only extreme methodological changes to avoid human error, but also many large-scale studies to determine error rates and population frequencies, along with a universal procedure for analysts to follow. Such extreme reforms would be akin to pulling the tires off a moving truck. If scientific validity means different things to the scientific and legal communities, solutions may include explaining the difference to jurors or avoiding the term “scientific” altogether. Unfortunately, that genie is already out of the bottle. The fusion of science and forensics in the public mind cannot be undone. Therefore, the only way to address the problem is to challenge forensic evidence as a due process violation on a case-by-case basis until the challenge is reflected in case law, or until legislatures change the rules of evidence or bury Daubert.

IV. Conclusion

The horrifying reality is that forensic analysts will testify that their methods are scientifically valid when they are not. Prosecutors will proclaim the unbiased and accurate nature of the forensic sciences they present when, in reality, they are utilizing methods that even the least reputable science journals would disapprove of. How many people has the criminal justice system falsely convicted based on false science?[75] Our society boasts that we no longer use inquisitions and witch trials, yet our forensic sciences fail to abide by the scientific method that retired those practices.


Cover Photo by Immo Wegmann on Unsplash

Author

  • Alex Goldstein is originally from Oakland, California. Alex's philosophy is that law review articles should try to overturn the status quo and upend some desks in the legal community.

References

  • 1
    509 U.S. 579 (1993).
  • 2
    Id. at 582-83.
  • 3
    Id. at 587-89.
  • 4
    Id.
  • 5
    Id. at 593.
  • 6
    For a comparison see generally Frye v. United States, 293 F. 1013 (D.C. Cir. 1923).
  • 7
    Kumho Tire Co. v. Carmichael, 526 U.S. 137, 144-45 (1999).
  • 8
    Id. at 145.
  • 9
    Id.
  • 10
Id. at 145. The district court granted a motion for reconsideration. On reconsideration, the court conceded that its initial application of Daubert had been too strict but affirmed the exclusion of the expert testimony based on considerations beyond the four Daubert factors. Id.
  • 11
Id. at 147; see also id. at 152.
  • 12
Id. at 150-51. This heavily implies that judges need not apply any of the factors.
  • 13
    See id. at 156.
  • 14
    Id. at 150-51.
  • 15
Id. at 159 (Scalia, J., concurring).
  • 16
    Id. at 149.
  • 17
Kumho Tire expanded Daubert to anyone who would testify in an expert capacity, regardless of whether they testify to scientific knowledge or other specialized knowledge. Id. at 148.
  • 18
    President’s Council of Advisors on Science and Technology, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods 2 (Sept. 2016), https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf.
  • 19
    Id. at 29. The report specifically mentioned experts’ claims that their findings were “100 percent certain;” have “zero,” “essentially zero,” “vanishingly small,” “negligible,” “minimal,” or “microscopic” error rate; or have a chance of error so remote as to be a “practical impossibility.” Id.
  • 20
    Kumho Tire Co. v. Carmichael, 526 U.S. 137, 150-51, 156 (1999).
  • 21
President’s Council of Advisors on Science and Technology, supra note 18, at 3.
  • 22
    Id. at 29-30.
  • 23
    Id. at 5.
  • 24
    Id. at 31.
  • 25
    Id.
  • 26
    Id.
  • 27
    Id.
  • 28
    Id.
  • 29
    Id.
  • 30
Fingerprints can be separated into basic groups by the pattern formed around the center of the print. Two prints with different basic patterns are nearly impossible to confuse, while two different prints that share the same basic structural pattern appear more similar. A non-ambiguous comparison is akin to comparing two different letters printed in different fonts: it is very difficult to mistake one letter for another, regardless of font.
  • 31
    Itiel E. Dror et al., When Emotions Get the Better of Us: The Effect of Contextual Top-down Processing on Matching Fingerprints, 19 Applied Cognitive Psych. 799, 806-09 (2005).
  • 32
Itiel Dror et al., Cognitive Bias in Forensic Pathology Decisions, 66 J. Forensic Sci. 1751, 1754-55 (2021) (concluding that, when presented with otherwise identical reports, forensic pathologists were more likely to consider the child’s death a homicide when the child was Black).
  • 33
    See generally U.S. Dep’t of Just. Off. of the Inspector Gen., A Review of the FBI’s Handling of the Brandon Mayfield Case (Mar. 2006).
  • 34
    Id. at 1-2.
  • 35
    Sarah Kershaw et al., Spain and U.S. at Odds on Mistaken Terror Arrest, N.Y. Times (June 5, 2004), https://www.nytimes.com/2004/06/05/us/spain-and-us-at-odds-on-mistaken-terror-arrest.html.
  • 36
U.S. Dep’t of Just. Off. of the Inspector Gen., supra note 33, at 3.
  • 37
    Id. at 3-4, 7.
  • 38
President’s Council of Advisors on Science and Technology, supra note 18, at 101.
  • 39
    See Robin Mejia et al., Implementing Blind Proficiency Testing in Forensic Laboratories: Motivation, Obstacles, and Recommendations, 2 Forensic Sci. Int’l: Synergy 293, 294-98 (2020). A recent study found that independent examination of latent fingerprint results caught all false positives and a majority of false negatives. Bradford T. Ulery et al., Accuracy and Reliability of Forensic Latent Fingerprint Decisions, 108 Proc. Natl. Acad. Sci. USA 7733, 7733 (2011) (demonstrating that latent fingerprint analysis can become more reliable than it currently is).
  • 40
See President’s Council of Advisors on Science and Technology, supra note 18.
  • 41
President’s Council of Advisors on Science and Technology, supra note 18, at 46; see also Itiel Dror, The Error in “Error Rate”; Why Error Rates Are So Needed, Yet So Elusive, 65 J. Forensic Sci. 1034, 1034-35 (2020) (stating knowledge of error rates is a fundamental aspect of good science).
  • 42
See id. at 1034; see also Robin Mejia et al., Implementing Blind Proficiency Testing in Forensic Laboratories: Motivation, Obstacles, and Recommendations, 2 Forensic Sci. Int’l: Synergy 293, 297 (2020).
  • 43
President’s Council of Advisors on Science and Technology, supra note 18, at 46; see also Itiel Dror, The Error in “Error Rate”; Why Error Rates Are So Needed, Yet So Elusive, 65 J. Forensic Sci. 1034, 1034-35 (2020) (stating knowledge of error rates is a fundamental aspect of good science).
  • 44
President’s Council of Advisors on Science and Technology, supra note 18, at 50-51.
  • 45
    Jonathan J. Koehler, Fingerprint Error Rates and Proficiency Tests: What They are and Why They Matter, 59 Hastings L. J. 1077, 1079 (2008).
  • 46
    Id.
  • 47
    Supra note 46.
  • 48
    See Sandra Guerra Thompson & Nicole Bremmer Cásarez, Solving Daubert’s Dilemma for the Forensic Sciences Through Blind Testings, 57 Hous. L. Rev. 617, 632 (2020) (“The lack of statistical support for most of the forensic disciplines continues to pose a conundrum for the courts: If the error rate is not zero, what is it?”).
  • 49
President’s Council of Advisors on Science and Technology, supra note 18, at 85.
  • 50
    Jonathan J. Koehler & Shiquan Liu, Fingerprint Error Rate on Close Non-matches, 66 J. Forensic Sci. 129, 130 (2021).
  • 51
    Jennifer Mnookin et al., Error Rates for Latent Fingerprinting as a Function of Visual Complexity and Cognitive Difficulty, Nat’l Crim. Just. Reference Serv. 32 (May 2016), https://www.ojp.gov/pdffiles1/nij/grants/249890.pdf; see also Simon A. Cole, More than Zero: Accounting for Error in Latent Fingerprint Identification, 95 J. Crim. L. & Criminology 985, 1073 (2005) (finding examiner false positive rate amongst several studies to be 5.5%).
  • 52
President’s Council of Advisors on Science and Technology, supra note 18, at 56-57.
  • 53
    Id. at 57.
  • 54
    Id. at 61.
  • 55
    Id. at 56.
  • 56
    Id. at 85.
  • 57
    Id. at 67-123.
  • 58
    Daubert v. Merrell Dow Pharmaceuticals Inc., 509 U.S. 579, 588-89 (1993) (comparing the “liberal thrust” of FRE 702 to Frye’s “rigid” and austere General Acceptance Standard (citing Beech Aircraft Corp. v. Rainey, 488 U.S. 153, 169 (1988))).
  • 59
Perry v. New Hampshire, 565 U.S. 228, 232 (2012).
  • 60
    Id. at 234-35.
  • 61
    Id. at 235.
  • 62
    Id. at 237.
  • 63
Id. at 237. The Court’s “ordered liberty” and “fundamental conceptions of justice” language echoes the substantive due process analysis of Washington v. Glucksberg, 521 U.S. 702 (1997), making apparent that both instances under Perry qualify as violations of due process.
  • 64
Torres v. Roden, No. 15-13598-LTS, 2017 U.S. Dist. LEXIS 186817, at *15-16 (D. Mass. Nov. 13, 2017).
  • 65
Id. at *16 (citing Burt v. Titlow, 571 U.S. 12, 18 (2013)).
  • 66
Id. at *17 (citing Coningford v. Rhode Island, 640 F.3d 478, 485 (1st Cir. 2011)).
  • 67
Lee v. Glunt, 667 F.3d 397, 400 (3d Cir. 2012).
  • 68
    Id. at 402.
  • 69
Id. at 403. Note that this is essentially the weighing analysis under FRE 403. See Perry v. New Hampshire, 565 U.S. 228, 232 (2012); Fed. R. Evid. 403 (listing unfair prejudice, confusing the issues, misleading the jury, undue delay, wasting time, and needlessly presenting cumulative evidence).
  • 70
Id. at 407-08. The court rested its conclusion on the prejudice/probative balancing test combined with Lee’s claim of actual innocence. Id. at 400. Actual innocence is a habeas corpus requirement, not a requirement of the Federal Rules of Evidence governing admissibility, and is therefore not applicable to admissibility pre-conviction.
  • 71
    Perry, 565 U.S. at 237.
  • 72
    360 U.S. 264 (1959).
  • 73
    Id. at 266.
  • 74
The Fourteenth Amendment extends to the States the same due process protections that the Fifth Amendment imposes on the Federal Government. Therefore, the States must afford the same protections.
  • 75
According to the National Registry of Exonerations, there have so far been twenty-six exonerations based on “False or Misleading Forensic Evidence.” See Summary of Cases, The National Registry of Exonerations, https://www.law.umich.edu/special/exoneration/Pages/about.aspx (follow the “Browse Cases” tab, click “Summary View,” and enter “False or Misleading Forensic Evidence” in the search field).
