
Scientific Validity of Feature Comparison

Updated: Aug 3, 2022

With respect to error rates in evidentiary breath testing, please see the blog entry "Error Rates and Drift in Precision".


In September 2016, the President's Council of Advisors on Science and Technology delivered its report, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods. We should pay close attention to this Report in Canada when we consider, in Court, the scientific reliability of DNA analysis, bitemark analysis, latent fingerprint analysis, firearms analysis, footwear analysis, and hair analysis.

In their covering letter to President Obama, the authors wrote the following:

"PCAST concluded that there are two important gaps: (1) the need for clarity about the scientific standards for the validity and reliability of forensic methods and (2) the need to evaluate specific forensic methods to determine whether they have been scientifically established to be valid and reliable. Our study aimed to help close these gaps for a number of forensic “feature-comparison”methods—specifically, methods for comparing DNA samples, bitemarks, latent fingerprints, firearm marks, footwear, and hair."

The Report defines feature-comparison methods as follows (page 1):

"“feature-comparison” methods—that is, methods that attempt to determine whether an evidentiary sample (e.g., from a crime scene) is or is not associated with a potential “source” sample (e.g., from a suspect), based on the presence of similar patterns, impressions,

or other features in the sample and the source. Examples of such methods include the analysis of DNA, hair, latent fingerprints, firearms and spent ammunition, toolmarks and bitemarks, shoe prints and tire tracks, and handwriting."

Feature-comparison methods are used frequently in Canadian Courts. Police collect evidence and attempt to match DNA in sexual assault cases, latent fingerprints of juveniles in break and enter cases, firearms in weapons charges, and tire tracks in dangerous driving charges.

The President's Council found from its literature review that there were serious problems with feature-comparison methods (page 16):

"The questions that DNA analysis had raised about the scientific validity of traditional forensic disciplines and testimony based on them led, naturally, to increased efforts to test empirically the reliability of the methods that those disciplines employed."

"...reviews have found that expert witnesses have often overstated the probative value of their evidence, going far beyond what the relevant science can justify. Examiners have sometimes testified, for example, that their conclusions are “100 percent certain;” or have “zero,” “essentially zero,” or “negligible,” error rate. As many reviews—including the highly regarded 2009 National Research Council study—have noted, however, such statements are not scientifically defensible: all laboratory tests and feature-comparison analyses have non-zero error rates."

Forensic feature-comparison technologies, both current and newly developed, all require assessment of foundational validity. In other words, is this type of forensic matching reliable enough that we should be using it in a Youth Court or an adult criminal Court? The Council recommended independent and ongoing evaluations of the foundational validity of feature-comparison methods. Such evaluations need to be done by an agency or agencies with no stake in the outcome, independent of the police, and based on real empirical science: the assessment of error rates.

The literature reviews considered by the Council indicated serious problems, including:

  1. An error rate as high as 11% in hair comparison analysis.

  2. No appropriate empirical studies to support the foundational validity of footwear analysis in associating shoeprints with particular shoes.

  3. Bitemark analysis does not meet the scientific standards for foundational validity, and is far from meeting such standards.

  4. Error rates in firearms analysis of 1 in 19 or 1 in 46.

  5. False positive rates in fingerprint comparison of 1 in 18 and 1 in 306 (a sketch of how such figures can be estimated from study data follows this list).
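As we read the Report, figures like the fingerprint numbers above are upper confidence bounds on false positive rates observed in black-box studies, not simple observed proportions. What follows is a minimal sketch, in Python, of how such an upper bound can be computed; the study counts, the function name, and the use of a one-sided Clopper-Pearson bound are our own illustrative assumptions, not the Report's calculation.

# A minimal sketch (hypothetical counts, not the Report's own calculation) of
# how an upper confidence bound on a false positive rate can be computed from
# the results of a black-box study.
from scipy.stats import beta

def upper_bound_fpr(false_positives, comparisons, confidence=0.95):
    """One-sided Clopper-Pearson upper bound on the false positive rate."""
    if false_positives >= comparisons:
        return 1.0
    return beta.ppf(confidence, false_positives + 1, comparisons - false_positives)

# Hypothetical study: 6 false positives observed in 3,000 non-mated comparisons.
observed = 6 / 3000
bound = upper_bound_fpr(6, 3000)
print(f"Observed rate 1 in {1 / observed:.0f}; 95% upper bound about 1 in {1 / bound:.0f}")

The point of an upper bound is that a small study cannot rule out a higher true error rate; the bound reports the highest rate reasonably consistent with the study data.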

The Council recommended that error rates be presented when these matters go to Court and that testimony not overstate what has been empirically established. The Judge or Jury should hear about the error rates associated with the particular type of feature comparison.
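To show why a non-zero error rate matters to the trier of fact, here is a rough sketch, again in Python and with entirely hypothetical numbers, of how a reported false positive rate combines with the strength of the other evidence in a case (expressed as a prior probability) to limit how certain a reported "match" can be. This illustrates the arithmetic only; it is not a suggestion about how any Court should weigh evidence.

# A rough illustration (hypothetical numbers only) of why a non-zero false
# positive rate matters: Bayes' theorem applied to a reported "match".

def posterior_true_source(prior, sensitivity, false_positive_rate):
    """P(the suspect is the true source | the examiner reports a match)."""
    p_reported_match = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return (sensitivity * prior) / p_reported_match

# Hypothetical inputs: a 1-in-306 false positive rate, 90% sensitivity,
# and a 10% prior probability that the suspect is the source.
p = posterior_true_source(prior=0.10, sensitivity=0.90, false_positive_rate=1 / 306)
print(f"Posterior probability of a true match: {p:.1%}")  # roughly 97%

Roughly 97% is persuasive, but it is not the certainty implied by testimony claiming a "zero" or "negligible" error rate, and the figure drops quickly as the false positive rate rises or the prior probability falls.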

Lawyers need to obtain, through disclosure or their own literature review:

  1. Error rate studies, including those conducted on case-like samples.

  2. Studies that demonstrate scientific validity.

  3. Proficiency tests and test results for the examiner.

  4. Audits documenting errors or anomalies in the laboratory.

  5. Accreditation of the laboratory.

John Oliver did an entertaining review of this Report on "Last Week Tonight with John Oliver" (HBO, October 1, 2017).



If you are a member of the public, please don't attempt to use what you see or read at this site in Court. It is not evidence. The author is not a scientist. The author has a great deal of experience in cross-examining scientists about these issues, but the author is not a scientist. Hire a criminal lawyer in private practice in Ontario. Your lawyer can retain an expert. The author is a retired lawyer, not a lawyer in private practice. Read the statement of the purpose of this web site below.

© 2025 Allbiss Lawdata Ltd.

This site has been built by Allbiss Lawdata Ltd. All rights reserved. This is not a government web site.

For more information respecting this database or to report misuse contact: Allbiss Lawdata Ltd., Mississauga, Ontario, Canada, 905-273-3322. The author and the participants make no representation or warranty whatsoever as to the authenticity and reliability of the information contained herein. WARNING: All information contained herein is provided for the purpose of discussion and peer review only and should not be construed as formal legal advice. The authors disclaim any and all liability resulting from reliance upon such information. You are strongly encouraged to seek professional legal advice before relying upon any of the information contained herein. Legal advice should be sought directly from a properly retained lawyer or attorney.

WARNING: Please do not attempt to use any text, image, or video that you see on this site in Court. These comments, images, and videos are NOT EVIDENCE. The Courts will need to hear evidence from a properly qualified expert. The author is not a scientist. The author is not an expert. These pages exist to promote discussion among defence lawyers.

Intoxilyzer® is a registered trademark of CMI, Inc. The Intoxilyzer® 5000C is an "approved instrument" in Canada.

Breathalyzer® is a registered trademark of Draeger Safety, Inc., Breathalyzer Division. The owner of the trademark is Robert F. Borkenstein and Draeger Safety, Inc. has leased the exclusive rights of use from him. The Breathalyzer® 900 and Breathalyzer® 900A were "approved instruments" in Canada.

Alcotest® is a registered trademark of Draeger Safety, Inc. The Alcotest® 7410 GLC and 6810 are each an "approved screening device" in Canada.

Datamaster® is a registered trademark of National Patent Analytical Systems, Inc. The BAC Datamaster® C is an "approved instrument" in Canada.
