Error Rate Resources
Hamby, Brundage, Petraco and Thorpe, "A Worldwide Study of Bullets Fired from 10 Consecutively Rifled 9mm RUGER Pistol Barrels--Analysis of Examiner Error Rate", J Forensic Sci, 2018.
Song, Vorburger, Chu, Yen, Soons, Ott and Zhang, "Estimating Error Rates for Firearm Evidence Identifications in Forensic Science", J Forensic Sci, 2018.
Smith, Smith and Snipes, "A Validation Study of Bullet and Cartridge Case Comparisons Using Samples Representative of Actual Casework", J Forensic Sci, 2016.
Kaye, "PCAST and the Ames Bullet Cartridge Study: Will the Real Error Rates Please Stand Up?", Forensic Science, Statistics & the Law, for-sci-law.blogspot.com, Nov 1, 2016.
Baldwin, Morris and Zamzow, "A Study of False-Positive and False-Negative Error Rates in Cartridge Case Comparisons", Presented at the 45th Association of Firearm & Tool Mark Examiners Training Seminar, Seattle, WA, May 15, 2014.
This empirical study of fired cartridge cases was designed to measure individual examiners' false identifications and false eliminations when comparing an unknown specimen against a collection of three known fired cartridge cases. The comparison results were not subjected to the examiners' respective laboratory QA verification or peer-review processes. Two hundred eighteen (218) firearm examiners, all of whom were AFTE members or worked at accredited forensic laboratories, responded to this study, with the following results reported at a 95% confidence interval:
- False positives = 1.01% (Note: 20 of the 22 false identifications were made by five examiners)
- False negatives = 0.367%
- Maximum likelihood estimate = 0.939%
The presenter, Dr. Baldwin, noted that true laboratory error rates are likely even lower, as the quality-assurance measures in place for casework were not employed during this study.
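The figures above are binomial proportion estimates: the maximum likelihood estimate of an error rate is simply the observed error count divided by the total number of comparisons, and the confidence interval quantifies the uncertainty in that estimate. A minimal sketch of the arithmetic, using the study's 22 false identifications but a hypothetical comparison total (the actual denominator appears in the study itself), and a normal-approximation (Wald) interval rather than whatever interval method the authors used:

```python
import math

def binomial_error_rate(errors, trials, z=1.96):
    """Return (MLE, CI lower, CI upper) for an error rate.

    The MLE of a binomial proportion is errors/trials; the interval
    is the Wald normal approximation at confidence level z (1.96 ~ 95%).
    """
    p = errors / trials                      # maximum likelihood estimate
    half = z * math.sqrt(p * (1 - p) / trials)
    return p, max(0.0, p - half), p + half

# Hypothetical counts for illustration only (not the study's actual totals):
# 22 false identifications observed across 2000 independent comparisons.
p, lo, hi = binomial_error_rate(22, 2000)
print(f"MLE = {p:.3%}, 95% CI = ({lo:.3%}, {hi:.3%})")
```

For rates this small, exact (Clopper-Pearson) intervals are often preferred over the Wald approximation, since the normal approximation degrades when the error count is low.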
Stroman, "Empirically Determined Frequency of Error in Cartridge Case Examinations Using a Declared Double-Blind Format", AFTE Journal Spring 2014; 46(2):157-175.
Murphy, D., "CTS Error Rates, 1992-2005 Firearms/Toolmarks", Presented at the 41st Association of Firearm and Tool Mark Examiners (AFTE) Training Seminar, Henderson, NV, May 5, 2010.
Murdock and Grzybowski, "Firearm/Toolmark Identification: Meeting the Daubert Challenge", AFTE Journal Winter 1998; 30(1):3-14.
Biasotti and Murdock, "Firearm and Toolmark Identification", Chapter 23 in MODERN SCIENTIFIC EVIDENCE: THE LAW AND SCIENCE OF EXPERT TESTIMONY, David L. Faigman, David H. Kaye, Michael J. Saks and Joseph Sanders (eds.).
Peterson J.L., Markham P.N., "Crime laboratory proficiency testing results, 1978-1991, I: Identification and classification of physical evidence", Journal of Forensic Sciences. 1995 Nov;40(6):994-1008.
Abstract: The proficiency testing of crime laboratories began in the mid-1970s and presently assumes an important role in quality assurance programs within most forensic laboratories. This article reviews the origins and early results of this testing program and also examines the progress of proficiency testing in allied scientific fields. Beginning in 1978, a fee-based crime laboratory proficiency testing program was launched and has grown to its present level involving almost 400 laboratories worldwide. This is the first of two articles that review the objectives, limitations and results of this testing from 1978 through 1991. Part I reviews the success of laboratories in the identification and classification of common evidence types: controlled substances, flammables, explosives, fibers, bloodstains, and hairs. Laboratories enjoy a high degree of success in identifying drugs and classifying (typing) bloodstains. They are moderately successful in identifying flammables, explosives, and fibers. Animal hair identification and human hair body location results are troublesome. The second paper will review the proficiency of crime laboratories in determining if two or more evidentiary samples shared a common origin.
Peterson J.L., Markham P.N., "Crime laboratory proficiency testing results, 1978-1991, II: Resolving questions of common origin", Journal of Forensic Sciences. 1995 Nov;40(6):1009-1029.
Abstract: A preceding article has examined the origins of crime laboratory proficiency testing and the performance of laboratories in the identification and classification of common types of physical evidence. Part II reviews laboratory proficiency in determining if two or more evidence samples shared a common source. Parts I and II together review the results of 175 separate tests issued to crime laboratories over the period 1978 to 1991. Laboratories perform best in determining the origin of finger and palm prints, metals, firearms (bullets and cartridge cases), and footwear. Laboratories have moderate success in determining the source of bloodstains, questioned documents, toolmarks, and hair. A final category is of greater concern and includes those evidence categories where 10% or more of results disagree with manufacturers regarding the source of samples. This latter group includes paint, glass, fibers, and body fluid mixtures. The article concludes with a comparison of current findings with earlier LEAA study results, and a discussion of judicial and policy implications.