2023
DOI: 10.1111/1556-4029.15257
Commentary on: Monson KL, Smith ED, Peters EM. Accuracy of comparison decisions by forensic firearms examiners. J Forensic Sci. 2022;68(1):86–100. https://doi.org/10.1111/1556-4029.15152.

Cited by 2 publications (1 citation statement). References 21 publications (32 reference statements).
“…But because we were constrained by the choices of the authors of Duez et al. and the information they made publicly available (their sample selection, the demographic information they provided about participants, and their decision to release only photos of comparison items), we could not manipulate and control these variables. Such research, however, is critical given that existing validation studies show an, at best, imperfect correlation between training/experience and superior performance: none have successfully distinguished poor-performing, misidentification-prone examiners from their more accurate peers based on their source of training, years of practice, or the accreditation status of their employing laboratories [14,62,106]. Thus, while we encourage the field of firearms examination to explore the role of expertise, as other fields such as latent print comparison [44–47], document examination [50,107], and facial recognition [48,49] have attempted to do, our study does not close that gap in the existing literature. Beyond conducting studies designed to assess the role of expertise (if any) in the comparison of bullets and cartridge cases, our research suggests a serious need to (1) design studies going forward that include challenging, close non-match comparisons, including those with subclass characteristics, and (2) reevaluate the difficulty and complexity of samples from existing validation efforts.…”