2002
DOI: 10.1007/s00264-002-0369-x
Low agreement among 24 doctors using the Neer-classification; only moderate agreement on displacement, even between specialists

Abstract: Twenty-four orthopaedic surgeons classified 42 pairs of radiographs according to the Neer system for proximal humeral fractures. Mean kappa value for interobserver agreement was 0.27 (95% CI 0.26-0.28), with no clinically significant difference between orthopaedic residents (n=9), fellows (n=6) and specialists (n=9). Mean kappa for agreement on displacement versus nondisplacement was 0.41 (95% CI 0.39-0.43) overall, and 0.50 (95% CI 0.45-0.56) within the specialist group. The agreement found in our study is uns…
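The kappa statistic reported above is chance-corrected agreement: observed agreement minus the agreement two raters would reach by labelling at random according to their own marginal frequencies. As an illustration only (the data and rater labels below are invented, not from the paper), Cohen's kappa for a single pair of observers can be computed as:

```python
# Illustrative sketch, not the paper's analysis: Cohen's kappa for two
# observers classifying the same set of fractures.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical observers labelling 10 fractures: displaced (D) or not (N).
obs1 = list("DDNNDDNNDD")
obs2 = list("DDNNDNNNDD")
print(round(cohens_kappa(obs1, obs2), 2))  # prints 0.8
```

A study-wide "mean kappa" such as the 0.27 reported here is typically the average of this pairwise statistic over all observer pairs; multi-rater alternatives such as Fleiss' kappa exist but are not assumed here.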

Cited by 30 publications (3 citation statements)
References 14 publications
“…Moreover, only patients with 2-part PHFs involving the surgical neck were included in the trial. Previously, there has been disagreement over recognizing different fracture categories [29]. In our previous publication, however, we found substantial intra- and interobserver agreement in re-categorized Neer classification [30].…”
Section: Discussion
confidence: 60%
“…Despite the use of CT imaging in some studies, it has not been shown to improve the results uniformly [2, 1113, 19]. The kappa values obtained from these studies improve only slightly or not at all, when the classification systems were simplified or reduced to two options, displaced or undisplaced [1, 14, 20]. However, it has been suggested in one study that the use of CT-based stereo visualisation may substantially improve classification reliability, which has also been seen in tibial plateau fractures [21, 22].…”
Section: Discussion
confidence: 99%
“…On the other hand, a few studies have supported the role of experience. Brorson et al showed significantly higher confidence intervals in specialists when compared to residents and fellows, and although not explicitly stated as a study aim, Bernstein et al showed higher absolute kappa values among attending surgeons when compared to residents [ 4 , 7 , 8 ]. As a response to the challenge of the 4-type classification system, Neer commented that inter-observer variability is likely the combination of "suboptimal quality of current imaging and inexperienced interpreters [ 22 ]."…”
Section: Discussion
confidence: 99%