2014
DOI: 10.1016/j.rboe.2014.03.016
Do computed tomography and its 3D reconstruction increase the reproducibility of classifications of fractures of the proximal extremity of the humerus?

Abstract: Objective: to determine whether 3D reconstruction images from computed tomography (CT) increase the inter- and intraobserver agreement of the Neer and Arbeitsgemeinschaft für Osteosynthesefragen (AO) classification systems. Methods: radiographic images and tomographic images with 3D reconstruction were obtained in three shoulder positions and were analyzed on two occasions by four independent observers. Results: the radiographic evaluation demonstrated that using CT improved the inter- and intraobserver agreement of the…

Cited by 8 publications (17 citation statements) · References 20 publications
“…Tomographies showed higher agreement than radiographs when classified according to AO: kappa 0.535 for A1, 0.479 for C2 and 0.311 for C3, with a general mean of 0.277, as shown in Tables 3 and 4. The values found were similar to the results of Matsushigue et al 9 , in which a kappa of 0.25 was obtained for radiographs and 0.36 for tomographies. The values were higher than in the analysis by Majed et al 10 , which showed weak inter-observer reliability, with a kappa of 0.11.…”
Section: Discussion (supporting)
confidence: 91%
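
The agreement figures quoted in these excerpts are Cohen's kappa statistics, which correct raw percent agreement between observers for the agreement expected by chance. As a minimal sketch of how such a value is computed for two observers, using scikit-learn's cohen_kappa_score on hypothetical AO labels (the rater arrays below are illustrative, not the study's data):

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical AO-type classifications assigned by two observers to
    # the same ten fractures (illustrative labels, not the study's data).
    rater_1 = ["A1", "A1", "C2", "C3", "A1", "C2", "C2", "C3", "A1", "C2"]
    rater_2 = ["A1", "C2", "C2", "C3", "A1", "C2", "C3", "C3", "A1", "A1"]

    # kappa = (observed agreement - chance agreement) / (1 - chance agreement)
    kappa = cohen_kappa_score(rater_1, rater_2)
    print(f"inter-observer kappa: {kappa:.3f}")

A kappa of 1.0 indicates perfect agreement and 0.0 indicates agreement no better than chance, which is why the 0.11 reported by Majed et al is read as weak reliability.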
“…The AO is one of the most complete classification systems; however, its intra- and inter-observer reproducibility is reduced. 1,9,11 In our study, we evaluated 54 patients with proximal humerus fracture, whose initial evaluation was performed by radiography and tomography with 3D reconstruction. In the inter-observer evaluation of radiographs according to the Neer classification, kappa agreement was 0.083 (fractures classified into 3 parts), 0.204 (4 parts) and 0.275 (2 parts), with a general kappa of 0.178 (p < 0.001).…”
Section: Discussion (mentioning)
confidence: 99%
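
The citing papers grade these kappa values on the Landis and Koch scale (slight, fair, moderate, substantial, almost perfect). A small sketch of that interpretation step, with the standard band boundaries hard-coded (the function name is illustrative):

    def landis_koch_band(kappa: float) -> str:
        """Map a kappa value to its Landis and Koch agreement band."""
        if kappa < 0.00:
            return "poor"
        if kappa <= 0.20:
            return "slight"
        if kappa <= 0.40:
            return "fair"
        if kappa <= 0.60:
            return "moderate"
        if kappa <= 0.80:
            return "substantial"
        return "almost perfect"

    # The inter-observer Neer values quoted above for radiographs
    # (0.083, 0.204, 0.275) all land in the slight-to-fair range.
    for k in (0.083, 0.204, 0.275):
        print(k, landis_koch_band(k))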
“…26 , 31 Owing to the round structure, it is challenging to appreciate angulation or displacement of the humeral head. 27 There is likely more disagreement about complex fractures, 4 , 5 , 23 , 34 where it becomes increasingly difficult to assess displacement. 4 However, the amount of displacement helps determine management and is integral to PHF classification systems.…”
Section: Discussion (mentioning)
confidence: 99%