2015
DOI: 10.1016/j.jse.2015.02.011
Intraobserver and interobserver agreement of Goutallier classification applied to magnetic resonance images

Cited by 55 publications (31 citation statements)
References 43 publications
“…A similar method using MR arthrography was reported in another study using an oblique sagittal T1-weighted image selected at the level of the coracoid base [27]. More recently, Schiefer et al [29] reported high intraobserver and interobserver agreement when investigators had full access to the examinations without specifying a definitive plane or image. Fuchs et al [7] reported interobserver agreement between two raters, a musculoskeletal radiologist and a radiology fellow.…”
Section: Classification
confidence: 84%
“…Another group [29] reported good intraobserver and interobserver agreement among six raters using the MRI adaptation of the Goutallier et al scale, including three shoulder surgeons and three musculoskeletal radiologists. The surgeons showed higher intraobserver agreement and interobserver agreement compared with radiologists.…”
Section: Magnetic Resonance Imaging
confidence: 96%
“…However, Slabaugh et al 35 reported kappa values of 0.43 and 0.56 for inter- and intra-observer agreement, respectively, in a group of 30 experienced shoulder surgeons. Schiefer et al 36 demonstrated that inter-observer agreement was higher among three orthopaedic surgeons (kappa scores 0.72 to 0.82) than among three radiologists (kappa scores 0.61 to 0.66). Our inclusion of the attending orthopaedic surgeon as the primary observer for statistical calculations on Goutallier grades was based on maximizing clinical relevance and also represents the most reproducible observations based on the previous literature.…”
Section: Discussion
confidence: 99%
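The kappa scores quoted above are Cohen's kappa: chance-corrected agreement between two raters, computed as (observed agreement − chance agreement) / (1 − chance agreement). A minimal sketch of that calculation, using hypothetical Goutallier grades (0–4) for ten shoulders rather than any data from the studies cited:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters grading the same cases.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e the agreement expected by chance
    from each rater's marginal grade frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of cases graded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of marginal frequencies per grade.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[g] / n) * (freq_b[g] / n)
              for g in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical grades for illustration only (not study data):
surgeon     = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
radiologist = [0, 0, 1, 2, 2, 2, 3, 4, 4, 4]
print(cohens_kappa(surgeon, radiologist))  # → 0.75
```

By the common Landis–Koch convention, a kappa of 0.75 would be read as substantial agreement, which puts the 0.61–0.82 range reported above in context.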
“…4). This classification was originally described for CT diagnostics, but it also permits a valid assessment on MRI [6,7]. To evaluate muscle quality, the parasagittal T1 sequences must therefore be extended far enough medially to depict the muscle bellies (▶ Fig.…”
Section: Rotator Cuff and Outlet Impingement
confidence: unclassified