2002
DOI: 10.1302/0301-620x.84b1.11660

Observer reliability in the arthroscopic classification of osteoarthritis of the knee

Abstract: We studied 19 videotaped knee arthroscopies in 19 patients with mild to moderate osteoarthritis (OA) of the knee in order to compare the intraobserver and interobserver reliability and the patterns of disagreement between four orthopaedic surgeons. The classifications of OA of Collins, Outerbridge and the French Society of Arthroscopy were used. Intraobserver and interobserver agreements using kappa measures were 0.42 to 0.66 and 0.43 to 0.49, respectively. Only 6% to 8% of paired intraobserver classifications…
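The kappa values quoted in the abstract are chance-corrected agreement statistics. As a point of reference (a standard definition, not reproduced from the paper itself), Cohen's kappa for two observers is

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement and p_e is the agreement expected by chance. Values of roughly 0.41 to 0.60 are conventionally interpreted as moderate agreement, which is consistent with the interobserver range of 0.43 to 0.49 reported above.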

Cited by 103 publications (55 citation statements); References 13 publications
“…2,4,7,13,29 Marx et al 29 demonstrated good inter-observer reliability with arthroscopic classification of articular cartilage lesions. Six experienced surgeons classified thirty-one different lesions based on video analysis.…”
Section: Discussion (mentioning)
confidence: 99%
“…Brismar et al 4 studied both intra-observer and inter-observer reliability of arthroscopic classification of mild to moderate osteoarthritis using video assessment. Four different surgeons reviewed 19 different videotaped knee arthroscopies twice, classifying the observed arthritis using the Outerbridge, Collins, and French Society of Arthroscopy measures.…”
Section: Discussion (mentioning)
confidence: 99%
“…A poor inter-observer agreement at arthroscopy could also impair the validity of our study, where several orthopedic surgeons were involved. Studies about inter-observer agreement at arthroscopy demonstrate fair and moderate inter-observer agreement, particularly for the patellofemoral cartilage [37,38], but also sufficient reproducibility [39]. In this context, it has to be mentioned that a comparable intra-operative cartilage assessment is an important objective within most orthopedic departments, especially regarding patients with osteoarthritis.…”
Section: Discussion (mentioning)
confidence: 99%
“…Furthermore, the use of arthroscopic grading as a reference standard should be regarded with caution. In the literature, inter-observer agreement at arthroscopy demonstrates sufficient reproducibility [33], but poor results for cartilage grading [34]. On the other hand, a study by Bachmann et al [35] yielded an exact agreement between arthroscopic and histopathologic grading in 287 of 300 cases.…”
Section: Discussion (mentioning)
confidence: 99%