1989
DOI: 10.1177/001316448904900407

Interjudge Agreement and the Maximum Value of Kappa

Abstract: The observed degree of agreement between judges is commonly summarized using Cohen's (1960) kappa. Previous research has related values of kappa to the marginal distributions of the agreement matrix. This manuscript provides an approach for calculating maximum values of kappa as a function of observed agreement proportions between judges. Solutions are provided separately for matrices of size 2 × 2, 3 × 3, 4 × 4, and k × k; plots are provided for the 2 × 2, 3 × 3, and 4 × 4 matrices.
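The abstract refers to kappa's maximum as a function of the observed marginals but does not reproduce the formulas here. Below is a minimal Python sketch, assuming the standard marginal-based formulation (the best attainable observed agreement is the sum of the smaller of each paired row/column marginal) rather than the paper's size-specific closed-form solutions; the function name and the example matrix are illustrative.

```python
# Illustrative sketch (not the paper's derivation): Cohen's kappa and the
# maximum kappa attainable given the observed marginal distributions.
# Assumes P_o(max) = sum_i min(row marginal_i, column marginal_i).
import numpy as np

def kappa_and_kappa_max(counts):
    """counts: k x k judge-by-judge classification count matrix."""
    p = np.asarray(counts, dtype=float)
    p /= p.sum()                          # convert counts to proportions
    row, col = p.sum(axis=1), p.sum(axis=0)
    p_o = np.trace(p)                     # observed agreement
    p_e = np.dot(row, col)                # chance-expected agreement
    p_o_max = np.minimum(row, col).sum()  # best diagonal possible with these marginals
    kappa = (p_o - p_e) / (1.0 - p_e)
    kappa_max = (p_o_max - p_e) / (1.0 - p_e)
    return kappa, kappa_max

# Example: a hypothetical 3 x 3 agreement matrix
table = [[20, 5, 5],
         [10, 15, 5],
         [5, 5, 30]]
k, k_max = kappa_and_kappa_max(table)
print(f"kappa = {k:.2f}, kappa_max = {k_max:.2f}")
```

Because kappa_max depends only on the marginals, it gives a ceiling against which an observed kappa can be judged when the two judges' category usage differs.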

Cited by 63 publications (32 citation statements)
References 5 publications

“…The corresponding maximum possible values for the median kappa values (Umesh et al, 1989) were .64 and .62. As mentioned, kappa is a measure of pairwise agreement.…”
Section: Results
Mentioning confidence: 95%
“…For clinical purposes, κ > 0.6 is "substantial" and "clinically useful," whereas κ > 0.8 is "almost perfect."14 κ is the most widely used agreement statistic in biomedical studies15 and has current applications in pediatrics and pediatric cardiology.16-18 κ is not interpretable if consensus prevalence (CP; the proportion of cases for which the examiners agree that the target finding is present) = 0.0 or 1.00, or if the 2 examiners' diagnostic biases are significantly different from one another by McNemar's test.…”
Section: Measures and Statistical Analysis
Mentioning confidence: 99%
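The excerpt defines consensus prevalence (CP) and invokes McNemar's test for differing diagnostic biases. A small sketch of both computations follows, assuming the plain chi-square form of McNemar's test (no continuity correction) and a made-up 2 × 2 examiner table; none of these numbers come from the cited study.

```python
# Hypothetical 2 x 2 cross-classification of two examiners (present/absent);
# the counts below are invented for illustration only.
from scipy.stats import chi2

#               examiner B: present   absent
table = [[40, 6],    # examiner A: present
         [14, 40]]   # examiner A: absent

a, b = table[0]
c, d = table[1]
n = a + b + c + d

# Consensus prevalence: proportion of cases both examiners call "present".
cp = a / n

# McNemar's chi-square (without continuity correction) compares the two
# discordant cells b and c, i.e., the examiners' differing diagnostic biases.
mcnemar_stat = (b - c) ** 2 / (b + c)
p_value = chi2.sf(mcnemar_stat, df=1)

print(f"CP = {cp:.2f}, McNemar chi2 = {mcnemar_stat:.2f}, p = {p_value:.3f}")
```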
“…The maximum values of kappas for inter-judge agreement were also estimated (Umesh, et al, 1989). Peer-reported and teacher-reported evaluations showed the highest concordance levels for both ages: 0.30 (n = 819; kappa max = 0.54) for 11-year-olds, and 0.31 (n = 815; kappa max = 0.58) for 12-year-olds.…”
Section: Concordance of Evaluations
Mentioning confidence: 98%
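For reference, the quoted kappa and kappa-max pair can be compared directly; the ratio below is one common way to read agreement relative to what the marginals allow, not a result reported on this page.

```python
# Quoted values for the 11-year-olds: kappa = 0.30, kappa_max = 0.54.
kappa, kappa_max = 0.30, 0.54
print(f"kappa / kappa_max = {kappa / kappa_max:.2f}")  # ~0.56 of the attainable maximum
```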