1979
DOI: 10.1177/00220345790580021101

Inter-examiner Reliability in Caries Trials

Abstract: A statistical model is given for representing the several components of variability present in measurements (e.g., DMFS scores) given by examiners to patients. Methods for making inferences about the intraclass correlation coefficient of reliability are presented and illustrated on a real set of data. The proper analysis of data from a reliability study is shown to depend on the planned design and analysis of the clinical or field trial to be conducted following the reliability trial.
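The intraclass correlation coefficient named in the abstract can be illustrated with the one-way random-effects form, ICC(1,1) = (MS_between − MS_within) / (MS_between + (k − 1) · MS_within), where k examiners each score every patient. A minimal sketch with invented DMFS scores (the paper's actual model, design, and data are not reproduced here):

```python
# Hypothetical DMFS scores: rows = patients, columns = examiners.
# A sketch of the one-way intraclass correlation ICC(1,1); the paper's
# model may include additional components (e.g., examiner effects).
scores = [
    [10, 12],
    [4, 5],
    [20, 18],
    [7, 7],
    [15, 13],
]

n = len(scores)      # number of patients
k = len(scores[0])   # examiners per patient

grand_mean = sum(sum(row) for row in scores) / (n * k)
row_means = [sum(row) / k for row in scores]

# Between-patient and within-patient mean squares (one-way ANOVA)
ms_between = k * sum((m - grand_mean) ** 2 for m in row_means) / (n - 1)
ms_within = sum((x - m) ** 2
                for row, m in zip(scores, row_means)
                for x in row) / (n * (k - 1))

icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(round(icc, 3))  # close to 1 when examiners agree closely
```

High values arise when within-patient (between-examiner) variation is small relative to between-patient variation, which is exactly the sense of reliability the abstract describes.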

Cited by 77 publications (36 citation statements)
References 9 publications
“…Kappa values range from -1 to +1; values between -1 and 0 indicate that the observed agreement was lower than what was randomly expected, 0 indicates the random agreement level, and +1 indicates total agreement. 17 …”
Section: Methods
confidence: 98%
“…Data were collected on spreadsheets and the kappa (κ) coefficient was used to assess agreement. κ was applied using the method proposed by Fleiss et al., 17 and the random expected agreement calculation described by Scott 18 and Cohen 19 was also used. The latter two methods enable calculation of agreement for multiple (more than two) observers with regard to evaluations of nominal variables.…”
Section: Methods
confidence: 99%
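The multi-observer method attributed to Fleiss et al. in the excerpt above is commonly known as Fleiss' kappa: per-subject agreement among raters is averaged and corrected by the agreement expected from the overall category proportions. A minimal sketch with invented ratings (three hypothetical observers classifying five subjects into two categories; none of these numbers come from the cited study):

```python
# ratings[i][j]: number of raters (out of m) assigning subject i to category j.
# Illustrative counts only.
ratings = [
    [3, 0],
    [2, 1],
    [0, 3],
    [3, 0],
    [1, 2],
]
n = len(ratings)
m = sum(ratings[0])  # raters per subject

# Per-subject agreement: proportion of rater pairs that agree
p_i = [(sum(c * c for c in row) - m) / (m * (m - 1)) for row in ratings]
p_bar = sum(p_i) / n

# Chance agreement from the overall category proportions
n_categories = len(ratings[0])
p_j = [sum(row[j] for row in ratings) / (n * m) for j in range(n_categories)]
p_e = sum(p * p for p in p_j)

kappa = (p_bar - p_e) / (1 - p_e)
print(round(kappa, 3))
```

Every subject must receive the same number of ratings (m) for this formula to apply; unbalanced designs need a different estimator.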
“…Previous studies 3,6,7,24 have focused on the interexaminer agreement for evaluation of caries, signs and symptoms of periodontal disease and radiographic examinations. All these studies have employed relatively objective data, such as pocket probing depth, bone loss and presence or absence of caries, whereas standardization of CR manipulation is based on less objective data.…”
Section: Discussion
confidence: 99%
“…The kappa values can range from -1 to +1; values between -1 and 0 indicate that the concordance observed was less than what would be expected by chance; 0 indicates a concordance level that would be expected by chance; and +1 indicates total concordance. In general, kappa values lower than 0.5 are considered to be unsatisfactory; values between 0.5 and 0.75 are considered to be satisfactory and adequate; and values greater than 0.75 are considered to be excellent (25-27). The parameters for normality, which were used in conformity with the literature, are described in Table 1.…”
Section: Methods
confidence: 99%
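For two examiners, the chance-corrected agreement described in these excerpts is Cohen's kappa: observed agreement minus the agreement expected from the marginal totals, divided by the maximum possible improvement over chance. A minimal sketch using an invented 2x2 table (counts are illustrative only, not from any cited study):

```python
# Hypothetical agreement table for two examiners scoring a tooth surface.
# table[a][b]: count of surfaces rated category a by examiner A
# and category b by examiner B (0 = carious, 1 = sound).
table = [
    [40, 5],
    [10, 45],
]

total = sum(sum(row) for row in table)
p_observed = sum(table[i][i] for i in range(2)) / total

# Chance agreement from the marginal proportions of each examiner
row_marg = [sum(row) / total for row in table]
col_marg = [sum(table[i][j] for i in range(2)) / total for j in range(2)]
p_chance = sum(r * c for r, c in zip(row_marg, col_marg))

kappa = (p_observed - p_chance) / (1 - p_chance)
print(round(kappa, 3))  # prints 0.7
```

By the thresholds quoted in the excerpt above, this illustrative value of 0.7 would fall in the satisfactory band (0.5 to 0.75).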