2007
DOI: 10.1016/j.jspi.2006.03.002

Confidence intervals on intraclass correlation coefficients in a balanced two-factor random design

Cited by 17 publications (9 citation statements)
References 20 publications
“…The resulting intraclass and interclass correlation coefficients (ICCs) are equivalent to a weighted κ coefficient. Two-sided 95% CIs were obtained using the methods discussed by Gilder et al. [22] Strength of agreement was evaluated according to the criteria of Landis and Koch, whereby ICCs of <0.00, 0.00 to 0.20, 0.21 to 0.40, 0.41 to 0.60, 0.61 to 0.80, and 0.81 to 1.00 indicate poor, slight, fair, moderate, substantial, and almost perfect agreement, respectively. [23] The sample size calculation for the agreement study was based on the analysis of variance model.…”
Section: Methods
confidence: 99%
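
For orientation only, the sketch below (Python) shows how an ICC of this kind is commonly estimated: it computes a two-way random-effects, absolute-agreement, single-rating ICC (ICC(2,1) in the Shrout–Fleiss naming) from a small made-up subjects-by-raters matrix and labels the result with the Landis and Koch cutoffs quoted above. It is a minimal illustration, not the cited authors' code; the 95% confidence intervals mentioned in the statement come from Gilder et al.'s formulas, which are not reproduced here.

import numpy as np

# Made-up ratings for illustration: rows = subjects, columns = raters.
ratings = np.array([
    [9, 2, 5, 8],
    [6, 1, 3, 2],
    [8, 4, 6, 8],
    [7, 1, 2, 6],
    [10, 5, 6, 9],
    [6, 2, 4, 7],
], dtype=float)

n, k = ratings.shape                     # n subjects, k raters
grand = ratings.mean()
row_means = ratings.mean(axis=1)         # per-subject means
col_means = ratings.mean(axis=0)         # per-rater means

# Mean squares for the balanced two-way layout (one rating per cell).
MSR = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects
MSC = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters
SSE = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
MSE = SSE / ((n - 1) * (k - 1))                         # residual

# Two-way random-effects, absolute-agreement, single-rating ICC.
icc = (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

def landis_koch(value):
    # Verbal label per the Landis and Koch (1977) cutoffs quoted above.
    if value < 0.00:
        return "poor"
    if value <= 0.20:
        return "slight"
    if value <= 0.40:
        return "fair"
    if value <= 0.60:
        return "moderate"
    if value <= 0.80:
        return "substantial"
    return "almost perfect"

print(f"ICC = {icc:.3f} ({landis_koch(icc)} agreement)")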
“…To determine the inter-rater and intra-rater reliability for both phases combined, the method of Gilder et al. [33] was used. For the interpretation of the κ statistic, the cutoff values used in this study were: κ = 0.21–0.40, fair agreement; κ = 0.41–0.60, moderate agreement; κ = 0.61–0.80, substantial agreement; κ = 0.81–1.00, almost perfect agreement. [34]…”
Section: Methods
confidence: 99%
“…Inter-rater and intra-rater agreement with 95% confidence intervals were estimated simultaneously for the aforementioned volume, location, and shape variables using Gilder's method [19]. Gilder's method (also known as the modified large-sample approach) was used instead of other popular methods (e.g., Dice coefficients) because it gives more accurate coverage for both inter- and intra-rater reliability.…”
Section: Discussion
confidence: 99%
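
For context on the paper these statements cite, a standard parameterization of the balanced two-factor random design named in its title is sketched below (subjects crossed with raters, one rating per cell); the exact interclass/intraclass definitions and the modified large-sample interval formulas actually used are those given by Gilder et al. and are not reproduced here.

\[
  y_{ij} = \mu + s_i + r_j + e_{ij}, \qquad
  s_i \sim N(0,\sigma_s^2),\quad
  r_j \sim N(0,\sigma_r^2),\quad
  e_{ij} \sim N(0,\sigma_e^2),
\]
with all effects mutually independent. The intraclass correlation between two ratings of the same subject by different raters is then
\[
  \rho = \frac{\sigma_s^2}{\sigma_s^2 + \sigma_r^2 + \sigma_e^2},
\]
and it is for ratios of variance components of this form that the paper derives confidence intervals.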