Wiley StatsRef: Statistics Reference Online 2018
DOI: 10.1002/9781118445112.stat01671.pub2
A Review on Assessing Agreement

Abstract: Measurements serve as the basis for evaluation in almost all scientific disciplines, especially in physical sciences, medical studies, and health care. Issues related to reliable and accurate measurement have evolved over many decades. Requiring a measurement to be identical to the truth is sometimes impractical or impossible either because (i) the truth is simply not available or is measured with some error or (ii) some tolerable error is acceptable. Concepts of agreement, including reproducibility or reliab…

Cited by 4 publications (5 citation statements). References 75 publications.
“…It is therefore not a "pure agreement index" [41]. The CCC has the disadvantage of being heavily dependent on the between-subject variability (and in our case also on the between-activity variability) and would therefore attain a high value for a population with substantial heterogeneity between subjects or activities even though the agreement within subjects might be low [2,11,12]. Similarly, if both the between-subject and between-activity variances are very low, then the CCC is unlikely to attain a high value even if agreement within devices is reasonable.…”
Section: Discussion
Confidence: 99%
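The sensitivity described above can be demonstrated directly. Below is a minimal sketch (with made-up data, not taken from any cited study) that computes Lin's concordance correlation coefficient on two synthetic samples with identical within-pair errors: one with a narrow between-subject range and one with a wide range. Only the between-subject spread changes, yet the CCC rises sharply.

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, ddof=0)[0, 1]  # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Identical within-pair measurement errors in both scenarios.
errors = np.array([1.0, -1.0, 1.0, -1.0, 1.0])

narrow = np.array([10.0, 11.0, 12.0, 13.0, 14.0])  # homogeneous subjects
wide = np.array([0.0, 10.0, 20.0, 30.0, 40.0])     # heterogeneous subjects

ccc_narrow = ccc(narrow, narrow + errors)  # = 0.80
ccc_wide = ccc(wide, wide + errors)        # ≈ 0.998
```

With the same absolute disagreement between the paired readings, the heterogeneous population yields a CCC near 1, illustrating why a high CCC alone does not certify good within-subject agreement.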
“…Moreover, as for the intraclass correlation coefficient (ICC), it is not related to the actual scale of measurement or to the size of error which might be clinically allowable, which makes interpretation difficult [41]. As outlined in other papers [11,12,40], it is very easy to obtain an artificially high value of CCC, and manipulation of the dataset can change the estimate of the CCC drastically. Nevertheless, the variance components are automatically generated in R, which helps one to interpret the overall summary indices.…”
Section: Discussion
Confidence: 99%
“…Moreover, interparticipant variability in HR changes is echoed in a previous study that used Fitbit-measured HR as an indicator of stress [24]. As such, to account for this expected variability, the CIA was computed, as it is less dependent on the between-subjects variability compared to the CCC [47,58]. However, the repeatability coefficient of Bland-Altman was found to be unacceptably high and warrants caution when interpreting the CIA.…”
Section: Discussion
Confidence: 99%
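For context on the Bland-Altman analysis mentioned above: it rests on the paired differences between two measurement methods, with the bias estimated as their mean and the 95% limits of agreement as bias ± 1.96 SD of the differences. A minimal sketch with illustrative heart-rate values (the numbers are invented, not from the cited study):

```python
import numpy as np

# Illustrative paired heart-rate readings (device vs. reference); not real data.
device = np.array([62.0, 70.0, 88.0, 91.0, 103.0])
reference = np.array([60.0, 72.0, 85.0, 90.0, 100.0])

diffs = device - reference
bias = diffs.mean()        # systematic difference between methods
sd = diffs.std(ddof=1)     # sample SD of the differences
loa_low = bias - 1.96 * sd   # lower 95% limit of agreement
loa_high = bias + 1.96 * sd  # upper 95% limit of agreement
```

If the limits of agreement (or the related repeatability coefficient) are wide relative to clinically tolerable error, the two methods disagree in practice regardless of how favorable a correlation-based index such as the CCC or CIA looks.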