Bias and prevalence effects on kappa viewed in terms of sensitivity and specificity (2000)
DOI: 10.1016/s0895-4356(99)00174-2

Cited by 237 publications (184 citation statements). References 10 publications.
“…We therefore calculated the proportions of positive agreement as an alternative agreement measure, which showed similar results. Future studies assessing the reliability of ADE reporting are advised to recruit a more balanced group of patients experiencing and not experiencing ADEs [51]. Based on a combined approach, that is, looking at kappa values, alternative agreement measures, and additional analysis of ADEs at patient level, we conclude that our questionnaire was not sufficiently reliable at the ADE-specific level.…”
Section: Discussion
confidence: 89%
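The positive-agreement measure mentioned above has a standard "specific agreement" definition for a 2x2 table. A minimal sketch; the function name and counts are illustrative, not taken from the cited study:

```python
# Proportions of positive and negative agreement for a 2x2 table
# (standard "specific agreement" definitions; counts are illustrative,
# not taken from the cited study).

def agreement_proportions(a, b, c, d):
    """a = both raters positive, d = both negative, b and c = disagreements."""
    p_pos = 2 * a / (2 * a + b + c)  # agreement specific to positive ratings
    p_neg = 2 * d / (2 * d + b + c)  # agreement specific to negative ratings
    return p_pos, p_neg

# Skewed table: few patients report an ADE on either measurement.
p_pos, p_neg = agreement_proportions(a=5, b=5, c=5, d=85)
print(f"positive agreement = {p_pos:.2f}, negative agreement = {p_neg:.2f}")
# -> positive agreement = 0.50, negative agreement = 0.94
```

On a skewed table, negative agreement stays high while positive agreement can be low, which is why it complements kappa in this setting.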
“…For ADE reporting, however, a skewed distribution is observed where many patients report no ADEs on both measurements, which decreases the kappa values used for the reliability assessment [51,52]. Formulas to adjust for such effects have been proposed, for example, the prevalence-adjusted bias-adjusted kappa [53], but their inappropriateness has also been demonstrated [51].…”
Section: Discussion
confidence: 99%
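For reference, the prevalence-adjusted bias-adjusted kappa (PABAK) reduces to the simple transformation 2*p_o - 1 of the observed agreement, which amounts to fixing prevalence at 50% and bias at zero. A minimal sketch with illustrative counts for a skewed table like the one described:

```python
# Cohen's kappa versus the prevalence-adjusted bias-adjusted kappa (PABAK)
# for a 2x2 table; counts are illustrative.

def kappa_and_pabak(a, b, c, d):
    n = a + b + c + d
    p_o = (a + d) / n  # observed agreement
    # expected chance agreement from the marginal totals
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    kappa = (p_o - p_e) / (1 - p_e)
    pabak = 2 * p_o - 1  # fixes prevalence at 50% and bias at zero
    return kappa, pabak

# Skewed distribution: most patients report no ADEs at both measurements.
kappa, pabak = kappa_and_pabak(a=4, b=6, c=6, d=84)
print(f"kappa = {kappa:.2f}, PABAK = {pabak:.2f}")
# -> kappa = 0.33, PABAK = 0.76
```

The gap between the two values is exactly the prevalence effect the quoted passage describes; the cited critique [51] concerns whether reporting the adjusted value is appropriate, not the arithmetic itself.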
“…For example, in situations where physicians chose a particular readmission contributor infrequently, chance agreement is high and kappa is accordingly reduced. While this “penalty” is felt by many to be appropriate [18-20], we attempted to minimize this phenomenon by reporting kappa values only for aggregate categories of factors predicting readmissions. The fact that kappa was generally low even for highly prevalent selections argues that the prevalence phenomenon does not explain the low rates of agreement we observed.…”
Section: Discussion
confidence: 99%
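A quick hypothetical calculation shows the effect the authors describe: at 5% prevalence, chance agreement is already about 0.9 under independence, so even 95% raw agreement yields only a moderate kappa.

```python
# Hypothetical arithmetic for the point above: when a contributor is chosen
# infrequently, chance agreement is already high, so kappa stays modest even
# with strong raw agreement.

prevalence = 0.05  # each physician selects the contributor 5% of the time
p_o = 0.95         # observed (raw) agreement
p_e = prevalence ** 2 + (1 - prevalence) ** 2  # chance agreement, independence
kappa = (p_o - p_e) / (1 - p_e)
print(f"chance agreement = {p_e:.3f}, kappa = {kappa:.2f}")
# -> chance agreement = 0.905, kappa = 0.47
```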
“…Although the residual also comprises “conventional” error due to unsystematic impacts, we would argue that most of such unsystematic impacts may be considered to be produced by contextual particularities. An interpretation of the residual predominance could be that even with “optimal” raters and items, there is a limit as to how much the level of agreement can be improved when contextual circumstances are not sufficiently controlled for. Even so, the result also indicates that the level of agreement can be substantially improved by identifying and counteracting those characteristics of the components tending to negatively influence agreement.…”
Section: Discussion
confidence: 99%
“…Barrier prevalence estimates were classified into five distributional categories: 0-20%, 21-40%, 41-60%, 61-80%, and 81-100%. Moreover, following the suggestion that a balanced prevalence near 50% [23] should be targeted for the “fairest” assessment of the level of agreement, the items with prevalence estimates in the interval 41-60% were highlighted in an extended analysis. Due to the results of the predictors-of-agreement-variation analysis described below, splitting the mean levels by rater and item characteristics did not include housing adaptation experience.…”
Section: Level Of Agreement Analysis
confidence: 99%
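A small sketch of the binning step described above; the item names and prevalence values are hypothetical, not from the cited study:

```python
# Classify prevalence estimates into five bands and flag the near-balanced
# (41-60%) items for the extended analysis. Items are hypothetical.

def prevalence_band(p):
    """Map a prevalence estimate in [0, 1] to one of five bands."""
    bands = [(0.20, "0-20%"), (0.40, "21-40%"), (0.60, "41-60%"),
             (0.80, "61-80%"), (1.00, "81-100%")]
    return next(label for upper, label in bands if p <= upper)

items = {"narrow doorway": 0.12, "high threshold": 0.55, "stairs": 0.48}
balanced = [name for name, p in items.items()
            if prevalence_band(p) == "41-60%"]
print(balanced)  # -> ['high threshold', 'stairs']
```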