2022
DOI: 10.3389/fonc.2022.1013941
Inter-reader agreement of the prostate imaging reporting and data system version v2.1 for detection of prostate cancer: A systematic review and meta-analysis

Abstract: Objectives: We aimed to systematically assess the inter-reader agreement of the Prostate Imaging Reporting and Data System (PI-RADS) version 2.1 for the detection of prostate cancer (PCa). Methods: We included studies reporting the inter-reader agreement of different radiologists applying PI-RADS v2.1 for the detection of PCa. Quality assessment of the included studies was performed with the Guidelines for Reporting Reliability and Agreement Studies. The summary estimates of the inter-reader agreement were pooled …
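The abstract states that summary estimates of inter-reader agreement were pooled, but the pooling model is not shown here. As a minimal, purely illustrative sketch, the Python snippet below pools per-study kappa estimates with a DerSimonian-Laird random-effects model; the kappa values and standard errors are hypothetical and are not taken from the review.

```python
import numpy as np

# Illustrative only: hypothetical per-study kappa estimates and their
# standard errors, not data from the meta-analysis itself.
kappas = np.array([0.52, 0.65, 0.71, 0.58, 0.80])
ses    = np.array([0.06, 0.05, 0.08, 0.07, 0.04])

def pool_random_effects(y, se):
    """DerSimonian-Laird random-effects pooling of effect estimates."""
    w_fixed = 1.0 / se**2                       # inverse-variance (fixed-effect) weights
    y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
    q = np.sum(w_fixed * (y - y_fixed)**2)      # Cochran's Q heterogeneity statistic
    df = len(y) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)               # between-study variance estimate
    w = 1.0 / (se**2 + tau2)                    # random-effects weights
    pooled = np.sum(w * y) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se, tau2

pooled_k, pooled_se, tau2 = pool_random_effects(kappas, ses)
print(f"pooled kappa = {pooled_k:.2f} +/- {1.96 * pooled_se:.2f} (95% CI), tau^2 = {tau2:.3f}")
```

A fixed-effect pool would use the inverse-variance weights alone; the random-effects weights add the between-study variance to each study's sampling variance, which widens the confidence interval when studies disagree.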

Cited by 10 publications (5 citation statements)
References 42 publications
“…This level of concordance is notable considering that neither had prior clinical experience with PI-FAB before taking part in this study. This outcome aligns with more established image reporting systems in the field, such as Prostate Imaging Quality (PI-QUAL) and PI-RADS, where similar levels of inter-reader agreement have been documented, but performance improved with time [23], [24], [25]. The agreement observed in our study reflects the inherent challenges in standardizing new diagnostic tools and underscores the potential for improved consistency as PI-FAB is used over time.…”
Section: Discussion (supporting)
confidence: 83%
“…PI-RADS hinges on the subjective judgment of radiologists, which is prone to inter-reader variability (28). A recent meta-analysis (29) showed varied inter-reader agreement of PI-RADS v2.1 and moderate inter-reader reliability (pooled k value of 0.65) among radiologists for whole-gland and TZ lesions. Nonetheless, PI-RADS v2.1 has higher inter-reader reproducibility than version 2.0 (30).…”
Section: Discussion (mentioning)
confidence: 99%
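The k values quoted in these excerpts, such as the pooled value of 0.65, are Cohen's kappa statistics for inter-reader agreement. As a hedged illustration only, the sketch below computes an unweighted Cohen's kappa between two readers' PI-RADS category assignments; the readers, lesions, and scores are invented, and published studies often report a weighted variant instead.

```python
import numpy as np

# Hypothetical PI-RADS v2.1 category assignments (1-5) from two readers
# for the same ten lesions; values are illustrative, not study data.
reader_a = np.array([3, 4, 2, 5, 3, 1, 4, 2, 5, 3])
reader_b = np.array([3, 4, 3, 5, 2, 1, 4, 2, 4, 3])

def cohens_kappa(a, b, categories=(1, 2, 3, 4, 5)):
    """Unweighted Cohen's kappa for two raters over nominal categories."""
    p_observed = np.mean(a == b)                      # raw agreement
    p_expected = sum(                                 # agreement expected by chance
        np.mean(a == c) * np.mean(b == c) for c in categories
    )
    return (p_observed - p_expected) / (1.0 - p_expected)

print(f"Cohen's kappa = {cohens_kappa(reader_a, reader_b):.2f}")
```

Values around 0.6 to 0.7 are conventionally read as moderate to substantial agreement, which is the range the cited meta-analysis reports for PI-RADS v2.1.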
“…Another retrospective analysis of men who underwent both DRE and MRI prior to prostatectomy had similar findings, with MRI having higher sensitivity than DRE (59% vs. 41%, p < 0.01) in the detection of extraprostatic disease, but lower specificity (69% vs. 95%, p < 0.01) [12]. Of note, the interobserver variability of DRE in detecting suspicious lesions is greater than that of MRI (k = 0.22 vs. k = 0.57) [13,14].…”
Section: The Role of MRI in Contemporary Prostate Cancer Management 2... (mentioning)
confidence: 69%