2020
DOI: 10.1167/iovs.61.6.27
Assessing Intereye Symmetry and Its Implications for Study Design

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Cited by 10 publications (4 citation statements); references 16 publications.
“…CCT and axial length) and biomechanical parameters (e.g. IOP and corneal displacements) in healthy subjects, [46][47][48][49][50] the potential variance between the left and right eyes could have obscured some correlations and rendered them undetectable in the present study. A comparison between the left and right eyes' ONH/PPS shear strains in six pairs of donor eyes is presented in the Supplementary Material to provide data on the potential variance.…”
Section: Discussion
Confidence: 74%
“…Ninety-one (91%) of 100 infants were in agreement between 2 eyes in the status of RW-ROP from clinical eye examination, with kappa of 0.76 (95% CI = 0.61–0.91). 13 …”
Section: Results
Confidence: 99%
“…The disease agreement between the two eyes of the same patient was assessed using the kappa statistic ( κ ), which measures the ratio between the observed proportion of agreement and the proportion of agreement expected by chance. Kappa values range from +1 (perfect agreement) to −1 (perfect disagreement) [ 22 ].…”
Section: Methods
Confidence: 99%
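The kappa computation described in that methods excerpt can be sketched in a few lines. This is a minimal illustration, not code from any of the cited studies; the 2×2 table of counts below is hypothetical (chosen so that 91 of 100 eye pairs agree, loosely mirroring the results excerpt above, not reproducing its κ of 0.76).

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table of counts.

    table[i][j] = number of patients whose right eye has status i
    and whose left eye has status j (e.g. 0 = disease+, 1 = disease-).
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed proportion of agreement: diagonal cells over total.
    p_o = sum(table[i][i] for i in range(k)) / n
    # Agreement expected by chance: product of marginal proportions,
    # summed over categories.
    row_m = [sum(row) / n for row in table]
    col_m = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_e = sum(row_m[i] * col_m[i] for i in range(k))
    # Kappa rescales observed agreement beyond chance to the [-1, +1] range.
    return (p_o - p_e) / (1 - p_e)

# Hypothetical intereye table: 100 patients, 91 concordant pairs.
table = [[40, 5],
         [4, 51]]
print(round(cohens_kappa(table), 2))
```

A kappa of +1 arises only when every pair falls on the diagonal (perfect intereye agreement), while values near 0 indicate agreement no better than chance given the marginal disease rates.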