2023
DOI: 10.2147/opth.s401492

Analysis of the Readability and Accountability of Online Patient Education Materials Related to Glaucoma Diagnosis and Treatment

Abstract: To assess the readability and accountability of online patient education materials related to glaucoma diagnosis and treatment. Methods: We conducted a Google search for 10 search terms related to glaucoma diagnosis and 10 search terms related to glaucoma treatment. For each search term, the first 10 patient education websites populated after the Google search were assessed for readability and accountability. Readability was assessed using five validated measures: Flesch Reading Ease (FRE), Gunning Fog Index (GFI)…
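For context, the two measures named before the abstract truncates, Flesch Reading Ease (FRE) and Gunning Fog Index (GFI), are computed from sentence length and syllable counts. The Python sketch below is illustrative only: it implements the standard published formulas with a rough vowel-group syllable heuristic and a simplified "complex word" rule, and is not the validated scoring software the study would have used.

import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    # Validated readability tools use syllable dictionaries instead.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability_scores(text):
    # Tokenize sentences and words with simple regular expressions.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    n_sent = max(1, len(sentences))
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    # "Complex" words for GFI: three or more syllables (simplified;
    # the full definition excludes proper nouns and common suffixes).
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)

    # Flesch Reading Ease: higher scores indicate easier text.
    fre = 206.835 - 1.015 * (n_words / n_sent) - 84.6 * (syllables / n_words)
    # Gunning Fog Index: approximates the U.S. grade level required.
    gfi = 0.4 * ((n_words / n_sent) + 100 * complex_words / n_words)
    return {"FRE": round(fre, 1), "GFI": round(gfi, 1)}

print(readability_scores(
    "Glaucoma is a group of eye diseases that damage the optic nerve. "
    "Early treatment can often prevent serious vision loss."))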

Cited by 7 publications (4 citation statements)
References 31 publications
“…The readability of glaucoma-related ChatGPT responses is consistent with an investigation of the readability of retina-related ChatGPT responses, which was found to be at a second-year collegiate level, emphasizing the comprehension difficulties associated with this chatbot [8]. Alternative institutional and online patient education glaucoma materials have been shown to be written at a 10th- to 12th-grade level, with online glaucoma information presented at a 9th- to 11th-grade level [4, 12-16]. Given that the American Medical Association recommends less than a seventh-grade reading level for patient education materials, AI chatbot responses, apart from Bard (which contains inaccuracies, as previously mentioned), were written at levels that would be challenging for most patients to comprehend [17].…”
Section: Discussion (mentioning)
confidence: 99%
“…For instance, an advanced approach to formal education for medical students was described by Marin et al. The authors provided valuable data on experience with a pilot model of ophthalmology longitudinal integrated clerkships, which improved students' knowledge in this field, but they also recognized the need for future studies evaluating the relationship between medical curricula and students' interest in ophthalmology. Similarly, a need for accurate information that would allow glaucoma patients to make informed decisions about their condition was recognized in a study by Cohen et al., in which the authors evaluated patient education materials available online [34,35]. Therefore, to ensure that study materials are of high quality, items such as funding, as well as harms and outcomes, must be represented in article abstracts.…”
Section: Discussion (mentioning)
confidence: 99%
“…The results in our study align with those in several previous studies that examined the readability of patient education materials in ophthalmologic subspecialties, ranging from pediatric ophthalmology to glaucoma to retina.9,11,20 Furthermore, a previous study assessed the readability of 31 websites that populated after a Google search for "macular degeneration".12 Our results showed that the patient education materials found online are written, on average, at a reading level that might limit patient understanding.…”
Section: Discussion (mentioning)
confidence: 99%
“…Readability indices previously used in several ophthalmology analyses likely serve as reliable indicators of a website's overall understandability.9-11,20,32-35 In summary, a combined effort to enhance online patient education materials for Syfovre and other AMD treatments may help patients better understand their disease burden, subsequently enhancing patient follow-up rates and medication compliance, reducing health disparities, and providing patients with the tools necessary to make informed decisions about treatment options for AMD.…”
Section: Discussion (mentioning)
confidence: 99%