2020
DOI: 10.5312/wjo.v11.i2.82
Revision total hip arthroplasty: An analysis of the quality and readability of information on the internet

Cited by 17 publications (10 citation statements)
References 18 publications
“… 6 Literature identified since 2015 consistently found content to be of low-to-moderate quality. 7 9 No studies have assessed information content on fragility fractures specifically; however, some recent studies have investigated orthopaedic trauma, including clavicle fractures, scaphoid fractures, distal radius fractures, and pelvic/acetabulum fractures and found information to be poor. 10 11 12 13 The literature on content of osteoporosis websites was mostly published before 2010.…”
mentioning
confidence: 99%
“…Prior to analysis these URLs were categorised into the following: Academic, Commercial, Physician, Allied-health, Media-related, Health information website, Social/discussion page, Governmental, Nonprofit organisations, and Unspecified [13].…”
Section: Methods
mentioning
confidence: 99%
“…was consulted to reach final consensus of calculated scores amongst all three reviewers. The interobserver variability was evaluated for both the DISCERN and JAMA score using Cohen’s kappa coefficient (κ) as previously described [20]. Kappa is calculated by the equation κ = (Pο − Pε)/(1 − Pε), where Pο is the observed agreement among raters and Pε is the probability of agreement by chance [21].…”
Section: Methods
mentioning
confidence: 99%
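The kappa formula quoted above can be sketched directly. This is a minimal illustration of the equation κ = (Pο − Pε)/(1 − Pε); the agreement values in the example are invented for demonstration and are not figures from the cited study:

```python
def cohens_kappa(p_observed: float, p_chance: float) -> float:
    """Cohen's kappa: agreement beyond chance, normalised by the
    maximum possible agreement beyond chance."""
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical example: two raters agree on 90% of items,
# while agreement expected by chance alone is 50%.
kappa = cohens_kappa(0.90, 0.50)
# (0.90 - 0.50) / (1 - 0.50) = 0.80, i.e. "excellent" agreement
# under the threshold quoted below.
```

When Pο = 1 the formula yields κ = 1, matching the statement that complete agreement gives κ = 1.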
“…A κ value above 0.8 implies excellent agreement. When κ = 1, there is complete agreement between the observers [20].…”
Section: Methods
mentioning
confidence: 99%