Background
Comparisons of baseline covariates in randomised controlled trials, whilst often undertaken, are regarded by many as an exercise in futility. Because of randomisation, the null hypothesis is true for baseline comparisons, and any differences therefore occur by chance. However, this holds only if allocations are not known in advance of recruitment. Where allocations can be anticipated, selection bias at randomisation may be present, and statistical testing of baseline covariates may unveil it. In this paper we show that this is particularly relevant for cluster randomised trials, in which post-randomisation recruitment is common and can lead to selection bias.
Main text
We take a recently published cluster randomised trial that suffered from selection bias due to differential recruitment and calculate baseline p values. We show that statistically significant imbalances of p < 0.0001 occurred in 5 of the 10 covariates. In comparison, for an individually randomised trial with no evidence of selection bias, only 1 p value of p < 0.05 out of 20 tests was observed. Had baseline p values for the cluster trial been presented to journal editors, reviewers and readers, the results of the trial might have been treated with more caution.
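As an illustrative sketch only (the counts below are hypothetical, not the trial's data), a baseline comparison of a single binary covariate between two arms amounts to a Pearson chi-squared test on a 2×2 table:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction)
    for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts of a binary baseline covariate (e.g. female sex):
# intervention arm 120/200 with the characteristic, control arm 90/200.
stat = chi2_2x2(120, 80, 90, 110)

# Compare against the chi-squared (1 df) critical value of 3.84 for p < 0.05
imbalanced = stat > 3.84
```

Under true randomisation such tests reject at roughly the nominal rate; a run of extreme p values across covariates, as in the cluster trial described above, is the signal of interest.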
Conclusion
We argue that the blanket ban on baseline testing advocated by some may reduce the chance of identifying deficient cluster randomised trials, and that this opposition should be reconsidered for cluster trials.
Background
Imitative (‘copycat’) suicide occurs when a media representation of suicide precipitates a suicide attempt. Several organisations publish guidelines for responsible suicide reporting, designed to minimise the risk of precipitating imitative suicide. Observational studies have explored the evidence for imitative suicide, showing that extensive suicide coverage in the news is followed by a higher incidence of deaths from suicide. Those imitating a suicide tend to share characteristics such as gender and age with the person whose death was portrayed. The effect of sharing transgender status has not yet been explored. Suicide attempts are common in the transgender population: 40% have attempted suicide at least once. This systematic review evaluates to what extent UK newspapers adhere to guidelines when reporting suicides of transgender people.
Methods
We searched the newspaper database Nexis for UK newspaper articles published September 2007–2017 that reported the suicide of a transgender person. One reviewer screened results and applied inclusion and exclusion criteria. A checklist of ten criteria was adapted from the suicide reporting guidelines of three organisations. Two reviewers independently applied the checklist to each article, noting any breaches; disagreements were resolved by discussion. A measure of inter-rater reliability was calculated. Analyses were conducted in SPSS.
Results
The search found 996 articles, 187 of which were screened in full. The 79 articles that met the inclusion criteria concerned 22 individuals’ deaths and came from 19 newspapers. Every article had ≥1 checklist breach, with a mean of 3.9/10 breaches (95% CI 3.5 to 4.3). The majority of articles (63.3%) had 3–5/10 breaches. The prevalence of breaches varied between checklist items; the most commonly breached features were inappropriate headlines, failure to signpost readers to sources of support, and inappropriate descriptions of the death or suicide method. Overall inter-rater reliability showed near-perfect agreement between reviewers (Cohen’s kappa = 0.86).
Discussion
Breaching responsible reporting guidelines is very common in UK newspapers covering the suicide deaths of transgender people, although a key limitation is that results cannot be extrapolated to internet news sources or social media. Because breaching guidelines has the potential to cause harm, perhaps more so in transgender than cisgender individuals, journalists should be aware of imitative suicide and try to minimise the risk of harm. Although breaching guidelines may sometimes be justified, the priorities of public health and of journalism should be weighed against each other.
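The inter-rater reliability figure reported above is Cohen’s kappa, which corrects raw agreement for the agreement expected by chance from each rater’s marginal frequencies. A minimal sketch of the calculation (with made-up ratings, not the review’s data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters classifying the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical breach/no-breach judgements on eight articles
kappa = cohens_kappa([1, 1, 0, 0, 1, 0, 1, 0],
                     [1, 1, 0, 1, 1, 0, 1, 0])
```

A kappa of 0.86, as reported, falls in the conventional "almost perfect" band (above 0.81).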