Evaluations of behavioral health interventions have identified many that are potentially effective. However, clinicians and other decision makers typically lack the time and expertise to search and synthesize the relevant research literature effectively. In response to this gap, and to increasing policy and funding pressures for the use of evidence-based practices, a number of “what works” websites have emerged to help decision makers select interventions with the highest probability of benefit. Yet these registers as a whole are not well understood. This article, which represents phase one of a concurrent mixed methods study, reviews the scopes, structures, dissemination strategies, uses, and challenges of evidence-based registers in the behavioral health disciplines. The major finding is that registers of evidence-based practices are generally able, to a degree, to identify the most effective practices and meet the needs of decision makers. However, much remains to be done before the registers can fully realize their purpose.
Decision makers need timely and credible information about the effectiveness of behavioral health interventions. Online evidence-based program registers (EBPRs) have been developed to address this need. However, the methods by which these registers designate programs and practices as “evidence-based” have not been investigated in detail. This paper examines the evidentiary criteria EBPRs use to rate programs and the implications for how different registers rate the same programs. Although the registers tend to employ a standard Campbellian hierarchy of evidence to assess evaluation results, there is considerable disagreement among them about what constitutes an adequate research design and sufficient data for designating a program as evidence-based. Additionally, differences exist in how registers report findings of “no effect,” which may deprive users of important information. Of all programs on the 15 registers that rate individual programs, 79% appear on only one register. Among a random sample of 100 programs rated by more than one register, 42% were inconsistently rated by the multiple registers to some degree.
Background: Currently the American Red Cross requires that individuals renew their cardiopulmonary resuscitation (CPR) certification annually; this often requires a 4- to 8-hour refresher course. Those trained in CPR often show a decrease in essential knowledge and skills within just a few months after training. New electronic means of communication have expanded the possibilities for delivering CPR refreshers to members of the general public who receive CPR training. The study’s purpose was to determine the efficacy of three novel CPR refreshers (online website, e-mail, and text messaging) for improving three outcomes of CPR training: skill retention, confidence for using CPR, and intention to use CPR. These three refreshers may be considered “novel” in that they are not typically used to refresh CPR knowledge and skills.

Methods: The study conducted two randomized clinical trials of the novel CPR refreshers. A mailed brochure, a traditional, passive refresher format, served as the control condition. In Trial 1, the refreshers were delivered in a single episode at 6 months after initial CPR training. In Trial 2, the refreshers were delivered twice, at 6 and 9 months after initial CPR training, to test the effect of a repeated delivery. Outcomes for the three novel refreshers vs. the mailed brochure were determined at 12 months after initial CPR training.

Results: Assignment to any of the three novel refreshers did not improve outcomes of CPR training one year later in comparison with receiving a mailed brochure. Comparing outcomes for subjects who actually reviewed some of the novel refreshers vs. those who did not indicated a significant positive effect for one outcome, confidence for performing CPR. The website refresher was associated with increased behavioral intent to perform CPR. Stated satisfaction with the refreshers was relatively high. The number of refresher episodes (one vs. two) did not have a significant effect on any outcomes.

Conclusions: There was no consistent evidence for the superiority of the novel refreshers as compared with a traditional mailed brochure, but the low degree of actual exposure to the materials does not allow a definitive conclusion. An online web-based approach seems to have the most promise for future research on electronic CPR refreshers.