Background: IntelliCare is a modular platform comprising 12 simple apps, each targeting a specific psychological strategy for common mental health problems.

Objective: This study aimed to examine the effect on depression, anxiety, and app use of 2 methods of maintaining engagement with the IntelliCare platform: coaching and receipt of weekly recommendations to try different apps.

Methods: A total of 301 participants with depression or anxiety were randomized to 1 of 4 treatments lasting 8 weeks and were followed for 6 months posttreatment. The trial used a 2×2 factorial design (coached vs self-guided treatment, and weekly app recommendations vs no recommendations) to compare engagement metrics.

Results: The median time to last use of any app during treatment was 56 days (interquartile range 54-57), with 253 participants (84.0%, 253/301) continuing to use the apps over a median of 92 days posttreatment. Receipt of weekly recommendations resulted in a significantly higher number of app use sessions during treatment (overall median=216; P=.04) but only marginal effects on time to last use (P=.06) and number of app downloads (P=.08). Coaching resulted in significantly more app downloads (P<.001), but there were no significant effects on number of app sessions (P=.36) or time to last download (P=.08). Participants showed significant reductions in Patient Health Questionnaire-9 (PHQ-9) and Generalized Anxiety Disorder-7 (GAD-7) scores across all treatment arms (Ps<.001). Coached treatment led to larger GAD-7 reductions than self-guided treatment (P=.03), but the effect for the PHQ-9 did not reach significance (P=.06). A significant interaction was observed between receipt of recommendations and time for the PHQ-9 (P=.04), but not for the GAD-7 (P=.58).

Conclusions: IntelliCare produced strong engagement with the apps across all treatment arms. Coaching was associated with stronger anxiety outcomes, and receipt of recommendations enhanced depression outcomes.

Trial Registration: ClinicalTrials.gov NCT02801877; https://clinicaltrials.gov/ct2/show/NCT02801877
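The 2×2 factorial allocation described above can be sketched in a few lines of Python. This is a minimal, unstratified illustration of crossing the two factors into 4 treatment cells; the trial's actual randomization procedure is not described in the abstract and may have used blocking or stratification.

```python
import random
from itertools import product

# The two crossed factors from the trial's 2x2 factorial design.
COACHING = ("coached", "self-guided")
RECOMMENDATIONS = ("weekly recommendations", "no recommendations")

# Crossing the factors yields the 4 treatment arms; each participant
# is assigned to exactly one cell.
ARMS = list(product(COACHING, RECOMMENDATIONS))


def randomize(n_participants: int, seed: int = 0) -> list:
    """Simple (unstratified) random allocation of participants to the 4 arms."""
    rng = random.Random(seed)
    return [rng.choice(ARMS) for _ in range(n_participants)]


# 301 participants, as in the trial.
assignments = randomize(301)
```

Crossing the factors rather than comparing 4 unrelated arms is what lets the analysis estimate each factor's main effect (coaching, recommendations) with the full sample.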
Background: The ability to successfully recruit participants for electronic health (eHealth) clinical trials depends largely on the use of efficient and effective recruitment strategies. Determining which types of recruitment strategies to use presents a challenge for many researchers.

Objective: The aim of this study was to analyze the time efficiency and cost-effectiveness of recruitment strategies for eHealth clinical trials and to describe a framework for cost-effective trial recruitment.

Methods: Participants were recruited for 1 of 5 eHealth trials of interventions for common mental health conditions. A multipronged recruitment approach was used, including digital strategies (eg, social media and Craigslist), research registry-based strategies, print strategies (eg, flyers and posters on public transportation), clinic-based strategies (eg, a general internal medicine clinic within an academic medical center and a large nonprofit health care organization), a market research recruitment firm, and traditional media strategies (eg, newspaper and television coverage in response to press releases). The time costs and fees for each recruitment method were calculated, and the participant yield on recruitment costs was calculated by dividing the number of enrolled participants by the total cost for each method.

Results: A total of 777 participants were enrolled across all trials. Digital recruitment strategies yielded the largest number of participants across the 5 clinical trials, representing 34.0% (264/777) of total enrollment. Registry-based recruitment strategies were second, enrolling 28.0% (217/777) of participants across trials. Research registry-based recruitment had a relatively high conversion rate from potential participants who contacted our center to be screened to those enrolled, and it was also the most cost-effective strategy for enrolling participants in this set of clinical trials, at a total cost of US $8.99 per person enrolled.

Conclusions: On the basis of these results, a framework is proposed for participant recruitment. Decisions on initiating and maintaining different types of recruitment strategies require careful examination of the resources available and the requirements of the research study (or studies).
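The yield and cost metrics defined in the Methods (enrollees divided by total cost, and its inverse, cost per enrollee) can be sketched as follows. The enrollment counts come from the abstract; the registry spend below is a hypothetical placeholder chosen only to reproduce the reported US $8.99 cost per person enrolled, since per-strategy total costs are not given.

```python
def pct_of_total(enrolled: int, total: int) -> float:
    """Share of all enrollees attributable to one strategy, as a percent."""
    return round(100 * enrolled / total, 1)


def participant_yield(enrolled: int, total_cost: float) -> float:
    """Participant yield on recruitment cost: enrollees per dollar spent."""
    return enrolled / total_cost


def cost_per_enrollee(enrolled: int, total_cost: float) -> float:
    """Inverse of yield: dollars spent per enrolled participant."""
    return total_cost / enrolled


# Reported figures: 264 of 777 enrollees came from digital strategies.
digital_share = pct_of_total(264, 777)  # 34.0

# Hypothetical registry spend, constructed so that cost per enrollee
# matches the reported US $8.99 across 217 registry enrollees.
registry_cost = 217 * 8.99
```

Comparing strategies on cost per enrollee (rather than raw enrollee counts) is what allows a lower-volume channel such as the registry to rank as most cost-effective.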
Implementing a digital mental health service in primary care requires integration into the clinic workflow. However, without adequate attention to service design, including designing referral pathways to identify and engage patients, implementation will fail. This article reports results from our efforts designing referral pathways for a randomized clinical trial evaluating a digital service for depression and anxiety delivered through primary care clinics. We utilized three referral pathways: direct to consumer (e.g., digital and print media, registry emails), provider referral (i.e., electronic health record [EHR] order and provider recommendation), and other approaches (e.g., presentations, word of mouth). Over the 5-month enrollment period, 313 individuals completed the screen and reported how they learned about the study. Penetration was 13%, and direct-to-consumer techniques, most commonly email, had the highest yield. Providers referred only 16 patients through the EHR, half of whom initiated the screen. There were no differences in referral pathway based on participants' age, depression severity, or anxiety severity at screening. Ongoing discussions with providers revealed that the technologic implementation and workflow design may not have been optimal to fully support the EHR-based referral process, which potentially limited patient access. Results highlight the importance of designing and evaluating referral pathways within service implementation, which is important for guiding the implementation of digital services into practice. Doing so can ensure that sustained implementation is not left to post-evaluation bridge-building. Future efforts should assess these and other referral pathways implemented in clinical practice outside of a research trial.