Background: Usability—the extent to which an intervention can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction—may be a key determinant of implementation success. However, few instruments have been developed to measure the design quality of complex health interventions (i.e., those with several interacting components). This study evaluated the structural validity of the Intervention Usability Scale (IUS), an adaptation of the well-established System Usability Scale (SUS) for digital technologies, as a measure of the usability of a leading complex psychosocial intervention, Motivational Interviewing (MI), for behavioral health service delivery in primary care. Prior SUS studies have found both one- and two-factor solutions; both were examined in this study of the IUS. Method: A survey administered to 136 medical professionals from 11 primary-care sites collected demographic information and IUS ratings for MI, the evidence-based psychosocial intervention that primary-care providers reported using most often for behavioral health service delivery. Factor analyses replicated procedures used in prior research on the SUS. Results: Analyses indicated that a two-factor solution (with "usable" and "learnable" subscales) best fit the data, accounting for 54.1% of the variance. Inter-item reliabilities for the total score, usable subscale, and learnable subscale were α = .83, α = .84, and α = .67, respectively. Conclusion: This study provides evidence for a two-factor IUS structure consistent with some prior research, as well as acceptable reliability. Implications for implementation research evaluating the usability of complex health interventions are discussed, including the potential for future comparisons across multiple interventions and provider types, as well as the use of the IUS to evaluate the relationship between usability and implementation outcomes such as feasibility.
Plain language abstract: The ease with which evidence-based psychosocial interventions (EBPIs) can be readily adopted and used by service providers is a key predictor of implementation success, but very little implementation research has attended to intervention usability. No quantitative instruments exist to evaluate the usability of complex health interventions, such as the EBPIs that are commonly used to integrate mental and behavioral health services into primary care. This article describes the evaluation of the first quantitative instrument for assessing the usability of complex health interventions; the instrument's factor structure replicated some research with the original version on which it was based, a scale developed to assess the usability of digital systems.
Youth who enter foster care are at risk of mental health needs, but questions arise as to the validity of their self-reported symptomatology. This study examines the screening validity of the youth-report version of the Pediatric Symptom Checklist-17 (PSC-17) in a child welfare population. Data come from 2,389 youth who completed a version of the PSC-17 adapted for youth report, and their biological and foster parents, who completed the parent-report version. Youth also completed a shortened version of the Screen for Child Anxiety Related Disorders (SCARED). Convergent and discriminant validity of the PSC-17 was assessed using multi-trait multi-method matrices. The PSC-17's internalizing subscale was strongly correlated, its attention subscale moderately correlated, and its externalizing subscale weakly correlated with the SCARED's anxiety and PTSD subscales. Comparing youth and foster parent scores, the PSC-17 had moderate convergent validity and weak-to-fair discriminant validity; the same pattern held when youth, foster parent, and biological parent scores were compared. The current study provides some support for the validity of the PSC-17 for the population of youth in foster care.