Assessment is critical to health service psychology and represents a core area of coverage during doctoral training. Despite this, training practices in assessment are understudied. Accordingly, this study utilized a national sample of students (n = 534) enrolled in American Psychological Association–accredited health service psychology doctoral programs with substantive training in clinical or counseling psychology. We asked trainees to rate their competency for instruments in which they had training. We examined trends in training experiences, including both theory-based education and applied clinical opportunities, and explored differences in instrument training trends across program type (PhD/PsyD) and program discipline (clinical/counseling). Results suggest general convergence with professional practice trends in instrument coverage; less applied clinical training and exposure relative to didactic methods; and generally small differences across program type and discipline in perceived competence and instrument exposure. Implications for training and education in psychological assessment are discussed.
Justice-involved youth experience significantly higher rates of adverse childhood experiences (ACEs) than the general population, and these experiences are linked to negative outcomes such as greater criminal involvement and mental health disorders. Such effects emphasize the need to examine the role of protective factors in the development of these negative outcomes. This study uses data from 519 youth referred to a probation department in Southeast Texas to examine the effects of ACEs, as well as the direct and mitigating effects of protective factors, on a youth's criminal involvement and mental health symptoms. Results from hierarchical linear regression models underscore the negative effects of ACEs on these outcomes and point to a potential ceiling effect of protective factors based on ACE severity.
Background: The predoctoral internship training year is the capstone training experience for health service doctoral students. Previous research has explored which applicant characteristics are desired by internship sites but has not thoroughly explored differences between types of sites or the importance of criteria at different stages of applicant consideration (interview vs. ranking). Aims: We evaluate training directors' current perceptions of doctoral student internship applications. Materials and Methods: Internship training directors of APA-accredited sites reported on the importance of different application materials during interview and ranking decisions. We also compared these rankings across site types. Results: Internship sites were generally consistent in their criteria rankings; however, there were also some differences. Intern applicant "fit" continues to be the most important criterion by which applicants are judged at all stages of consideration. Qualitative analysis found that "fit" varied by site across themes of treatment, applicant, and site characteristics. Discussion: We discuss implications for students' preparation of internship applications. In addition to this practical guidance for students, we discuss how program changes can increase applicants' competitiveness at internship sites. Abbreviations: ANOVA, analysis of variance; APA, American Psychological Association; APPIC, Association of Psychology Postdoctoral and Internship Centers; EBP, evidence-based practices; TD, training director; UCC, university counseling center; VA, Veterans Affairs.
Objective
Attaining competence in assessment is a necessary step in graduate training and has been defined to include multiple domains of training relevant to this attainment. Beyond ensuring that trainees meet these training standards, it is critical to understand whether and how competence shapes a trainee's professional identity, thereby promoting lifelong competency.
Methods
The current study assessed currently enrolled graduate trainees' knowledge and perceptions of their assessment-related capabilities to determine whether self-reported and performance-based competence incrementally predicted their intention to use assessment in their future careers, above and beyond basic training characteristics and intended career interests.
Results
Self-reported competence, but not performance-based competence, played an incremental role in trainees' intention to use assessments in their careers. Multiple graduate training characteristics and practice experiences were nonsignificant predictors after accounting for other relevant predictors (i.e., intended career settings, integrated reports).
Conclusion
Findings are discussed with respect to the importance of incorporating a hybrid competency-capability assessment training framework that emphasizes the role of trainee self-efficacy, in hopes of promoting lifelong competence in the continued use of assessment.