Background
Despite the movement toward competency-based assessment by accrediting bodies in recent years, there is no consensus on how best to assess medical competence. Direct observation is a useful tool, yet a comprehensive assessment system based on direct observation has been difficult to develop.
Intervention
We developed a system that translates data obtained from checklists of observed behaviors completed during educational activities, including direct observation of clinical care, into a graphic tool (the “radar graph”) usable for both formative and summative assessment. Using unique, observable behaviors to evaluate levels of competency on the Dreyfus scale, we assessed resident performance at 6 learning sites within our residency. Data are represented on a radar graph, which residents and faculty used to identify strengths and areas for growth and to guide educational planning for the individual learner.
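The mapping from checklist scores to a radar graph can be sketched as follows. This is a minimal illustration, not the study's actual instrument: the competency domains and Dreyfus-scale scores shown are hypothetical, and the plot is built with matplotlib's standard polar axes.

```python
# Minimal sketch of a competency "radar graph" on a 5-point Dreyfus-style
# scale (1 = novice ... 5 = expert). Domain names and scores below are
# illustrative assumptions, not data from the study.
import math

import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

domains = ["Patient Care", "Medical Knowledge", "Professionalism",
           "Communication", "Systems-Based Practice",
           "Practice-Based Learning"]
scores = [3, 4, 2, 3, 2, 3]  # hypothetical checklist-derived ratings

# One axis per domain, evenly spaced around the circle;
# repeat the first point so the polygon closes.
angles = [2 * math.pi * i / len(domains) for i in range(len(domains))]
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(domains, fontsize=8)
ax.set_ylim(0, 5)  # full Dreyfus range
fig.savefig("radar.png", dpi=150)
```

A learner's profile then reads at a glance: axes where the polygon hugs the center mark areas for growth, while axes reaching the rim mark strengths.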
Results
Initial data support the construct validity of the radar graphs: the development process accurately reflects the desired construct, assessors were adequately trained, and the radar graphs demonstrated resident growth over time. A form completion rate of 90% for >1500 disseminated assessments supports the feasibility of our process.
Conclusions
The radar graph is a promising tool for use in resident feedback and competency assessment. Further research is needed to determine the full utility of the radar graphs, including a better understanding of the tool's reliability and construct validity.