2011
DOI: 10.1176/appi.ap.35.1.27

Development and Initial Testing of a Structured Clinical Observation Tool to Assess Pharmacotherapy Competence

Abstract: Faculty can feasibly use the P-SCO instrument in a training clinic. Compared with traditional global assessment, the P-SCO provided much more specific feedback, a better balance of corrective to reinforcing comments, and a greater spread of ratings related to competency in pharmacotherapy.

Cited by 21 publications (13 citation statements) · References 27 publications
“…It allows faculty to provide formative feedback to the learner in real time [8][9][10][11] and tends to generate more specific feedback and constructive comments compared to global assessments. 12,13 At least 55 direct observation tools have been developed, but only a few have proven reliability, validity, or educational outcomes data measured.…”
Section: Overview Of Assessment Methods Identified
confidence: 99%
“…It allows faculty to provide formative feedback to the learner in real time [8][9][10][11] and tends to generate more specific feedback and constructive comments compared to global assessments. 12,13 At least 55 direct observation tools have been developed, but only a few have proven reliability, validity, or educational outcomes data measured. 14 Faculty training on the use of any direct observation tool is important given the potential for variability of interpretation of a clinical encounter and the tool's language, yet few studies have demonstrated more than cursory observer training. 14 There is evidence, however, that even without extensive training, certain tools have good to excellent reliability. 10,15 The correlation between direct observation and other measures of competency such as written test scores, [16][17][18][19][20][21][22][23][24][25] OSCEs, or standardized patient assessments [18][19][20][21]25,26 has been studied in a number of specialties showing modest correlation supporting the validity of certain direct observation methods.…”
confidence: 99%
“…Prior to the implementation of the EPA app, faculty used a paper-based direct observation tool that included a comprehensive 27-item checklist, an overall EPA rating, and prompts for both reinforcing and corrective comments. This tool had been studied in several settings with evidence for validity and generates, on average, five highly specific comments with a 3:2 ratio of reinforcing to corrective [11,[41][42][43]. All faculty agreed to participate.…”
Section: Setting and Participants
confidence: 99%
“…We used the comments generated by the Pharmacotherapy-Structured Clinical Observation (P-SCO), a direct observation tool in psychiatry, with evidence for validity. [25][26][27] This study has 3 aims: (1) Analyze the quality of the narrative comments generated by the P-SCO; (2) Characterize the themes most commonly captured by the narrative comments; and (3) Examine the relationship between the narrative comments and the checklist scores.…”
Section: Introduction
confidence: 99%