2019 | DOI: 10.1136/bmjopen-2019-032550
Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study

Abstract: Objectives: The study aimed to compare candidate performance between traditional best-of-five single-best-answer (SBA) questions and very-short-answer (VSA) questions, in which candidates must generate their own answers of between one and five words. The primary objective was to determine if the mean positive cue rate for SBAs exceeded the null hypothesis guessing rate of 20%. Design: This was a cross-sectional study undertaken in 2018. Setting: 20 medical schools in the UK. Participants: 1417 volunteer medical students …
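The primary objective above is, in effect, a one-sided, one-sample test of whether the mean positive cue rate exceeds the 20% chance level of a best-of-five format (one correct option in five). As a minimal sketch of that comparison only, with hypothetical per-question cue rates standing in for the study's actual data:

```python
# Minimal sketch of the abstract's primary comparison: does the mean
# positive cue rate for SBA questions exceed the 20% rate expected
# from guessing among five options (1/5 = 0.20)?
# The cue rates below are hypothetical placeholders, not study data.
from scipy import stats

GUESSING_RATE = 0.20  # null hypothesis: chance level for best-of-five

# Hypothetical positive cue rates, one per SBA question
cue_rates = [0.31, 0.27, 0.42, 0.25, 0.38, 0.29, 0.33, 0.36]

# One-sided, one-sample t-test against the chance level (scipy >= 1.6)
result = stats.ttest_1samp(cue_rates, GUESSING_RATE, alternative="greater")

print(f"mean cue rate = {sum(cue_rates) / len(cue_rates):.3f}")
print(f"t = {result.statistic:.2f}, one-sided p = {result.pvalue:.4f}")
```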

Cited by 47 publications (81 citation statements) | References 15 publications
“…Thus, it is not possible to say whether removing such answering cues, for example, by using free text responses, would increase the validity of such assessments. Indeed, in this regard, there is some evidence that, at least for semantic knowledge tests, free text response format questions are generally experienced as more difficult compared to the equivalent SRQs 80 . With advances in natural language processing and machine learning comes the increasingly plausible possibility of automating, or semi‐automating, the scoring of such responses.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
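The closing sentence of this statement, on automating the scoring of free-text responses, can be illustrated with a deliberately simple sketch: normalise a candidate's one-to-five-word answer and fuzzy-match it against a key of acceptable answers. The answer key, threshold, and function names here are illustrative assumptions, not anything from the cited work:

```python
# Minimal sketch of semi-automated VSA scoring: normalise the free-text
# response and fuzzy-match it against acceptable answers. The threshold
# and answer keys are illustrative assumptions, not from the study.
from difflib import SequenceMatcher

def normalise(text: str) -> str:
    """Lowercase, trim, and collapse internal whitespace."""
    return " ".join(text.lower().split())

def score_vsa(response: str, acceptable: list[str], threshold: float = 0.9) -> bool:
    """Mark correct if the response closely matches any acceptable answer."""
    resp = normalise(response)
    return any(
        SequenceMatcher(None, resp, normalise(ans)).ratio() >= threshold
        for ans in acceptable
    )

# Usage: tolerate minor spelling slips, reject different answers
answers = ["myocardial infarction", "heart attack"]
print(score_vsa("Myocardial  infarcton", answers))  # True (near match)
print(score_vsa("angina", answers))                 # False
```

In practice the threshold would need calibration against human markers; difflib's similarity ratio is only a crude stand-in for the natural language processing methods the authors have in mind.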
“…Given our previous work on VSAQs being more representative of real-life practice than MCQs, it would be interesting to establish the level of learning assessed by this form of student-generated question, by collecting both quantitative and qualitative evaluative data. 13 We hope to expand this work by incorporating this collaborative activity into the students' clinical placements. The aim of this is to promote student engagement with the curriculum and provide further opportunities for students and faculty to identify any learning and teaching gaps.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…The demerits of MCQ are many, and there is strong backing for VSAQ, but the transition, as usual, is slow to come. An electronic VSA exam platform has been developed by the UK Medical Schools Council Assessment Alliance to complement their existing SBA platform, which is already widely used by medical schools throughout the UK [24]. We are encouraged by the finding by Sam et al [25] that VSAQ format is capable of high reliability, validity, discrimination, and authenticity, while SBAQ format was associated with significant cueing.…”
Section: Discussion (citation type: mentioning; confidence: 99%)