2011
DOI: 10.1016/j.sbspro.2011.02.035
Multiple Choice and Constructed Response Tests: Do Test Format and Scoring Matter?

Cited by 38 publications (42 citation statements)
References 35 publications
“…This means that students tend to answer the question more carefully because multiple-choice questions provide several similar options. This opinion is also supported by Bradbard, Parker, and Stone, and by Jenning and Bush, as cited in Kastner and Stangl (2011), who argued that ideally a multiple-choice question consists of one question, several choices, and one correct answer, with the other choices serving only as distractors.…”
Section: Students Answer the Question Carefully
Mentioning confidence: 76%
“…There are well-studied advantages and disadvantages of both multiple-choice and free-response questions [10]. As the domain knowledge gets more complicated, it becomes more difficult to design multiple-choice tests that accurately reflect the student's level of understanding; on the other hand, free-response questions are not scalable because of the need for human graders.…”
Section: Related Work
Mentioning confidence: 99%
“…The two basic item structures used are multiple-choice items, to which students select a response, and constructed-response items, for which students construct the response themselves (Crocker & Algina, 1986; Roid & Haladyna, 1982). When deciding which item structure to use, the item type that is more suitable for the feature to be measured is recommended (Kastner & Stangl, 2011; Popham, 2008; Rodriquez, 2002; Roid & Haladyna, 1982). Therefore, the main factor to be considered is the cognitive level of the feature to be measured.…”
Section: Introduction
Mentioning confidence: 99%
“…Therefore, open-ended items are often needed in mathematics classes. Open-ended items are used in situations where students are asked to form their own answer, such as in problem solving (Kastner & Stangl, 2011; Messick, 1994; Park, 2017; Rodriquez, 2002; Roid & Haladyna, 1982). Open-ended items are beneficial when students need to plan and structure the answer to the question at hand (Haladyna, 1997).…”
Section: Introduction
Mentioning confidence: 99%