Multiple‐choice response formats are problematic because an item may be scored as solved simply because the examinee guessed the correct option. Instead of pertinent Item Response Theory models, which take guessing effects into account, this paper considers a psycho‐technological approach to re‐conceptualizing multiple‐choice response formats. The free‐response format is compared with two different multiple‐choice formats: a traditional format with a single correct response option and five distractors (‘1 of 6’), and another with five response options, three of them being distractors and two of them being correct (‘2 of 5’). For the latter format, an item is scored as mastered only if both correct response options and none of the distractors are marked. After the exclusion of a few items, the Rasch model analyses revealed appropriate fit for 188 items altogether. The resulting item‐difficulty parameters were used for comparison. The multiple‐choice format ‘1 of 6’ differs significantly from the multiple‐choice format ‘2 of 5’, while the latter does not differ significantly from the free‐response format. The lower difficulty of the ‘1 of 6’ items suggests guessing effects.
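The guessing advantage of the two formats can be quantified under a simple blind-guessing assumption not stated in the abstract: an examinee who marks the required number of options uniformly at random. The function name below is illustrative, not from the paper.

```python
from math import comb

def blind_guess_probability(n_options: int, n_correct: int) -> float:
    """Probability of scoring an item by marking n_correct options uniformly
    at random, under full-credit scoring (all correct options marked, no
    distractors marked). Only one of the C(n_options, n_correct) equally
    likely selections is fully correct."""
    return 1 / comb(n_options, n_correct)

# '1 of 6': one correct option among six -> 1/6 chance by blind guessing.
p_1_of_6 = blind_guess_probability(6, 1)

# '2 of 5': both correct options among five must be marked -> 1/10 chance.
p_2_of_5 = blind_guess_probability(5, 2)
```

Under this sketch, the ‘2 of 5’ format cuts the blind-guessing success rate from 1/6 to 1/10, which is consistent with the reported finding that ‘2 of 5’ items behave like free-response items while ‘1 of 6’ items are easier.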
The effects of varied test order within a computer test battery on test performance were investigated. An experiment was performed to determine whether completing objective personality tests sensu R. B. Cattell affects performance in subsequent cognitive ability tests, and vice versa. The sample consisted of managers of an industrial corporation (an automotive supplier) in “higher management positions” (business managers, department chiefs, and team leaders) who attended an investigation of their professional potential that resembled a real selection situation. It was hypothesized that carry-over and priming effects, as well as fatigue and learning effects, might occur. Results of a MANOVA showed a main effect of test order on the objective personality tests: “frustration tolerance” decreased and “decisiveness” increased when the objective personality tests were presented subsequent to the cognitive ability tests, while the cognitive ability tests were not affected by prior objective personality tests.