Two sets of mathematical reasoning and two sets of verbal comprehension items were cast into each of three formats (constructed response, standard multiple-choice, and Coombs multiple-choice) in order to assess whether tests with identical content but different formats measure the same attribute, except for possible differences in error variance and scaling factors. The resulting 12 tests were administered to 199 eighth-grade students. The hypothesis of equivalent measures was rejected for only two comparisons: the constructed-response measure of verbal comprehension differed from both the standard and the Coombs multiple-choice measures of this ability. Maximum likelihood factor analysis confirmed the hypothesis that a five-factor structure would give a satisfactory account of the common variance among the 12 tests. As expected, the two major factors were mathematical reasoning and verbal comprehension. Contrary to expectation, only one of the other three factors bore a (weak) resemblance to a format factor. Tests marking the ability to follow directions, recall and recognition memory, and risk-taking were included, but these variables did not correlate as expected with the three minor factors.
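In classical test theory terms, the equivalence hypothesis described above corresponds to a congeneric measurement model: each format is allowed its own scaling factor and error variance but is assumed to tap the same underlying attribute. A minimal sketch of that assumption in standard notation (the symbols $x_i$, $\mu_i$, $\lambda_i$, $\tau$, $\varepsilon_i$, and $\theta_i$ are illustrative conventions, not taken from the paper):

$$
x_i = \mu_i + \lambda_i \tau + \varepsilon_i, \qquad \operatorname{Var}(x_i) = \lambda_i^2 \sigma_\tau^2 + \theta_i,
$$

where $x_i$ is the observed score under format $i$, $\tau$ the common true score, $\lambda_i$ the format-specific scaling factor, and $\theta_i$ the error variance. Rejecting equivalence for a pair of tests, as happened for the constructed-response measure of verbal comprehension, amounts to rejecting the claim that the two formats share the same $\tau$.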