This study examined the relationship between the multiple‐choice and free‐response items contained on the College Board's Advanced Placement Computer Science (APCS) examination. Confirmatory factor analysis was used to test the fit of a two‐factor model in which each item format marked its own factor. Results showed that a single‐factor solution provided the most parsimonious fit in each of two random‐half samples. This finding might be accounted for by several mechanisms, including overlap in the specific processes assessed by the multiple‐choice and free‐response items and the limited opportunity for skill differentiation afforded by the year‐long APCS course.
SUMMARY State Education Statistics is a table of information aggregated to the state level. Its goal appears to be that of a thermometer measuring the health of the American educational system. As such, its usefulness is limited by three problems. First, some outcome variables (ACT and SAT test scores) are produced by a highly self‐selected sample of individuals, so that observed differences in outcomes can be caused by differences in performance, differences in selection ratio, and/or differences in selection rules; we were unable to disentangle these through statistical adjustments. Second, some input variables were calculated on the population as a whole (e.g., per capita income, average teachers' salary) rather than on the group characterized by the outcome variable (e.g., those individuals taking the SAT). These groups are often profoundly different, which can cause substantial biases in the estimation of statistical measures of the interrelationships among the variables. Third, some input variables are inappropriately measured or scaled for the purposes intended. For example, teachers' salaries are given in current rather than constant dollars; thus, one might conclude that over the ten years considered teachers' salaries went up 81%, rather than the more correct conclusion that, in constant dollars, they declined 20%. Among the recommendations proposed to solve these problems is an expansion of the National Assessment of Educational Progress (NAEP) to allow accurate state‐level estimates of student performance, as well as a broadening of the areas covered to more fully measure the segment of the student population intending to pursue postsecondary education.
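The current‐ versus constant‐dollar distinction above can be checked with a small sketch. The abstract's two figures (a nominal rise of 81% and a real decline of 20%) jointly imply the cumulative price inflation over the decade; the function and the derived inflation figure below are illustrative, not values reported in the summary itself.

```python
def real_change(nominal_growth, cumulative_inflation):
    """Convert a nominal growth rate to a real (constant-dollar) rate,
    given cumulative price inflation over the same period."""
    return (1 + nominal_growth) / (1 + cumulative_inflation) - 1

# Cumulative inflation implied by the abstract's own figures:
# 1.81 nominal / 0.80 real => price level rose by a factor of ~2.26
implied_inflation = (1 + 0.81) / (1 - 0.20) - 1  # about 126%

# A nominal 81% raise, deflated, is a real 20% decline
print(round(real_change(0.81, implied_inflation), 2))  # -0.2
```

The point of the example is the sign flip: stating salaries in current dollars turns a substantial real loss into an apparent large gain, which is exactly the scaling problem the summary identifies.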