Abstract: This study examined the evolution of student responses to seven contextually different versions of two Force Concept Inventory questions in an introductory physics course at the University of Arkansas. The consistency in answering the closely related questions evolved little over the seven-question exam. A model for the state of student knowledge involving the probability of selecting one of the multiple-choice answers was developed. Criteria for using clustering algorithms to extract model parameters were exp…
“…Other publications explore these and other analysis methods in greater detail, including item response theory [121,419,420], cluster analysis [421,422], Rasch model based analysis [423], concentration analysis [424], and model analysis [399].…”
Section: Development and Validation of Concept Inventories
This paper presents a comprehensive synthesis of physics education research at the undergraduate level. It is based on work originally commissioned by the National Academies. Six topical areas are covered: (1) conceptual understanding, (2) problem solving, (3) curriculum and instruction, (4) assessment, (5) cognitive psychology, and (6) attitudes and beliefs about teaching and learning. Each topical section includes sample research questions, theoretical frameworks, common research methodologies, a summary of key findings, strengths and limitations of the research, and areas for future study. Supplemental material proposes promising future directions in physics education research.
“…Fitting a model implementing the structure suggested in the original FCI paper on the set of items 1,2,3,4,7,8,12,13,14,15,17,19,20,21,25,28 …”
Section: B. Comparison With the Original FCI Model
confidence: 99%
“…The structure of student reasoning on the FCI has also been investigated by methods such as model analysis that require the input of a partial model of the concepts measured by the FCI [19]. Model analysis was later shown to be exact only in certain limiting cases [12]. For a summary of these exploratory and nonexploratory methods, see the review by Ding and Beichner [20].…”
Section: Introduction
confidence: 99%
“…These have included purely exploratory or descriptive methods such as factor analysis [7][8][9], module analysis [10], cluster analysis [11,12], item response theory [13][14][15][16], and item response curves [17,18]. The structure of student reasoning on the FCI has also been investigated by methods such as model analysis that require the input of a partial model of the concepts measured by the FCI [19].…”
Research on the test structure of the Force Concept Inventory (FCI) has largely been performed with exploratory methods such as factor analysis and cluster analysis. Multidimensional Item Response Theory (MIRT) provides an alternative to traditional exploratory factor analysis that allows statistical testing to identify the optimal number of factors. Application of MIRT to a sample of N = 4716 FCI post-tests identified a 9-factor solution as optimal. Additional analysis showed that a substantial part of the identified factor structure resulted from the practice of using problem blocks and from pairs of similar questions. Applying MIRT to a reduced set of FCI items, removing blocked items and repeated items, produced a 6-factor solution; however, the factors still had little relation to the general structure of Newtonian mechanics. A theoretical model of the FCI was constructed from expert solutions and fit to the FCI by constraining the MIRT parameter matrix to the theoretical model. Variations on the theoretical model were then explored to identify an optimal model. The optimal model supported the differentiation of Newton's 1st and 2nd laws; of one-dimensional and three-dimensional kinematics; and of the principle of the addition of forces from Newton's 2nd law. The model suggested by the authors of the FCI was also fit; the optimal MIRT model was statistically superior.
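To make the constrained-MIRT idea above concrete, here is a minimal sketch of the multidimensional 2PL item model, where zeros in the loading matrix encode the theoretical constraints. The constraint pattern, loadings, and intercepts below are invented for illustration and are not taken from the paper.

```python
import math

def mirt_2pl(theta, a_row, d):
    """Multidimensional 2PL: P(correct) = logistic(a . theta + d).
    A zero in a_row fixes that loading at 0, i.e. the theoretical
    constraint that the item does not measure that factor."""
    logit = sum(a * t for a, t in zip(a_row, theta)) + d
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical 3-item, 2-factor constraint pattern (say, factor 1 =
# Newton's 1st law, factor 2 = kinematics).
A = [
    [1.2, 0.0],   # item loads only on factor 1
    [0.0, 0.9],   # item loads only on factor 2
    [0.7, 0.5],   # item loads on both factors
]
d = [-0.2, 0.1, 0.0]  # item intercepts (easiness)

theta = [1.0, -0.5]   # one student's latent abilities
probs = [mirt_2pl(theta, A[i], d[i]) for i in range(3)]
```

Fitting the model then amounts to estimating only the unconstrained entries of A (and the intercepts d) by maximum likelihood, which is what packages such as R's `mirt` do when given a confirmatory model specification.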
“…In a recent paper, Stewart et al. (2012) analyze student responses to contextually different versions of Force Concept Inventory questions using a model analysis for the state of student knowledge and cluster analysis (ClA) methods to characterize the distribution of students' answers. The authors conclude that ClA is an effective method for extracting the underlying subgroups in student data and that additional insight may be gained from further analysis of the clustering results.…”
Many research papers have studied the problem of separating a data set into subgroups through the methods of Cluster Analysis. However, the variables and parameters involved in Cluster Analysis have not always been outlined and critiqued, especially in the field of Science Education, and that literature does not discuss a comparison between two different clustering methods. In this paper, two different Cluster Analysis methods are described, and the variables and parameters involved are discussed in order to clarify the information they can supply. The clustering results obtained with the two methods are compared and show good coherence with each other. The results are interpreted and compared with the literature, yielding more detail about the relationship between different student conceptions of modeling in physics.
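As a hedged illustration of the kind of clustering these studies describe (a minimal k-means pass, not any author's actual implementation), the sketch below partitions hypothetical answer-choice frequency vectors into subgroups; the student data are invented for the example.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: partition response vectors into k subgroups."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to the nearest center (Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # Recompute each center as the mean of its (non-empty) cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centers, clusters

# Hypothetical data: each student's fraction of A/B/C answer choices
# over a set of related items.
students = [(0.9, 0.1, 0.0), (0.8, 0.2, 0.0),
            (0.1, 0.8, 0.1), (0.0, 0.9, 0.1)]
centers, clusters = kmeans(students, k=2)
```

On this clearly separable toy data the two A-heavy students end up in one cluster and the two B-heavy students in the other; the papers above discuss how the choice of k, distance measure, and input variables affects such partitions on real response data.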
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or contrasting evidence. scite is used by students and researchers from around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.