A Framework for K‐12 Science Education (the Framework) and the Next Generation Science Standards (NGSS) emphasize the usefulness of learning progressions in aligning curriculum, instruction, and assessment to organize the learning process. The Framework defines three dimensions of science as the basis of the theoretical learning progressions described in the document and used to develop NGSS: disciplinary core ideas, scientific and engineering practices, and crosscutting concepts. The Framework defines three‐dimensional learning (3D learning) as the integration of these three dimensions to make sense of phenomena. Three‐dimensional learning leads to the development of a deep, usable understanding of big ideas that students can apply to explain phenomena and solve real‐life problems. While the Framework describes the theoretical basis of 3D learning, and NGSS outlines possible theoretical learning progressions for the three dimensions across grades, there is currently very limited empirical evidence that a learning progression for 3D learning can be developed and validated in practice. In this paper, we demonstrate the feasibility of developing a 3D learning progression (3D LP) supported by qualitative and quantitative validity evidence. We first present a hypothetical 3D LP aligned to a previously designed NGSS‐based curriculum. We then present multiple sources of validity evidence for the hypothetical 3D LP, drawn from interview analysis and item response theory (IRT) analysis. Finally, we demonstrate the feasibility of using the assessment tool designed to probe levels of the 3D LP to assign 3D LP levels to individual student answers, which is essential for the practical applicability of any LP.
This work demonstrates the usefulness of a validated 3D LP for organizing the learning process in the NGSS classroom, which is essential for the successful implementation of NGSS.
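The IRT analysis mentioned above can be illustrated with a minimal sketch. The following is a hypothetical example, not the authors' actual analysis: it fits a simple Rasch (one-parameter logistic) model to simulated dichotomous responses by joint maximum likelihood, recovering item difficulties of the kind used to order assessment items along a progression. All names and parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative Rasch-model fit on simulated data (not the study's dataset).
rng = np.random.default_rng(0)
n_students, n_items = 200, 5
true_theta = rng.normal(0, 1, n_students)   # latent student abilities
true_b = np.linspace(-1.5, 1.5, n_items)    # item difficulties, easy -> hard

# Simulate responses: P(correct) = logistic(theta - b)
p_true = 1 / (1 + np.exp(-(true_theta[:, None] - true_b[None, :])))
responses = (rng.random((n_students, n_items)) < p_true).astype(float)

def neg_log_lik(params):
    theta, b = params[:n_students], params[n_students:]
    logits = theta[:, None] - b[None, :]
    # Bernoulli log-likelihood of the 0/1 response matrix
    return -(responses * logits - np.logaddexp(0, logits)).sum()

def grad(params):
    theta, b = params[:n_students], params[n_students:]
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    resid = responses - p
    return np.concatenate([-resid.sum(axis=1), resid.sum(axis=0)])

fit = minimize(neg_log_lik, np.zeros(n_students + n_items),
               jac=grad, method="L-BFGS-B")
est_b = fit.x[n_students:]
est_b -= est_b.mean()  # anchor the scale: difficulties sum to zero
print(np.round(est_b, 2))
```

In a learning-progression context, the estimated difficulties (and corresponding Wright maps) are what allow items written for different hypothesized levels to be checked against their empirical ordering.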
Keel bone damage may be painful to birds and may affect their production. To better understand the frequency, position, and timing of keel bone damage occurring during production, keel bone integrity was evaluated in W-36 laying hens housed in enriched colony cages at 748.4 cm² (116 in²) per hen. At four time points, keel bones were evaluated in 120 birds (10 per cage; three cages in each of four rooms). Each hen was placed in a motion-limiting restraint, scanned using computed tomography (CT), fitted with a vest containing a tri-axial accelerometer, and returned to its cage for 21 d. After 21 d, the hens were rescanned and returned to their cages; this process was repeated after 133 d. The CT scans were imported into Mimics analysis software (Materialise, Plymouth, MI, USA); 3D models were made of each keel bone at each time point and exported to 3-matic analysis software (Materialise, Plymouth, MI, USA). Each laying hen's keel bone model was superimposed onto scans from multiple time points, resulting in four bone pairings representative of each 21-d period, the 133-d period, and the entire duration of the project. Next, the proximal portion of each bone pairing was edited to normalize bone shape according to a strict protocol. Additionally, each pairing was divided into three portions: the distal aspect (3 cm), the proximal aspect (2 cm), and the middle portion (the remainder). The whole-bone pairing and each bone portion were analyzed using the Part Comparison tool in 3-matic. Raw data were compiled into three datasets and analyzed in SAS 9.3 with the GLIMMIX procedure, using a three-level random-intercept model. The model controlled for time, part, part(time), and system, with random intercepts for bird(cage) and cage. Overall, results revealed that the greatest morphological changes occurred during the first 21-d period with regard to time (P = 0.03) and in the distal aspect of the keel with regard to part (P < 0.0001).
Multidimensionality and hierarchical data structure are common in assessment data. These design features, if not accounted for, can threaten the validity of the results and inferences generated from factor analysis, a method frequently employed to assess test dimensionality. In this article, we describe and demonstrate the application of the multilevel bifactor model to address these features in examining test dimensionality. The tool for this exposition is the Child Observation Record Advantage 1.5 (COR-Adv1.5), a child assessment instrument widely used in Head Start programs. Previous studies on this assessment tool reported highly correlated factors and did not account for the nesting of children in classrooms. Results from this study show how the flexibility of the multilevel bifactor model, together with useful model-based statistics, can be harnessed to judge the dimensionality of a test instrument and inform the interpretability of the associated factor scores.
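The bifactor loading pattern at the heart of the model above can be made concrete with a small numerical sketch. This is a hypothetical illustration, not the COR-Adv1.5 analysis: nine invented items each load on one general factor and exactly one of three specific factors, and the model-implied correlation matrix shows why within-cluster items correlate more strongly than between-cluster items.

```python
import numpy as np

# Bifactor loading matrix: one general factor plus three specific factors.
# Nine hypothetical items, three per specific factor; values are illustrative.
general = np.full(9, 0.6)                 # loadings on the general factor
specific = np.zeros((9, 3))
for j in range(9):
    specific[j, j // 3] = 0.4             # each item loads on ONE specific factor

loadings = np.column_stack([general, specific])   # 9 x 4 loading matrix
# With orthogonal factors, implied covariance = L L' + diagonal uniqueness.
uniqueness = 1 - (loadings ** 2).sum(axis=1)
implied = loadings @ loadings.T + np.diag(uniqueness)

# Same cluster: correlate through general AND specific factors
print(implied[0, 1])   # 0.6*0.6 + 0.4*0.4 = 0.52
# Different clusters: correlate through the general factor only
print(implied[0, 3])   # 0.6*0.6 = 0.36
```

This separation is what lets the bifactor model disentangle a strong general dimension from narrower specific dimensions, in place of the highly correlated factors reported in the earlier studies; the multilevel extension additionally splits these structures into within-classroom and between-classroom components.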