2019
DOI: 10.1002/tea.21611

Thinking beyond the score: Multidimensional analysis of student performance to inform the next generation of science assessments

Abstract: This is the author manuscript accepted for publication. It has undergone full peer review but has not been through the copyediting, typesetting, pagination, and proofreading process, which may lead to differences between this version and the Version of Record.


Citation types: 0 supporting, 6 mentioning, 0 contrasting

Year published: 2020–2024

Cited by 22 publications (6 citation statements)
References 36 publications (30 reference statements)
“…Further, developing assessments to measure not only content knowledge, but also the skills, practices, and competencies that are crucial aspects of science learning, would help us better understand whether the differences in students’ interactions have an effect on various outcome measures. Thus far, aligning and assessing the three‐dimensional learning envisioned by the NGSS has been difficult (Cardozo‐Gaibisso, Kim, Buxton, & Cohen, 2019; Fulmer, Tanas, & Weiss, 2018). In particular, few studies have examined how physical and virtual labs might differentially impact students’ learning of science practices and ability to plan and conduct experiments.…”
Section: Discussion (mentioning)
confidence: 99%
“…Rubric-based scores provide useful information regarding students’ knowledge status with respect to the objectives being measured on the test. There is also information about students’ thinking and reasoning, as reflected in their answers, however, that can be missed by the rubric-based scores alone (Cardozo-Gaibisso et al., 2020). For example, each topic could represent a set of possible misconceptions (Shin et al., 2019) or writing style.…”
Section: Discussion (mentioning)
confidence: 99%
“…Consistent with this call, research has begun to document the affordances of science assessment for cultivating MLs’ rich meaning‐making potential. Specifically, studies have examined the potential of codesigning assessment tasks with teachers (e.g., Buxton et al., 2019), conducting multilayered analyses of MLs’ responses to those tasks (e.g., Cardozo‐Gaibisso et al., 2020), and interrogating the language ideologies underlying teachers’ formative assessment practices in science classrooms (e.g., Lemmi et al., 2019). In this section, we highlight our emerging research on science assessment with MLs that examines innovative approaches to classroom‐based assessment: (a) multimodal, dynamic assessment and (b) translanguaging assessment codesign.…”
Section: Emerging Research in Science Education with MLs (mentioning)
confidence: 99%