2020
DOI: 10.1002/tea.21657
Identifying patterns of students' performance on simulated inquiry tasks using PISA 2015 log‐file data

Abstract: Previous research has demonstrated the potential of examining log-file data from computer-based assessments to understand student interactions with complex inquiry tasks. Rather than solely providing information about what has been achieved or the accuracy of student responses (product data), students' log files offer additional insights into how the responses were produced (process data). In this study, we examined students' log files to detect patterns of students' interactions with computer-based assessment…

Cited by 40 publications (27 citation statements)
References 81 publications (176 reference statements)
“…The 12 most‐used OELEs cover 76% of the papers and differ in their characteristics. Some of the OELEs are associated with standardized assessments, such as the Programme for International Student Assessment (PISA) and the National Assessment of Educational Progress (NAEP) (eg, PISA (Teig et al, 2020), NAEP (Chu & Leighton, 2019)). Furthermore, some OELEs are game‐like, wrapping the scenarios to explore in a story (eg, Crystal Island (Taub et al, 2018), VPA (Jiang et al, 2015), Energy3D (Vieira et al, 2018), TugLet (Käser et al, 2017), BioWorld (Doleck et al, 2016a)), while other OELEs are more closely associated with virtual labs (eg, MicroDYN (Greiff et al, 2016), PhET (Wang, Salehi, et al, 2021), Inq‐ITS (Gobert et al, 2012) and NetLogo (Southavilay et al, 2013)).…”
Section: Results
confidence: 99%
“…As in PISA 2015, eTIMSS also recorded student process data during the assessment as computer log files. Instead of merely providing information on whether students' responses were correct or incorrect (product data), log files offer additional insights into how the responses were produced or process data (Teig, Scherer, & Kjaernsli, 2020). The vast amount of information stored in these log-file data could open new research avenues for understanding how students interact with computer-based inquiry tasks and shine a light on why some students are more successful at solving inquiry tasks than others (Teig, 2019).…”
Section: Discussion
confidence: 99%
“…A growing body of research in science education has explored observable, behavioral indicators derived from process data to gain a better understanding of student performance in scientific inquiry. Process data from ILSAs could contribute to this recent research venue as they offer the potential of generalizability across populations within a country, while at the same time provide an opportunity for cross-country validation studies (Teig et al, 2020). By analyzing student process data, such as from eTIMSS 2019, researchers can identify different types of problem solving and inquiry strategies that underlie successful and unsuccessful performance across subjects, grades, or countries.…”
Section: Discussion
confidence: 99%
“…Article 3: Log File Teig, N., Scherer, R., & Kjaernsli, M. (2019). Identifying patterns of students' performance on simulated inquiry tasks using PISA 2015 log-file data.…”
Section: Overview of the Articles
confidence: 99%