2020
DOI: 10.1007/s11092-020-09335-7
Item position effects in listening but not in reading in the European Survey of Language Competences

Abstract: In contrast with the assumptions made in standard measurement models used in large-scale assessments, students' performance may change during the test administration. This change can be modeled as a function of item position in the case of a test booklet design with item-order manipulations. The present study used an explanatory item response theory (IRT) framework to analyze item position effects in the 2012 European Survey of Language Competences. Consistent item position effects were found for listening but not…

Cited by 3 publications (8 citation statements)
References 35 publications (41 reference statements)
“…For example, studies have found that open-response items are more likely to be omitted than multiple-choice items, and that difficult items are more likely to be omitted than easy items (Okumura, 2014; Rose et al., 2010). In item position effect studies using PISA data, the method of handling missing data is generally similar (Christiansen & Janssen, 2020; Trendtel & Robitzsch, 2018). In these studies, PISA scoring procedures were generally used: missing responses on omitted items were treated as incorrect, and all other missing responses were treated as not administered (OECD, 2009, 2012, 2014, 2017).…”
Section: Discussion
confidence: 99%
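The PISA-style scoring rule described above can be sketched in a few lines. This is a minimal illustration, not PISA's actual pipeline; the string codes "omitted" and "not_reached" are assumed placeholders for how the raw data might flag the two kinds of missingness.

```python
def score_response(raw, key):
    """Apply the PISA-style rule described above (sketch):
    - 'not_reached' (item never seen) -> None, i.e. not administered
    - 'omitted' (item seen but skipped) -> 0, i.e. incorrect
    - otherwise, compare the response to the answer key."""
    if raw == "not_reached":
        return None
    if raw == "omitted":
        return 0
    return 1 if raw == key else 0

responses = ["B", "omitted", "not_reached", "A"]
keys      = ["B", "C",       "D",           "C"]
print([score_response(r, k) for r, k in zip(responses, keys)])
# → [1, 0, None, 0]
```

Treating not-reached items as not administered (rather than incorrect) matters for item calibration: it avoids inflating the difficulty of items at the end of a booklet that many students simply never saw.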
“…Items scored with partial credit were dichotomized by scoring full credit as correct (1) and all partial credit as incorrect (0). In this study, the same procedure PISA used for item calibration was applied: missing responses on omitted items (no response) were treated as incorrect, and all other missing responses (not reached) were treated as not administered (Christiansen & Janssen, 2020; OECD, 2017; Trendtel & Robitzsch, 2018).…”
Section: Data Collection Methods
confidence: 99%
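The dichotomization step quoted above maps only the maximum (full-credit) score to 1. A small sketch, assuming `None` is used for not-administered items so that missingness is preserved through the recoding:

```python
def dichotomize(score, max_score):
    """Dichotomize a partial-credit score as described above:
    full credit -> 1, any partial credit or zero -> 0,
    None (not administered) stays missing."""
    if score is None:
        return None
    return 1 if score == max_score else 0

# A two-point item: only the full-credit score of 2 counts as correct.
print([dichotomize(s, 2) for s in [2, 1, 0, None]])
# → [1, 0, 0, None]
```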
“…A general item position effect has been investigated in many studies in the literature (Christiansen & Janssen, 2020; Debeer & Janssen, 2013; Hahne, 2008; Meyers et al., 2009; Nagy et al., 2018; Weirich et al., 2017). Demirkol & Kelecioğlu (2022) found a general item position effect in the PISA 2015 reading and mathematics data: the probability of a correct answer decreases when items are located in later positions.…”
Section: Discussion
confidence: 99%
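The general item position effect described above is commonly modeled in the explanatory IRT framework as a linear shift on the logit scale. A minimal sketch of such a model (a Rasch model with an assumed, illustrative position slope `gamma`; not the parameters estimated in any of the cited studies):

```python
import math

def p_correct(theta, b, position, gamma=-0.02):
    """Rasch model with a linear item position effect:
    logit P(correct) = theta - b + gamma * (position - 1).
    gamma < 0 encodes the finding that items become effectively
    harder when placed later in the booklet (illustrative value)."""
    logit = theta - b + gamma * (position - 1)
    return 1.0 / (1.0 + math.exp(-logit))

# Same person (theta=0) and item (b=0), administered early vs. late:
print(round(p_correct(0.0, 0.0, 1), 3))   # → 0.5
print(round(p_correct(0.0, 0.0, 30), 3))  # → 0.359
```

In this parameterization, the position effect is separated from item difficulty, which is what the booklet designs with item-order manipulations make identifiable.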
“…The item position effect differed among students; that is, students were not exposed to the same level of item position effect (Christiansen & Janssen, 2020; Debeer & Janssen, 2013; Demirkol & Kelecioğlu, 2022). These results raised the question of which individual characteristics might be related to the item position effect, and the relationship between different student characteristics and the item position effect was investigated (Smouse & Munz, 1968; Nagy et al., 2018; Qian, 2014; Weirich et al., 2017; Wu et al., 2019).…”
Section: Introduction
confidence: 99%