2016
DOI: 10.12738/estp.2016.1.0329
Examining Differential Item Functions of Different Item Ordered Test Forms According to Item Difficulty Levels

Abstract: The study aims to examine whether differential item functioning is displayed in three test forms with different item orders (random, easy-to-hard, and hard-to-easy), using Classical Test Theory (CTT) and Item Response Theory (IRT) methods and taking item difficulty levels into account. In this correlational study, data from 578 seventh graders were gathered with an Atomic Structures Achievement Test. The R programming language and the "difR" package were employed for all the…

Cited by 6 publications (7 citation statements); references 8 publications.
“…Just as in the first finding, the effect of test anxiety when students face DE- and R-arranged items may explain the differences in students' performance in Mathematics. This finding corroborates previous studies such as Soureshjani (2011), Çokluk et al. (2016), Hauck et al. (2017), and Owan et al. (2020b), which document that the order in which test items are arranged affects student performance.…”
Section: Discussion (supporting)
confidence: 92%
“…Nevertheless, it should be noted that, as a topic for future investigation, it would be interesting to test the measurement invariance of the M-CTS to ensure suitable group comparisons between men and women, or between age groups. We strongly recommend implementing specific statistical procedures to test for differential item functioning, such as logistic regression based on Classical Test Theory or Lord's chi-square test based on Item Response Theory (Çokluk et al., 2016).…”
Section: Discussion (mentioning)
confidence: 99%
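The two procedures recommended above (logistic regression DIF under CTT and Lord's chi-square under IRT) are both available in the "difR" package named in the abstract. A minimal sketch, using difR's bundled `verbal` example dataset rather than the study's own data, which is not available here:

```r
# DIF screening with the difR package (difLogistic and difLord are real
# difR exports; the data and grouping below are illustrative only).
library(difR)

data(verbal)                 # example dataset shipped with difR
resp  <- verbal[, 1:24]      # dichotomous (0/1) item responses
group <- verbal$Gender       # grouping variable; 1 is treated as the focal group

# CTT-side procedure: logistic regression DIF (uniform and non-uniform)
res_logistic <- difLogistic(Data = resp, group = group, focal.name = 1)

# IRT-side procedure: Lord's chi-square under a 1PL model
res_lord <- difLord(Data = resp, group = group, focal.name = 1, model = "1PL")

print(res_logistic)
print(res_lord)
```

In a test-form study such as Çokluk et al. (2016), `group` would instead encode which form (e.g., easy-to-hard vs. hard-to-easy) each examinee received.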
“…There are numerous studies in the related literature on the effects of item position (IP) on psychometric item characteristics (Hambleton, 1968; Hambleton & Traub, 1974; Kleinke, 1980; Klosner & Gellman, 1973; Leary & Dorans, 1985; Lee, 2007; Newman et al., 1988; Perlini et al., 1998). However, there are fewer studies on whether using different forms or booklets in achievement exams leads to psychometric problems such as DIF. Among these, some focus on item-order effects by ordering items from easy to difficult, difficult to easy, or randomly according to the item difficulty index (Balta & Omur Sunbul, 2017; Çokluk et al., 2016; Freedle & Kostin, 1991; Plake et al., 1988; Ryan & Chiu, 2001), while others focus on IP effects (Avcu et al., 2018; Bulut, 2015; Erdem, 2015). Ryan and Chiu (2001) developed two 40-item forms covering the topics they had addressed, namely algebra, trigonometry, geometry, and analytic geometry.…”
Section: Differential Item Functioning Based on Position Effects (mentioning)
confidence: 99%