2008
DOI: 10.1103/physrevstper.4.010102

Measuring student learning with item response theory

Abstract: We investigate short-term learning from hints and feedback in a Web-based physics tutoring system. Both the skill of students and the difficulty and discrimination of items were determined by applying item response theory (IRT) to the first answers of students working on for-credit homework items in an introductory Newtonian physics course. We show that after tutoring a shifted logistic item response function with lower discrimination fits the students' second responses to an item previously answered i…
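For readers unfamiliar with the model the abstract refers to, the following is a minimal Python sketch of the standard two-parameter logistic (2PL) item response function. The parameter names (theta for student skill, b for item difficulty, a for discrimination) follow conventional IRT notation, and all numeric values are illustrative assumptions, not quantities taken from the paper.

    import math

    # 2PL item response function: probability that a student of skill
    # theta answers correctly an item with discrimination a and difficulty b.
    def p_correct(theta, a, b):
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # Illustrative only: the paper reports that second responses after
    # tutoring are fit by a shifted logistic curve with lower discrimination;
    # the skill shift (+0.8) and parameter values here are made-up examples.
    p_first = p_correct(theta=0.0, a=1.2, b=0.5)
    p_second = p_correct(theta=0.8, a=0.7, b=0.5)
    print(f"first attempt: {p_first:.2f}, second attempt: {p_second:.2f}")

The Rasch model mentioned in one of the citing works below is the special case of this function with a = 1.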


Cited by 35 publications (33 citation statements)
References 5 publications
“…What features of Web-based homework systems successfully enhance students' problem solving skills? Several instructional strategies have been developed and tested, including the use of alternate types of problems [110][111][112][113][114], adopting an explicit problem-solving framework (i.e., a consistent sequence of problem solving steps) [115], conceptual approaches [116][117][118], cooperative group problem solving [119], and computer homework or tutor systems to help students become better problem solvers [71,120,121].…”
Section: Evaluating the Effectiveness of Instructional Strategies For… (citation type: mentioning)
confidence: 99%
“…Other publications explore these and other analysis methods in greater detail, including item response theory [121,419,420], cluster analysis [421,422], Rasch model based analysis [423], concentration analysis [424], and model analysis [399].…”
Section: Development and Validation of Concept Inventories (citation type: mentioning)
confidence: 99%
“…This comparison is blunted by the fact that typically 19% of the first responses to a question were preceded by reference to in-course resources, about a 1:1 ratio with the percentage of wrong answers. (Previously we found this ratio to be 1:3, in spite of a penalty that served to discourage students from giving wrong answers [Lee, Palazzo, Warnakulasooriya, & Pritchard, 2008].) Further investigation shows several differences between student behavior on the pre- and posttests.…”
Section: Pre- and Posttest Results (citation type: mentioning)
confidence: 69%
“…Simply telling a student that his answer is wrong does not help him perform much better on a second attempt [37]. Related to this, questions in the traditional test, although covering the same content, did not reappear in the game-based assessment. As a result, a learning effect might have been present although not observed, due to the absence of second attempts.…”
Section: Score Traditional Test (citation type: mentioning)
confidence: 98%