2017
DOI: 10.1002/jee.20162

Creating an Instrument to Measure Student Response to Instructional Practices

Abstract: Background: Calls for the reform of education in science, technology, engineering, and mathematics (STEM) have inspired many instructional innovations, some research-based. Yet adoption of such instruction has been slow. Research has suggested that students' response may significantly affect an instructor's willingness to adopt different types of instruction. Purpose: We created the Student Response to Instructional Practices (StRIP) instrument to measure the effects of several variables on student response to in…

Cited by 70 publications (92 citation statements)
References 70 publications

“…4) Student Survey: In this paper, an end-of-semester survey (N = 57) was used to collect data about students' instructional preferences and response to active learning. The survey was developed as part of a larger study examining student response to active learning [54]. The survey asked students how they reacted to specific types of active learning and whether they preferred more or less of the activities.…”
Section: B. Data Sources (mentioning)
confidence: 99%
“…Previous work highlighted the reliability of the passive lecture construct across pilot and final survey data (DeMonbrun et al., 2017). In this data set, passive lecture had a construct reliability of 0.71 (two items) and was reliable across the 17 courses.…”
Section: Quantitative Methods (mentioning)
confidence: 85%
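For context on the construct reliability figure quoted above: a common way to estimate construct (composite) reliability from a confirmatory factor analysis is the formula below, given here as a general sketch under the assumption of standardized loadings and uncorrelated errors, not as the exact estimator used in the cited study.

\[
\rho_c = \frac{\left(\sum_{i=1}^{k}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{k}\lambda_i\right)^{2} + \sum_{i=1}^{k}\left(1-\lambda_i^{2}\right)}
\]

Here the \(\lambda_i\) are the standardized loadings of the \(k\) items on the construct (\(k = 2\) for the passive lecture scale mentioned above), and \(1-\lambda_i^{2}\) are the corresponding error variances; values around 0.70 or higher are conventionally read as acceptable reliability.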
“…The StRIP survey instrument was designed, validated, and piloted in our previous studies (DeMonbrun et al., 2017; Nguyen, Borrego, et al., 2016; Nguyen, Shekhar, et al., 2016; Shekhar et al., 2015). Through revisions and classroom observations (Shekhar et al., 2015), we chose 14 types of instruction that capture what teaching methods are occurring in engineering classrooms.…”
Section: Methods (mentioning)
confidence: 99%
“…Responses for affective measures were also initially based on previous research [8], but we modified them to accurately capture themes in the full texts. Our pair-coding resulted in the following changes: (1) satisfaction and course evaluation were combined, and refer to how students rated the class or instructor (because most papers did not distinguish these ideas); (2) students' self-reported learning was considered an affective measure, not a cognitive one, because it defines what students felt they learned; and (3) use of the term motivation was broken down into motivation to participate, attend class, continue in STEM, or another meaning.…”
Section: Coding Full Texts (mentioning)
confidence: 99%