Classroom response systems (CRSs) can be potent tools for teaching physics. Their efficacy, however, depends strongly on the quality of the questions used. Creating effective questions is difficult, and differs from creating exam and homework problems. Every CRS question should have an explicit pedagogic purpose consisting of a content goal, a process goal, and a metacognitive goal. Questions can be engineered to fulfil their purpose through four complementary mechanisms: directing students' attention, stimulating specific cognitive processes, communicating information to instructor and students via CRS-tabulated answer counts, and facilitating the articulation and confrontation of ideas. We identify several tactics that help in the design of potent questions, and present four "makeovers" showing how these tactics can be used to convert traditional physics questions into more powerful CRS questions.
Classroom response systems (CRSs) are a promising instructional technology, but most literature on CRS use fails to distinguish between technology and pedagogy, to define and justify a pedagogical perspective, or to discriminate between pedagogies. Technology-enhanced formative assessment (TEFA) is our pedagogy for CRS-based science instruction, informed by experience and by several traditions of educational research. In TEFA, four principles enjoin the practice of question-driven instruction, dialogical discourse, formative assessment, and meta-level communication. These are enacted via the question cycle, an iterative pattern of CRS-based questioning that can serve multiple instructional needs. TEFA should improve CRS use and help teachers "bridge the gap" between educational research findings and practical, flexible classroom strategies for science instruction.
The purpose of this study is to uncover and understand the factors that affect secondary science and mathematics teachers' initial implementation of Technology-Enhanced Formative Assessment (TEFA), a pedagogy developed for teaching with classroom response system (CRS) technology. We sought to identify the most common and strongest factors, and to understand the general process of how teachers adopt TEFA. We identified ten main hindering factors reported by teachers, and found that time limitations and question development difficulties are reported as the most problematic. In this paper we provide five vignettes of teachers' initial implementation experiences, illustrating different courses that TEFA adoption can follow. We classify our ten factors into four groups: contextual factors that directly hinder teachers' attempts to implement TEFA (extrinsic type I); circumstances that affect teachers' teaching in general (extrinsic type 0); gaps that teachers have in the knowledge and skills they need to adopt TEFA (intrinsic type I); and ways of being a teacher that describe teachers' deeper perspectives and beliefs, which may be consonant or dissonant with TEFA (intrinsic type II). Finally, we identify four general categories that describe the teachers' initial TEFA implementation.
Technology-Enhanced Formative Assessment (TEFA) is an innovative pedagogy for science and mathematics instruction. The 'Teacher Learning of TEFA' research project studies teacher change as in-service secondary science and mathematics teachers learn TEFA in the context of a multi-year professional development programme. Applying cultural-historical activity theory (CHAT) to the linked activity systems of professional development and teachers' classroom practice leads to a model of teacher learning and pedagogical change in which TEFA is first introduced into classrooms as an object of activity, then made useful as a tool for instruction, and then, in rare cases, incorporated into all elements of a deeply transformed practice. Different levels of contradiction within and between activity systems drive the transitions between stages. A CHAT analysis suggests that the primary contradiction within secondary education is a dual view of students as objects of instruction and as willful individuals; the difficulties arising from this contradiction can either inhibit or motivate TEFA adoption.
Traditional problem-based exams are not efficient instruments for assessing the "structure" of physics students' conceptual knowledge or for providing diagnostically detailed feedback to students and teachers. We present the Free Term Entry task, a candidate assessment instrument for exploring the connections between concepts in a student's understanding of a subject. In this task, a student is given a general topic area and asked to respond with as many terms from that area as possible in a given time; the "thinking time" between successive term-entry events is recorded along with the response terms. The task was given to students from two different introductory physics classes. Thinking times were found to correlate with the strength of the association between consecutively entered concepts. In addition, sets of thinking times from the task show distinct, characteristic patterns that might prove valuable for student assessment. We propose a quantitative dynamical model, the Matrix Walk Model, that matches many aspects of the observed data. One particular feature of the data (a distinct "spike" superimposed on the otherwise log-normal distribution of most thinking-time sets) has not been fit. The spike, the other patterns observed in the data, and the proposed phenomenological model would all benefit from a grounding in cognitive theory.
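The log-normal character of a thinking-time set is straightforward to check, because the maximum-likelihood parameters of a log-normal are just the mean and standard deviation of the log-transformed data. The following sketch is illustrative only: it uses simulated times and hypothetical variable names, not the study's actual data or the Matrix Walk Model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated inter-term "thinking times" in seconds, drawn log-normally to
# mimic the heavy-tailed gaps between term entries described above.
mu_true, sigma_true = 1.0, 0.6          # parameters of log(time)
times = rng.lognormal(mean=mu_true, sigma=sigma_true, size=5000)

# MLE for a log-normal: mean and std of the logs of the observations.
log_t = np.log(times)
mu_hat, sigma_hat = log_t.mean(), log_t.std()

print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```

A real analysis would then compare the fitted curve against the empirical histogram, where the reported "spike" would appear as a localized excess the log-normal cannot absorb.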
In a two-phase exam, students take the same exam twice: first individually, and a second time working in teams. Proponents hope that during the team phase, students will discuss, debate, and resolve questions by sharing their reasoning, challenging each other, and reaching consensus. Potential adopters fear that students might uncritically follow the majority answer or mimic one dominant team member. To explore this empirically, I data-mined students' solo- and team-phase responses from the final exams of three different introductory physics courses to construct multiple measures of team dynamics. My results substantiate prior findings that teams do engage in meaningful debate and explore the virtues of various possible answers. The two-phase exam implementation used does not force teams to submit a common answer, allows students to "hedge their bets" for partial credit, and incentivizes helping teammates.
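One simple measure of team dynamics in such data is the rate at which a team converges on an answer that was not the solo-phase majority, since pure majority-following would drive this rate toward zero. The sketch below is a hypothetical illustration, not the paper's actual metric or data; the matrices, function names, and tie-breaking rule are all assumptions.

```python
import numpy as np

# Hypothetical answer matrices (rows = questions, cols = team members);
# entries are multiple-choice selections 0-3.  Because the implementation
# described above does not force a common team answer, `team` has the
# same shape as `solo`.
solo = np.array([[0, 0, 2],
                 [1, 3, 3],
                 [2, 2, 2],
                 [0, 1, 2]])
team = np.array([[2, 2, 2],
                 [3, 3, 3],
                 [2, 2, 2],
                 [1, 1, 2]])

def modal(row):
    """Most common answer in a row (ties broken by lowest choice)."""
    vals, counts = np.unique(row, return_counts=True)
    return vals[counts.argmax()]

def minority_win_rate(solo, team):
    """Fraction of questions on which the team converged on an answer
    other than the solo-phase majority -- a crude sign that discussion,
    not simple majority rule, drove the team phase."""
    wins = sum(modal(team[q]) != modal(solo[q]) for q in range(len(solo)))
    return wins / len(solo)

print(minority_win_rate(solo, team))  # 2 of 4 questions here
```

A study would combine several such measures, since this one alone cannot distinguish productive debate from deference to a persuasive but wrong teammate.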
Audience response systems (ARS) are a tool, not a magic bullet. How they are used, and how well they are integrated into a coherent pedagogical approach, determines how effective they are. Question Driven Instruction (QDI) is a radical approach in which an ARS-mediated “question cycle” organizes classroom instruction, replacing the “transmit and test” paradigm with an iterative process of question posing, deliberation, commitment to an answer, and discussion. It is an implementation of “real-time formative assessment.” In QDI, an ARS is used to facilitate and direct discussion, to engage students in active knowledge-building, and to support “agile teaching” by providing the instructor with constant feedback about students’ evolving understanding and difficulties. Class time is used primarily for interactively developing understanding, rather than for presenting content: in QDI, an instructor is more an engineer of learning experiences than a dispenser of knowledge. This requires new teaching skills, such as moderating discussion and managing the classroom dynamic, interpreting students’ statements and modeling their learning, making real-time teaching decisions, and designing ARS questions that teach rather than test and that target process as well as content. Above all, it requires understanding and communicating that ARS use is diagnostic and instructional, rather than evaluative.
UNCG has an innovative Learning Assistant (LA) program, in which upper-class undergraduate physics majors teach laboratory sections of the introductory calculus-based physics sequence. The lecture section's professor provides supervision and determines the overall learning objectives and structure of the labs, but the team of LAs develop the detailed lesson plans, write up all handouts and quizzes, conduct the lab sessions, and evaluate student work. This gives the LAs a genuine voice in planning and teaching, and increases the authenticity of the teaching experience. In order to investigate the impact of this teaching experience upon physics majors, we interviewed five current and former LAs. We analyzed the interview transcripts via emergent thematic analysis to identify the most prevalent impacts, and then viewed the results through the lens of professional identity development. We claim that the LA experience helps grow three aspects of physics majors' professional identity: their sense of themselves as a physics teacher, as a physics student, and as a member of a community of practice.