2021
DOI: 10.1007/s10648-020-09588-0

A Complete SMOCkery: Daily Online Testing Did Not Boost College Performance

Abstract: In an article published in an open-access journal, Pennebaker et al. (PLoS One, 8(11), e79774, 2013) reported that an innovative computer-based system that included daily online testing resulted in better student performance in other concurrent courses and a reduction in achievement gaps between lower and upper middle-class students. This article has had high impact, not only in terms of citations, but also in launching a multimillion-dollar university project and numerous synchronous massive online courses (SM…

Cited by 2 publications (2 citation statements)
References 11 publications
“…If a high-performing subset of students use an analytical service more than the average student, correlational statistics would fallaciously imply that the service causes improvements in performance, and this correlation would be even stronger if occasional use harms average or low-performing students. Similarly, if an analytical service causes students to withdraw from a course or an institution, and withdrawing also removes these students from analysis, paradoxically this harmful service may appear to improve student performance when benchmarked against comparison groups (attrition bias; e.g., Robinson, 2021), as may have been the case with the traffic signal example above (Straumsheim, 2013). In this way, we imagine institutional analytics initiatives not only incanting dangerous spells, but potentially doing so blindfolded, or worse: under the mistaken impression that the spell is working effectively.…”
Classified as: mentioning (confidence: 99%)