The Implicit Association Test (IAT) is widely used in psychology. Unfortunately, the IAT cannot be run within online surveys, requiring researchers who conduct online surveys to rely on third-party tools. We introduce a novel method for constructing IATs using online survey software (Qualtrics); we then empirically assess its validity. Study 1 (student n = 239) found good psychometric properties, expected IAT effects, and expected correlations with explicit measures for survey-software IATs. Study 2 (MTurk n = 818) found predicted IAT effects across four survey-software IATs (d's = 0.82 [Black-White IAT] to 2.13 [insect-flower IAT]). Study 3 (MTurk n = 270) compared survey-software IATs and IATs run via Inquisit, yielding nearly identical results and intercorrelations expected for identical IATs. Survey-software IATs appear reliable and valid, offer numerous advantages, and make IATs accessible for researchers who use survey software to conduct online research. We present all materials, links to tutorials, and an open-source tool that rapidly automates survey-software IAT construction and analysis.
Purpose
This paper aims to provide a targeted overview of relevant digital equity gap literature that serves to contextualize the current crisis brought on by the COVID-19 pandemic. Following this review of the literature, the author introduces five guidelines that educators can use to guide their decisions about how to adapt to remote learning. It concludes with an overview and full text of two tools educators and researchers can use to better understand the challenges faced by students: the Digital Equity Gap Interview Protocol and the Digital Equity Gap Survey Instrument.
Design/methodology/approach
This conceptual paper is grounded in the theoretical framework of Martha Nussbaum's "Capability Approach," which outlines core human capabilities that, if fostered, enable individuals to generate valuable outcomes for themselves.
Findings
The author suggests that attending to human capabilities is important when addressing digital equity gaps exacerbated by the pandemic, and provides two tools intended to help individuals gather important information about the communities they serve and/or study.
Research limitations/implications
Both tools provide descriptive information that will contextualize digital equity gaps, should they be present.
Practical implications
This paper provides concrete tools for educators who wish to understand digital equity gaps within the communities they serve.
Social implications
In a time of unprecedented distance learning, it is important for both K-12 educators and higher education instructors to understand the technological capabilities of their students. The Digital Equity Gap Interview Protocol and the Digital Equity Gap Survey Instrument give them a place to start.
Originality/value
This paper fulfills an identified need to study and address digital equity gaps.
This paper reports findings from the implementation of a learning analytics-powered Early Warning System (EWS) by academic advisors who were novice users of data-driven learning analytics tools. The information collected from these users sheds new light on how student analytic data might be incorporated into the work practices of advisors working with university students. Our results indicate that advisors predominantly used the EWS during their meetings with students, despite it being designed as a tool for preparing for meetings and identifying students who are struggling academically. This introduction of an unintended audience has significant design implications for learning analytics innovations.
This paper describes research and development around two gameful courses that reimagined their assessment systems to better support student autonomy and promote engagement. We present results from an ongoing classroom-based research study that signals the success of these designs and, in so doing, explore key elements of what we call gameful design: the process of redesigning core elements of a learning environment to better support intrinsic motivation. We describe this process and discuss a set of promising practices for the design of gameful courses. Results from three studies indicate that gameful course design is positively related to students working harder and feeling more in control of their class performance.