College students lack fact-checking skills, which may lead them to accept information at face value. We report findings from an institution participating in the Digital Polarization Initiative (DPI), a national effort to teach students lateral reading strategies used by expert fact-checkers to verify online information. Lateral reading requires users to leave the website presenting the information to find out whether someone has already fact-checked the claim, identify the original source, or learn more about the individuals or organizations making the claim. Instructor-matched sections of a general education civics course implemented the DPI curriculum (N = 136 students) or provided business-as-usual civics instruction (N = 94 students). At posttest, students in DPI sections were more likely to use lateral reading to fact-check and correctly evaluate the trustworthiness of information than controls. Aligning with the DPI’s emphasis on using Wikipedia to investigate sources, students in DPI sections reported greater use of Wikipedia at posttest than controls, but did not differ significantly in their trust of Wikipedia. In DPI sections, students who failed to read laterally at posttest reported higher trust of Wikipedia at pretest than students who read at least one problem laterally. Responsiveness to the curriculum was also linked to the number of online assignments attempted, but unrelated to pretest media literacy knowledge, observed lateral reading, or self-reported use of lateral reading. Further research is needed to determine whether improvements in lateral reading are maintained over time and to explore other factors that might distinguish students whose skills improved after instruction from non-responders.
This study examined how undergraduate students (N = 153, M age = 19.3 years, 58.8% female) in an Introductory Psychology course experienced the transition to fully online instruction during the COVID-19 outbreak in New York City during spring 2020. We examined predictors of online submission of assignments throughout the semester and students' attitudes toward online learning at the end of the semester. Students tended to report the transition to remote instruction as disruptive to their learning. Students with more positive attitudes toward unmitigated in-class use of personal technologies at the start of the semester reported higher rates of digital multitasking while working from home and more negative attitudes about the course transition. Underrepresented minority (URM) students held more negative attitudes about the transition, while gender and URM status were associated with variation in students' submission of online assignments at specific times. Responses to an end-of-semester question about challenges faced while learning online suggested that students experienced multiple challenges, especially staying motivated and focused and maintaining adequate access to the internet and internet-enabled devices. Students' self-efficacy for learning online predicted numbers of assignments submitted. Students in a larger section submitted fewer assignments and had lower exam grades than those in a smaller section. Few other factors explained variation in assignment submissions or exam grades. The findings elucidate differences in how students experienced the abrupt transition to remote instruction, thus informing efforts toward more equitable access to education in the midst of an ongoing crisis.
College students, and adults in general, may find it hard to identify trustworthy information amid the proliferation of false news and misinformation about the COVID-19 pandemic. In Fall 2020, college students (N = 221) in an online general education civics course were taught through asynchronous assignments how to use lateral reading strategies to fact-check online information. Students improved from pretest to posttest in the use of lateral reading to fact-check information; lateral reading was predicted by the number of assignments completed and students’ reading comprehension test scores. Students reported greater use, endorsement, and knowledge of Wikipedia at posttest, aligning with the curriculum’s emphasis on using Wikipedia to investigate information sources. Students also reported increased confidence in their ability to fact-check COVID-19 news. While confidence was related to perceived helpfulness of the assignments, it was only weakly associated with lateral reading. Findings support the effectiveness of the online curriculum for improving fact-checking.
This study investigated factors contributing to Introductory Psychology students' success in remote online learning during the coronavirus disease (COVID-19) pandemic. Building on the composite persistence model, we used learning outcomes assessment data to examine student characteristics (demographics), skills, and internal and external factors as predictors of performance of diverse students (N = 1,270) enrolled at an open-enrollment, Hispanic-serving institution in Fall 2020. In keeping with prepandemic national trends, Hispanic/Latinx and Black/African American students performed worse across outcomes (pass/fail, homework submission, quiz scores, and test grades), as did males on most outcomes. Reading comprehension skill predicted all outcomes over and above student characteristics. For internal factors, greater perceived difficulty of transitioning to online learning adversely impacted most outcomes. Self-reported digital multitasking was unrelated to outcomes; only 25% of students identified difficulties with attention and motivation as a learning challenge. For external factors, the use of handheld devices to complete homework was associated with worse outcomes. About half (49%) identified digital access as a challenge, though identifying this challenge was unrelated to outcomes. Additionally, students in larger sections (≥119 students) performed worse. Students who noted specific challenges (e.g., digital access, disruptive environments) at the start of the semester tended to be from demographic groups (e.g., females) with better outcomes. Cumulatively, these factors accounted for only 10.4%-16.5% of variance in course outcomes, suggesting that other factors beyond the present study's scope (e.g., student, family, and community mental and physical health) may have also impacted learning during the pandemic.