2010
DOI: 10.18438/b85k7t
Learning in Simulations: Examining the Effectiveness of Information Literacy Instruction Using Middle School Students’ Portfolio Products

Abstract: Objective – This study compared the effectiveness of simulation-based and didactic instructional approaches in improving students’ understanding of information literacy (IL) concepts and practices. Methods – The instructional approaches were implemented with two groups of middle school students (i.e., seventh and eighth grades) over a 4-week period. During the implementation period, all students were required to maintain a portfolio of their work. The portfolios were designed to capture students’ …

Cited by 7 publications (3 citation statements) | References 27 publications
“…Although several studies have introduced situational task assessment and portfolio assessment as alternative tools for process evaluation in recent years, the process data collected by situational task assessment are of a single type and coarse granularity, which is relatively inadequate for providing comprehensive evidence of students' digital literacy [20,21]. Establishing scientific and objective evaluation standards for portfolio assessment is challenging; it requires teachers to invest a great deal of time and effort, which limits the application of portfolio assessment in evaluating students' digital literacy [22,23]. Therefore, these two assessment methods could not accurately evaluate students' digital literacy [24].…”
Section: Introduction (mentioning)
confidence: 99%
“…This finding aligns with Walsh’s (2009) article reviewing assessment methods for information literacy, which features two articles employing simulation, one paper-based and one online. The latter was an online simulation-based information literacy instructional approach used to guide middle school students through the information-seeking process (Newell, 2010). An additional search of the literature revealed two more studies that explicitly used online simulation.…”
Section: Introduction (mentioning)
confidence: 99%
“…This may be about the "best" type of evidence for a particular question, or it may be the "best" type of evidence that can be collected by a particular library at a particular time. For example, EBLIP has published systematic reviews (Koufogiannakis & Wiebe, 2006), correlational studies (Eng & Stadler, 2015), quantitative analyses (Newell, 2010), and qualitative studies (Rankin, 2012). Over time, as a profession, we can look at our evidence base and seek to improve it, but the evidence needs to fit the question and the context; in the meantime, we need to use the best evidence we can find to inform professional decision making.…”
mentioning
confidence: 99%