2011
DOI: 10.1353/pla.2011.0035

All Together Now: Getting Faculty, Administrators, and Staff Engaged in Information Literacy Assessment

Abstract: Trinity University has established effective strategies for engaging faculty, administrators, and staff in information literacy instruction and assessment. Succeeding in an area in which many libraries struggle, the Coates Library at Trinity University offers a model for libraries seeking to actively engage their campuses through 1) establishing a common definition of information literacy; 2) developing workshops and grants; and 3) engaging in campus-wide information literacy assessment using rubrics. Furtherm…

Cited by 41 publications (23 citation statements)
References 17 publications

Citation statements (ordered by relevance):
“…A rubric (Appendix B) was used to measure the open-ended questions, but with limited experience in designing and using rubrics, a review of the literature was a necessary first step (Brown, 2008; Crowe, 2010; Daniels, 2010; Diller & Phelps, 2008; Fagerheim & Shrode, 2009; Gardner & Acosta, 2010; Knight, 2006; Oakleaf, 2008, 2009a, 2009b; Oakleaf, Millet & Kraus, 2011). In the rubric design, aligning the criteria to the objectives of the first-year information literacy curriculum provided the framework within which to craft the measures.…”
Section: Methods (mentioning)
confidence: 99%
“…22 However, this body of research presents many challenges: the large number of instruction delivery formats (one-shot sessions, for-credit courses, online tutorials) and methodologies (standardized tools, citation analyses, pretest/posttest, self-assessment and user-satisfaction surveys), as well as the validity and reliability of the instruments. 23 As Sobel and Sugimoto put it, Many studies … are focused on change; that is, an increase in scores from one instance of testing to the next …. The time between testing can vary, from the start and finish of a one-hour instruction session, to the start and finish of an undergraduate career.…”
Section: Instruction Effects (mentioning)
confidence: 99%
“…[A]lthough these measures (e.g., multiple choice, true/false) can be used to establish benchmarks of knowledge or to provide a snapshot of performance at a certain point in a student's academic career, they are not necessarily linked to performance objectives, and do not demonstrate how well a student has actually learned to navigate through a search strategy process to find, evaluate, use, and apply information to meet a specific need (p. 193). In another article relating to Trinity's IL-related QEP, Oakleaf, Millet, and Kraus (2011) discussed their process of developing rubrics to assess student IL. They also claimed there is little or no literature on collaborative assessment of IL.…”
Section: IL Assessment (mentioning)
confidence: 99%