2017
DOI: 10.1108/jd-08-2016-0099
A framework for designing retrieval effectiveness studies of library information systems using human relevance assessments

Abstract: Purpose – The purpose of this paper is to demonstrate how to apply traditional information retrieval (IR) evaluation methods based on standards from the Text REtrieval Conference and web search evaluation to all types of modern library information systems (LISs), including online public access catalogues, discovery systems, and digital libraries that provide web search features to gather information from heterogeneous sources. Design/methodology/approach – The authors apply conventional procedures from IR evaluat…

Cited by 11 publications (6 citation statements) | References 48 publications
“…One of the studies reviewed for designing this questionnaire is the Mexican Library Association's study that focused on assessing information retrieval from the point of view of users and the extent to which they meet their needs (Behnert & Lewandowski, 2017). Then Beck study (2005), which focused on some important features and characteristics of digital libraries such as: reliability of retrieved information, citation, integrity and objectivity, site maintenance, and the extent of coverage of available information.…”
Section: Methodology of the Study
confidence: 99%
“…One common alternative to individual assessments of relevancy is to rely on the "wisdom of crowds" either through user studies or via click-through rates, which appears to improve the stability of relevancy judgments (Zhitomirsky-Geffet et al., 2016; Zhitomirsky-Geffet et al., 2018). Studies evaluating the information retrieval of library discovery tools have often relied on user observation in laboratory settings, log analysis or comparison of search results to student citations in particular projects or courses (Behnert and Lewandowski, 2017; Galbreath et al., 2021). None of these options were feasible for this study.…”
Section: Limitations
confidence: 99%
“…In usability studies for IR systems, the most used evaluation protocol is to provide the users with a number of information problems and ask them to solve these problems using the search system at hand. A questionnaire is used after the process to assess their satisfaction (Spink, 2002; Behnert and Lewandowski, 2017; Rico et al., 2019).…”
Section: Related Work in Usability Studies
confidence: 99%