2008
DOI: 10.1016/j.ipm.2007.01.024

On the role of user-centred evaluation in the advancement of interactive information retrieval

Abstract: This paper discusses the role of user-centred evaluations as an essential method for researching interactive information retrieval. It draws mainly on the work carried out during the Clarity Project, where different user-centred evaluations were used during the lifecycle of a cross-language information retrieval system. The iterative testing was not only instrumental to the development of a usable system, but it enhanced our knowledge of the potential, impact, and actual use of cross-language information retrie…

Cited by 35 publications (28 citation statements)
References 42 publications
“…As a first step we want to exploit benchmarks to evaluate detailed implementation solutions, for example different algorithms that assess the relevance of tags for situations and resources. After that, we plan to apply an IIR evaluation methodology, involving users in a controlled environment, following the ideas presented in [1,10]. Finally, a broader user-centred evaluation will help us to understand whether the sCAB is effective in the real world.…”
Section: Discussion (confidence: 99%)
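The staged plan in that excerpt (benchmark first, then a controlled IIR study, then a field study) can be made concrete with a small batch-evaluation harness. The sketch below is not from the cited work: the scoring functions, the toy data, and precision at k as the benchmark measure are all assumptions for illustration.

```python
# A minimal sketch, not from the cited work: screen candidate tag-relevance
# scorers against gold-standard judgements before any user-centred testing.
# Function names, the toy data, and precision@k as the measure are assumptions.

def precision_at_k(ranked_tags, relevant_tags, k=3):
    """Fraction of the top-k ranked tags that human assessors judged relevant."""
    top_k = ranked_tags[:k]
    return sum(1 for tag in top_k if tag in relevant_tags) / k

def rank_by_frequency(tags, text):
    """Toy scorer: prefer tags that occur more often in the resource text."""
    return sorted(tags, key=lambda tag: text.count(tag), reverse=True)

def rank_by_first_position(tags, text):
    """Toy scorer: prefer tags that appear earlier in the resource text."""
    return sorted(tags, key=lambda tag: text.find(tag) if tag in text else len(text))

# Invented benchmark items: candidate tags, resource text, and the tags that
# assessors considered relevant for the situation/resource.
benchmark = [
    {"tags": ["museum", "art", "weather", "opening"],
     "text": "museum art exhibition opening: contemporary art at the city museum",
     "relevant": {"museum", "art", "opening"}},
    {"tags": ["bus", "timetable", "art", "strike"],
     "text": "bus timetable changes during the transport strike",
     "relevant": {"bus", "timetable", "strike"}},
]

for scorer in (rank_by_frequency, rank_by_first_position):
    scores = [precision_at_k(scorer(item["tags"], item["text"]), item["relevant"])
              for item in benchmark]
    print(f"{scorer.__name__}: mean precision@3 = {sum(scores) / len(scores):.2f}")
```

Only after candidate scorers are screened this way would the more expensive controlled IIR study and the broader user-centred evaluation follow.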
“…Evaluation from a user-oriented perspective is important in going beyond retrieval effectiveness when assessing retrieval performance, for example assessing user satisfaction with the results, usability of the interface, whether users are engaged with the system, user performance on a task, and the effects of changes in the retrieval system on user behaviour (Tague and Schultz, 1989; Saracevic, 1995; Harter and Hert, 1997; Su, 1992; Voorhees, 2002; Ingwersen and Järvelin, 2005). This requires going beyond the traditional Cranfield-style IR experiment, and various studies have been carried out from an Interactive Information Retrieval (IIR) perspective, such as those in TREC in the 1990s (Over, 2001; Kelly and Lin, 2007), along with many others (Su, 1992; Dunlop, 1996; Koenemann and Belkin, 1996; Xie, 2003; Petrelli, 2008; Hearst, 2009).…”
Section: User-oriented Evaluation (confidence: 99%)
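To make the contrast with purely system-centred effectiveness concrete, here is a minimal sketch of the per-session measures a user-oriented evaluation typically records: task completion, time on task, query reformulations, and a post-task satisfaction rating. The field names, the 1-to-5 satisfaction scale, and all values are invented for illustration, not taken from the paper.

```python
# A minimal sketch, not taken from the paper: the per-session data a
# user-oriented IIR evaluation might record alongside system-centred
# effectiveness. Field names, the 1-5 satisfaction scale, and all values
# below are invented for illustration.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    user_id: str
    task_completed: bool       # did the participant solve the search task?
    task_time_seconds: float   # time on task, taken from interaction logs
    queries_issued: int        # query reformulations as a rough behaviour signal
    satisfaction: int          # post-task questionnaire rating, 1 (low) to 5 (high)

sessions = [
    Session("u01", True, 312.0, 4, 4),
    Session("u02", False, 545.0, 9, 2),
    Session("u03", True, 287.5, 3, 5),
]

print("completion rate:      ", mean(s.task_completed for s in sessions))
print("mean time on task (s):", mean(s.task_time_seconds for s in sessions))
print("mean queries per task:", mean(s.queries_issued for s in sessions))
print("mean satisfaction:    ", mean(s.satisfaction for s in sessions))
```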
“…In the latter, the experiment takes place in the real operational/natural setting where the IR system is typically used. Each setting has known advantages, disadvantages, and important issues, which the researcher should weigh carefully when choosing between them (Tague-Sutcliffe, 1992; Robertson, 1981; Borlund and Ingwersen, 1997; Petrelli, 2008). First, the laboratory setting offers more control of independent variables that may affect the outcomes of the experiment.…”
Section: Lab-based Versus Naturalistic Settings (confidence: 99%)
“…Another area where there is quite a lot of systems-centered research, but not much user-centered research, is cross-language retrieval, although several researchers have made contributions to this area [204,211]. Cross-language retrieval is not as widespread and common as the other types of retrieval discussed in the preceding paragraphs, so it is hard to identify search tasks and contexts.…”
Section: Other Types Of (confidence: 99%)