1996
DOI: 10.1002/(sici)1097-4571(199601)47:1<95::aid-asi9>3.0.co;2-y
Evaluating retrieval performance given database and query characteristics: Analytic determination of performance surfaces

Abstract: An analytic method of information retrieval and filtering evaluation can quantitatively predict the expected number of documents examined in retrieving a relevant document. It also allows researchers and practitioners to qualitatively understand how varying different estimates of query parameter values affects retrieval performance. The incorporation of relevance feedback to increase our knowledge about the parameters of relevant documents and the robustness of parameter estimates is modeled. Single term and t…

Cited by 5 publications (3 citation statements); References 14 publications.

“…Based on the binary approach to relevant ranked information objects, Losee [101,102] developed average search length (ASL), to indicate the expected position of a relevant information object in the ranked list of information objects.…”
Section: Performance Measures
confidence: 99%
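
The average search length (ASL) described in the excerpt above can be illustrated with a minimal sketch, assuming an empirical reading of the measure as the mean rank position of the relevant items in a binary-relevance ranking; the function and the example ranking below are illustrative, not taken from the cited papers.

    def average_search_length(relevance):
        """relevance: 0/1 relevance flags in ranked order (1 = relevant)."""
        positions = [rank for rank, rel in enumerate(relevance, start=1) if rel]
        if not positions:
            raise ValueError("ranking contains no relevant items")
        return sum(positions) / len(positions)

    # Hypothetical ranking with relevant items at ranks 2, 3, and 7.
    print(average_search_length([0, 1, 1, 0, 0, 0, 1]))  # -> 4.0
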
“…The experimental performance of retrieval systems that use terms clustered into windows has been studied (Akers, 1995; Haas & He, 1993; Haas & Losee, 1994; Losee, 1994). Performance of retrieval systems using groups of terms that are statistically related may be computed analytically rather than experimentally (Losee, 1995a, 1996a).…”
Section: Windows and Phrases
confidence: 99%
“…We hope to continue the study of syntactic groupings for the automatic identification of key terms and phrases in documents, as well as to isolate those subject-bearing parts of documents for use in systems that summarize documents. Additionally, allowing us to describe more important parts of documents will allow us to focus the analytic study of retrieval and filtering system performance (Losee, 1988, 1995a, 1996a) on those parameters likely to have the greatest impact on performance.…”
Section: Implications Of Research and Summary
confidence: 99%