Proceedings of the 20th ACM International Conference on Information and Knowledge Management 2011
DOI: 10.1145/2063576.2063586
Improving retrieval accuracy of difficult queries through generalizing negative document language models

Citations: cited by 16 publications (16 citation statements)
References: 37 publications
“…Information retrieval systems still perform badly for some difficult queries [5,6,7,20]. When a query is difficult, users can reformulate the query or look for relevant articles further down the list and give positive relevance feedback (PRF) to try to improve the search results [9,15,16], or they can give negative relevance feedback (NRF), if such a possibility is available, typically on documents that are wrongly ranked highly. Previous studies on NRF have shown that it is generally not as useful as PRF [10], so most work on relevance feedback concerns PRF only.…”
Section: Related Work (mentioning)
confidence: 99%
“…Previous studies have shown, through simulations and often with artificial queries, that negative relevance feedback (NRF) on documents, in the form of binary feedback, can efficiently improve the search results when the query is difficult [15,16,19,29,30]. We continue research on NRF using real search tasks with continuous-valued feedback on document keywords.…”
Section: Introduction (mentioning)
confidence: 96%
“…the relevant documents]. In [Karimzadehgan and Zhai 2011], a generalized negative language model is developed to penalize/prune the non-relevant documents that are close to the negative language model.…”
Section: Related Work (mentioning)
confidence: 99%
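The statement above summarizes the paper's core idea: documents that are close to a negative language model, estimated from non-relevant feedback documents, should be pushed down the ranking. The Python sketch below is a minimal, hypothetical illustration of that idea only, not the method of Karimzadehgan and Zhai; the KL-divergence distance, the beta penalty weight, and the toy vocabulary are all illustrative assumptions.

import math
from collections import Counter

def language_model(texts, vocab, mu=0.01):
    """Additively smoothed unigram model over `vocab` from a list of token lists."""
    counts = Counter(token for text in texts for token in text)
    total = sum(counts[w] for w in vocab)
    return {w: (counts[w] + mu) / (total + mu * len(vocab)) for w in vocab}

def kl_divergence(p, q, eps=1e-10):
    """KL(p || q) with a small epsilon to avoid log(0)."""
    return sum(p[w] * math.log((p[w] + eps) / (q[w] + eps)) for w in p)

def rescore(doc_tokens, query_model, negative_model, vocab, beta=0.5):
    """Query-likelihood-style base score minus a penalty for closeness to the negative model.

    Base score: negative KL divergence between the query model and the document
    model (higher is better). Penalty: grows as the document model gets closer
    to the negative language model, so documents resembling the negative
    feedback documents are demoted.
    """
    doc_model = language_model([doc_tokens], vocab)
    base = -kl_divergence(query_model, doc_model)
    distance_to_negative = kl_divergence(negative_model, doc_model)
    penalty = beta / (distance_to_negative + 1e-6)
    return base - penalty

# Toy usage: one wrongly top-ranked (non-relevant) document defines the negative model.
vocab = ["jaguar", "car", "speed", "cat", "habitat"]
query_model = language_model([["jaguar", "cat", "habitat"]], vocab)
negative_model = language_model([["jaguar", "car", "speed", "car"]], vocab)

for doc in (["jaguar", "car", "speed"], ["jaguar", "cat", "habitat", "cat"]):
    print(doc, round(rescore(doc, query_model, negative_model, vocab), 3))

In the toy run, the car-related document, which resembles the negative model, drops below the wildlife-related document even though both match the ambiguous query term "jaguar".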
“…It is worth noting at this point that we could have potentially used information induced from the non-relevant documents (F(L_fuse) \ R_q) to also improve the query model, M_{q;R_q}, which is used in PoolRank. However, utilizing negative feedback to improve retrieval performance has long been known to be an extremely hard task [17,29,37], with its potential merits confined to very difficult queries [37,18].…”
Section: Estimating List Effectiveness (mentioning)
confidence: 99%