2018
DOI: 10.1145/3239575

OpenSearch

Abstract: We report on our experience with TREC OpenSearch, an online evaluation campaign that enabled researchers to evaluate their experimental retrieval methods using real users of a live website. Specifically, we focus on the task of ad hoc document retrieval within the academic search domain, and work with two search engines, CiteSeerX and SSOAR, that provide us with traffic. We describe our experimental platform, which is based on the living labs methodology, and report on the experimental …
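
In living-lab evaluations of this kind, an experimental ranking is typically compared against the site's production ranking by interleaving the two lists shown to real users and crediting clicks to whichever system contributed the clicked document. The sketch below illustrates team-draft interleaving, a standard method for such online comparisons; it is an illustrative implementation under our own naming, not the platform's actual code.

```python
import random

def team_draft_interleave(ranking_a, ranking_b):
    """Merge two rankings into one list shown to the user, remembering
    which 'team' (system A or B) contributed each document."""
    interleaved, team_of = [], {}
    seen = set()
    iters = {"A": iter(ranking_a), "B": iter(ranking_b)}
    while True:
        order = ["A", "B"]
        random.shuffle(order)  # coin flip: which team picks first this round
        progressed = False
        for team in order:
            # Each team contributes its highest-ranked document
            # that is not already in the interleaved list.
            for doc in iters[team]:
                if doc not in seen:
                    seen.add(doc)
                    interleaved.append(doc)
                    team_of[doc] = team
                    progressed = True
                    break
        if not progressed:  # both rankings exhausted
            return interleaved, team_of

def credit_clicks(team_of, clicked_docs):
    """Attribute each click to the contributing team; the team with
    more clicks wins this impression (ties are possible)."""
    a = sum(1 for d in clicked_docs if team_of.get(d) == "A")
    b = sum(1 for d in clicked_docs if team_of.get(d) == "B")
    return "A" if a > b else "B" if b > a else "tie"

if __name__ == "__main__":
    ranking, team_of = team_draft_interleave(
        ["d1", "d2", "d3"], ["d2", "d4", "d1"]
    )
    print(ranking)
    print(credit_clicks(team_of, clicked_docs=["d4"]))  # -> 'B'
```

Aggregating these per-impression outcomes over many users yields a win/loss record for the experimental system against the production baseline.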

Cited by 16 publications (10 citation statements)
References 13 publications
“…Finally, the first living lab for ad-hoc retrieval was held at CLEF in 2015 and was continued in a second iteration in 2016 [83]. The same organizers were also involved in the Open Search track at TREC in 2016 and 2017 [42]. NEWSREEL was the first living lab for real-time news recommendations and ran from 2014 until 2017 [15,41].…”
Section: Living Labs
confidence: 99%
“…The infrastructure was tailored explicitly for shared task collaborations and was the backbone of the LiLAS lab at CLEF in 2021. One of the substantial improvements over earlier living lab attempts is the possibility of submitting the entire experimental system instead of submitting pre-computed results only, which addresses the shortcoming of pre-computed results in earlier living labs [42].…”
Section: Living Labs
confidence: 99%
“…LiLAS offers two different evaluation tasks: Academic ad-hoc retrieval for the multi-lingual and multi-source Life Science search portal LIVIVO and research data recommendation within the Social Science portal GESIS Search. For both tasks, participants are invited to submit -Type A pre-computed runs based on previously compiled queries (ad-hoc search) or documents (research data recommendations) from server logs (comparable to the CLEF LL4IR or TREC Open Search labs [8]) or -Type B Docker containers of full running retrieval/recommendation systems that run within our evaluation framework called STELLA.…”
Section: Evaluation Infrastructure and Submission Types
confidence: 99%
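
To make the Type A / Type B distinction from the statement above concrete: a Type A submission is essentially a static run file computed offline for a fixed query set, while a Type B submission ships the whole system so it can answer unseen queries at evaluation time. The sketch below illustrates both styles; the run format shown for Type A is the standard TREC run format, but the HTTP route and port in the Type B stub are our own assumptions, not the actual STELLA interface.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def rank(query):
    """Placeholder ranker; a real system would query its own index here."""
    return [("doc1", 2.3), ("doc2", 1.7)]

# Type A: pre-compute results for a fixed query set and write a run file.
def write_type_a_run(queries, path="run.txt", tag="mySystem"):
    with open(path, "w") as f:
        for qid, qtext in queries.items():
            for i, (docid, score) in enumerate(rank(qtext), start=1):
                # Standard TREC run format: qid Q0 docid rank score tag
                f.write(f"{qid} Q0 {docid} {i} {score} {tag}\n")

# Type B: the entire system runs inside a container and serves
# rankings on demand (hypothetical route and port).
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        url = urlparse(self.path)
        if url.path == "/rank":
            query = parse_qs(url.query).get("q", [""])[0]
            body = json.dumps([d for d, _ in rank(query)]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    write_type_a_run({"q1": "living labs evaluation"})
    HTTPServer(("", 8080), Handler).serve_forever()
```

The practical difference is that a Type B system can be evaluated on live traffic with queries the participant never saw, which is exactly the shortcoming of pre-computed runs that the statement above describes.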