2016
DOI: 10.1186/s13643-016-0215-7
Comparing the coverage, recall, and precision of searches for 120 systematic reviews in Embase, MEDLINE, and Google Scholar: a prospective study

Abstract: Background: Previously, we reported on the low recall of Google Scholar (GS) for systematic review (SR) searching. Here, we test our conclusions further in a prospective study by comparing the coverage, recall, and precision of SR search strategies previously performed in Embase, MEDLINE, and GS. Methods: The original search results from Embase and MEDLINE and the first 1000 results of GS for librarian-mediated SR searches were recorded. Once the inclusion-exclusion process for the resulting SR was complete, search…


Cited by 133 publications (104 citation statements)
References 11 publications (10 reference statements)
“…This is also one of few studies [20, 23, 84] that have combined multiple databases using cumulative analysis, thereby accepting what researchers have urged in the past, that searching one database is not enough but investigating what a combined search would yield.…”
Section: Discussion
confidence: 99%
“…The Bio-TDS query-processing module was evaluated and compared to similar systems using the retrieval rate, precision, recall and F-measure (35). The evaluation was based on a 229 user-query test set relating to 25 analytic tools available in each repository.…”
Section: Results
confidence: 99%
“…This query submitted to Bio-TDS returns 2,495 results with BISMARK ranked eighth. For each tool, we calculate the average of the retrieval rate (RR), Precision (P); recall (R) and F-Measure (F) among the queries (35). The average was then used to calculate the mean average of each repository, which led to the MRR (Mean Retrieval Rate), MAP (Mean Average Precision), MAR (Mean Average Recall) and Mean Average F-Measure (Table 1).…”
Section: Methods
confidence: 99%
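The per-query averaging described in this excerpt (precision, recall, and F-measure computed per query, then averaged into MAP, MAR, and a mean F-measure for each repository) can be sketched as follows. This is a minimal illustration of the standard metrics, not code from the Bio-TDS system; the sample queries are hypothetical.

```python
def precision_recall_f(retrieved, relevant):
    """Return (precision, recall, F-measure) for a single query."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    p = hits / len(retrieved) if retrieved else 0.0
    r = hits / len(relevant) if relevant else 0.0
    f = 2 * p * r / (p + r) if (p + r) else 0.0
    return p, r, f

def mean_averages(queries):
    """queries: list of (retrieved, relevant) pairs for one repository.
    Returns (MAP, MAR, mean F-measure) averaged over the query set."""
    scores = [precision_recall_f(ret, rel) for ret, rel in queries]
    n = len(scores)
    return tuple(sum(s[i] for s in scores) / n for i in range(3))

# Hypothetical two-query evaluation for one repository:
queries = [({"tool_a", "tool_b"}, {"tool_a", "tool_c"}),  # 1 hit of 2/2
           ({"tool_x"}, {"tool_x"})]                      # perfect match
print(mean_averages(queries))  # → (0.75, 0.75, 0.75)
```

Averaging the per-query F-measures (rather than computing F from MAP and MAR) matches the "average among the queries, then mean average per repository" procedure the excerpt describes.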
“…This step is important because recall, defined as the number of studies that are actually identified through search strategies, is usually below the total possible coverage. One study suggests that e.g., in Medline only about three quarters of listed studies could actually be identified by search strategies [20]. Or, if available, we will use the archived search results provided by Cochrane information specialists, who conducted the search for the Cochrane review.…”
Section: Methods
confidence: 99%
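The distinction this excerpt draws between coverage (studies listed in the database) and recall (studies a search strategy actually retrieves) reduces to a simple ratio. The numbers below are hypothetical, chosen only to illustrate the cited "about three quarters" finding for MEDLINE.

```python
# Hypothetical counts for one review's search in one database:
covered = 100     # relevant studies indexed ("listed") in the database
identified = 75   # of those, studies the search strategy retrieved

recall_vs_coverage = identified / covered
print(recall_vs_coverage)  # → 0.75, i.e. about three quarters
```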