There is limited guidance on how to web‐search in systematic reviews, and concern has been raised about the reproducibility of searches using search engines such as Google. The aim of this paper is to address one potential source of variation in Google searches: does the geographical location of a researcher affect Google search returns? Using a virtual private network, we ran the same web‐search for the medical technology Dasatinib in 12 different countries. Two researchers independently extracted the search returns by country, organised by page rank. We compared: C1. any difference in the items returned by Google searches between countries and C2. any difference in the page rank of items returned between countries. Searches were undertaken on Monday September 28th 2020. From 12 countries, 43 items were identified. For C1: 19 items were common to all 12 countries. Twenty‐four items were missed by searches in some countries. This means that there were differences in search returns between countries. For C2: a randomised trial reported by Raddich et al was the first search return for all countries. All other items common to all countries varied in their page rank. Based on this case study, geographic location appears to influence Google search returns. The findings suggest that recording the location of the researcher undertaking web‐searching may now be an important factor to report, alongside detail on steps taken to minimise personalisation of web‐searches covered by recent guidance. This finding also has implications for stopping‐rules.
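The two comparisons described in the abstract (C1: items common to, or missed by, searches in some countries; C2: variation in page rank of common items) can be sketched in code. This is an illustrative sketch only, not the authors' method: the country codes and item names below are hypothetical examples.

```python
# Hypothetical search returns: country -> items ordered by page rank (rank 1 first).
returns_by_country = {
    "UK": ["radich-trial", "nice-appraisal", "manufacturer-site"],
    "US": ["radich-trial", "manufacturer-site", "fda-label"],
    "DE": ["radich-trial", "manufacturer-site", "nice-appraisal"],
}

# C1: which items are returned in every country, and which are missed somewhere?
all_items = set().union(*returns_by_country.values())
common = set.intersection(*(set(v) for v in returns_by_country.values()))
missed_somewhere = all_items - common

# C2: for an item common to all countries, does its page rank vary by country?
def ranks(item):
    """Page rank (1-based) of an item in each country's search returns."""
    return {c: items.index(item) + 1 for c, items in returns_by_country.items()}
```

With this toy data, `ranks("radich-trial")` is 1 in every country (mirroring the trial that was the first return everywhere), while `ranks("manufacturer-site")` differs between countries, and `missed_somewhere` is non-empty, illustrating both findings.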
Clinical trials registers form an important part of the search for studies in systematic reviews of intervention effectiveness, but the search interfaces and functionality of registers can be challenging to search systematically and resource intensive to search well. We report a technical review of the search interfaces of three leading trials register resources: http://clinicaltrials.gov, the EU Clinical Trials Register and the WHO International Clinical Trials Registers Platform. The technical review used a validated checklist to identify areas where the search interfaces of these trials register resources performed well, where performance was adequate, where performance was poor, and to identify differences between search interfaces. The review found low overall scores for each of the interfaces (http://clinicaltrials.gov 55/165, the EU Clinical Trials Register 25/165, the WHO International Clinical Trials Registers Platform 32/165). This finding suggests a need for joined‐up dialogue between the producers of the registers and the researchers who search them via these interfaces. We also set out four proposed changes that might improve the search interfaces. Trials registers are an invaluable resource in systematic reviews of intervention effectiveness. With the continued growth in systematic reviews, and initiatives such as ‘AllTrials’, demand for these resources is anticipated to grow. We conclude that small changes to the search interfaces, and improved dialogue with providers, might improve the future search functionality of these valuable resources.
Our previous work identified that nine leading guidance documents for seven different types of systematic review advocated the same process of literature searching. We defined and illustrated this process and named it ‘the Conventional Approach’. The Conventional Approach appears to meet the needs of researchers undertaking literature searches for systematic reviews of clinical interventions. In this article, we report a new and alternative process model of literature searching called ‘A Tailored Approach’. A Tailored Approach is indicated as a search process for complex reviews that do not focus on the evaluation of clinical interventions. The aims of this article are to (1) explain the rationale for, and the theories behind, the design of A Tailored Approach; (2) report the current conceptual illustration of A Tailored Approach and describe a user’s interaction with the process model; and (3) situate the elements novel to A Tailored Approach (when compared with the Conventional Approach) in the relevant literature. A Tailored Approach suggests investing time at the start of a review to develop the information needs from the research objectives, and to tailor the search approach to studies or data. A Tailored Approach should be led by the information specialist (librarian) but developed by the research team. The aim is not necessarily to focus on comprehensive retrieval. Further research is indicated to evaluate the use of supplementary search methods, methods of team-working to define search approaches, and the use of conceptual models of information retrieval for testing and evaluation.
Objective: To undertake a technical review of the search interface of the ISPOR Presentations Database. By technical review, we mean an evaluation of the technical aspects of the search interface and functionality, which a user must navigate to complete a search. Methods: A validated checklist (Bethel and Rogers, 2014, Health Info Libr J, 31, 43-53) was used to identify where the interface performed well, where the interface was adequate, where the interface performed poorly, where functionality available in core biomedical bibliographic databases does not exist in the ISPOR database, and to establish a list of any issues arising during the review. Two researchers independently undertook the technical review in October 2021. Results: The ISPOR database scored 35 of a possible 165 (27/111 essential criteria and 8/54 desirable criteria). Two issues arising were identified, both of which will cause searchers to miss potentially eligible abstracts: (i) that search terms which include * or ? as truncation or wildcard symbols should not be capitalized (e.g., cost* not Cost*; organi?ation not Organi?ation) and (ii) that quotation marks should be straight sided in phrase searching (e.g., "cost analyses" not “cost analyses”). Conclusions: The ISPOR database is a promising and free database to identify abstracts/posters presented at ISPOR. We summarize two key issues arising, and we set out proposed changes to the search interface, including: adding the ability to export abstracts to a bibliographic tool, exporting search strategies, adding a researcher account, and updating the help guide. All suggestions will further improve this helpful database.
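The two issues the review identifies (capitalized wildcard terms and curly "smart" quotes) are both amenable to a simple pre-submission cleanup. The sketch below is illustrative only and not part of the paper; the function name is hypothetical.

```python
def normalise_ispor_query(query: str) -> str:
    """Clean a query string before submitting it to a search interface
    that mishandles capitalized wildcard terms and curly quotes.
    Illustrative sketch of the two issues described in the review."""
    # Issue (ii): curly quotes break phrase searching; use straight quotes.
    query = query.replace("\u201c", '"').replace("\u201d", '"')
    # Issue (i): terms containing * or ? must not be capitalized.
    terms = [
        t.lower() if ("*" in t or "?" in t) else t
        for t in query.split()
    ]
    return " ".join(terms)
```

For example, `normalise_ispor_query('Cost* \u201ccost analyses\u201d')` yields `'cost* "cost analyses"'`, matching the corrected forms given in the Results.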