While a high risk of failure is an inherent part of developing innovative therapies, it can be reduced by adherence to rigorous, evidence-based research practices. Numerous analyses conducted to date have clearly identified measures that need to be taken to improve research rigor. Supported through the European Union's Innovative Medicines Initiative, the EQIPD consortium has developed a novel preclinical research quality system that can be applied in both public and private sectors and is free for anyone to use. The EQIPD Quality System was designed to boost innovation by ensuring the generation of robust and reliable preclinical data, while remaining lean and effective and not becoming a burden that could negatively impact the freedom to explore scientific questions. EQIPD defines research quality as the extent to which research data are fit for their intended use. Fitness, in this context, is defined by the stakeholders, who include not only the scientists directly involved in the research but also their funders, sponsors, publishers, research tool manufacturers and collaboration partners such as peers in a multi-site research project. The essence of the EQIPD Quality System is a set of 18 core requirements that can be addressed flexibly, according to user-specific needs and following a user-defined trajectory. The EQIPD Quality System proposes guidance on expectations for quality-related measures, defines criteria for adequate processes (i.e., performance standards) and provides examples of how such measures can be developed and implemented. However, it does not prescribe any pre-determined solutions. EQIPD has also developed tools (for optional use) to support users in implementing the system, as well as assessment services for research units that successfully implement the quality system and seek formal accreditation.
Building upon the feedback from users and continuous improvement, a sustainable EQIPD Quality System will ultimately serve the entire community of scientists conducting non-regulated preclinical research, by helping them generate reliable data that are fit for their intended use.
Pre-clinical models of disease have long played important roles in the advancement of new treatments. However, in traumatic brain injury (TBI), despite the availability of numerous model systems, translation from bench to bedside remains elusive. Integrating clinical relevance into pre-clinical model development is a critical step toward advancing therapies for TBI patients across the spectrum of injury severity. Pre-clinical models include in vivo and ex vivo work in animals (both small and large) as well as in vitro modeling. The wide range of pre-clinical models reflects substantial attempts to replicate multiple aspects of TBI sequelae in humans. Although these models reveal multiple putative mechanisms underlying TBI pathophysiology, failures to translate these findings into successful clinical trials call into question the clinical relevance and applicability of the models. Here, we address the promises and pitfalls of pre-clinical models with the goal of evolving frameworks that will advance translational TBI research across models, injury types, and the heterogeneous etiology of pathology.
Laboratory workflows and preclinical models have become increasingly diverse and complex. Confronted with a multitude of information of ambiguous relevance to their specific experiments, scientists run the risk of overlooking critical factors that can influence the planning, conduct and results of studies and that should have been considered a priori. To address this problem, we developed “PEERS” (Platform for the Exchange of Experimental Research Standards), an open-access online platform built to aid scientists in determining which experimental factors and variables are most likely to affect the outcome of a specific test, model or assay and therefore ought to be considered during the design, execution and reporting stages. The PEERS database is categorized into in vivo and in vitro experiments and provides lists of factors derived from the scientific literature that have been deemed critical for experimentation. The platform is based on a structured and transparent system for rating the strength of evidence related to each identified factor and its relevance for a specific method/model. In this context, the rating procedure will not be limited solely to the PEERS working group but will also allow for community-based grading of evidence. Here we describe a working prototype using the Open Field paradigm in rodents and present the selection of factors specific to each experimental setup and the rating system. PEERS not only offers users the possibility to search for information to facilitate experimental rigor, but also draws on the engagement of the scientific community to actively expand the information contained within the platform.
Collectively, by helping scientists search for specific factors relevant to their experiments and share experimental knowledge in a standardized manner, PEERS will serve as a collaborative exchange and analysis tool to enhance data validity and robustness as well as the reproducibility of preclinical research. PEERS offers a vetted, independent tool by which to judge the quality of information available on a certain test or model, identifies knowledge gaps and provides guidance on the key methodological considerations that should be prioritized to ensure that preclinical research is conducted to the highest standards and best practice.
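To make the rating idea concrete, the sketch below models a PEERS-style factor record in which literature-derived grades and community grades are pooled into a consensus strength. The abstract does not publish the actual PEERS schema or grading scale, so the class name, fields, and 1-to-5 scale here are illustrative assumptions, not the platform's real data model.

```python
from dataclasses import dataclass, field
from statistics import median

@dataclass
class FactorRating:
    """Hypothetical record for one experimental factor in one method/model.
    Field names and the 1 (weak) .. 5 (strong) scale are assumptions, not
    the actual PEERS schema."""
    factor: str                     # e.g. "ambient light level"
    method: str                     # e.g. "Open Field (rodent)"
    literature_grades: list[int] = field(default_factory=list)
    community_grades: list[int] = field(default_factory=list)

    def consensus(self) -> float:
        """Pool all grades and take the median, which resists outlier votes."""
        grades = self.literature_grades + self.community_grades
        if not grades:
            raise ValueError("no evidence grades recorded for this factor")
        return median(grades)

# Example: two literature-based grades plus three community grades.
rating = FactorRating(
    factor="ambient light level",
    method="Open Field (rodent)",
    literature_grades=[4, 5],
    community_grades=[3, 4, 5],
)
print(rating.consensus())  # 4
```

A median (rather than a mean) is chosen here only to illustrate that community grading needs some aggregation rule that is robust to individual extreme votes; the platform's actual procedure may differ.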