Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses is extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions.
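The abstract's central claim — that ignoring nonindependence distorts meta-analytic uncertainty — can be illustrated numerically. Below is a minimal Python sketch using simulated data (all values are hypothetical, not from any real meta-analytic dataset). It compares the pooled standard error when multiple correlated effect sizes per study are naively treated as independent versus when they are first aggregated to the study level:

```python
import numpy as np

rng = np.random.default_rng(1)
n_studies, es_per_study = 20, 5
v = 0.04  # assumed sampling variance of each effect size

# each study contributes 5 effect sizes sharing a common study-level effect,
# so effect sizes within a study are correlated
study_effect = rng.normal(0.3, 0.2, n_studies)  # between-study heterogeneity
y = (np.repeat(study_effect, es_per_study)
     + rng.normal(0.0, np.sqrt(v), n_studies * es_per_study))

# naive approach: treat all 100 effect sizes as independent
naive_se = np.sqrt(v / len(y))

# study-level aggregation: average within studies first, then pool across studies
study_means = y.reshape(n_studies, es_per_study).mean(axis=1)
agg_se = study_means.std(ddof=1) / np.sqrt(n_studies)

print(f"naive SE: {naive_se:.3f}, aggregated SE: {agg_se:.3f}")
```

The naive standard error is misleadingly small because it counts each correlated effect size as an independent piece of evidence, which is exactly the kind of procedural decision the authors recommend probing with sensitivity analyses.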
Since the early 1990s, ecologists and evolutionary biologists have aggregated primary research using meta-analytic methods to understand ecological and evolutionary phenomena. Meta-analyses can resolve long-standing disputes, dispel spurious claims, and generate new research questions. At their worst, however, meta-analysis publications are wolves in sheep's clothing: subjective with biased conclusions, hidden under coats of objective authority. Conclusions can be rendered unreliable by inappropriate statistical methods, problems with the methods used to select primary research, or problems within the primary research itself. Because of these risks, meta-analyses are increasingly conducted as part of systematic reviews, which use structured, transparent, and reproducible methods to collate and summarise evidence. For readers to determine whether the conclusions from a systematic review or meta-analysis should be trusted, and to be able to build upon the review, authors need to report what they did, why they did it, and what they found. Complete, transparent, and reproducible reporting is measured by 'reporting quality'. To assess perceptions and standards of reporting quality of systematic reviews and meta-analyses published in ecology and evolutionary biology, we surveyed 208 researchers with relevant experience (as authors, reviewers, or editors), and conducted detailed evaluations of 102 systematic review and meta-analysis papers published between 2010 and 2019. Reporting quality was far below optimal and approximately normally distributed. Measured reporting quality was lower than what the community perceived, particularly for the systematic review methods required to measure trustworthiness. The minority of assessed papers that referenced a guideline (16%) showed substantially higher reporting quality than average, and surveyed researchers showed interest in using a reporting guideline to improve reporting quality.
The leading guideline for improving reporting quality of systematic reviews is the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement. Here we unveil an extension of PRISMA to serve the meta-analysis community in ecology and evolutionary biology: PRISMA-EcoEvo (version 1.0). PRISMA-EcoEvo is a checklist of 27 main items that, when applicable, should be reported in systematic review and meta-analysis publications summarising primary research in ecology and evolutionary biology. In this explanation and elaboration document, we provide guidance for authors, reviewers, and editors, with explanations for each item on the checklist, including supplementary examples from published papers. Authors can
1. Publication bias threatens the validity of quantitative evidence from meta-analyses as it results in some findings being overrepresented in meta-analytic datasets because they are published more frequently or sooner (e.g., 'positive' results). Unfortunately, methods to test for the presence of publication bias, or assess its impact on meta-analytic results, are unsuitable for datasets with high heterogeneity and non-independence, as is common in ecology and evolutionary biology.
2. We first review both classic and emerging publication bias tests (e.g., funnel plots, Egger's regression, cumulative meta-analysis, fail-safe N, trim-and-fill tests, p-curve and selection models), showing that some tests cannot handle heterogeneity, and, more importantly, none of the methods can deal with non-independence. For each method we estimate current usage in ecology and evolutionary biology, based on a representative sample of 102 meta-analyses published in the last ten years.
3. Then, we propose a new method using multilevel meta-regression, which can model both heterogeneity and non-independence, by extending existing regression-based methods (i.e. Egger's regression). We describe how our multilevel meta-regression can test not only publication bias, but also time-lag bias, and how it can be supplemented by residual funnel plots.
4. Overall, we provide ecologists and evolutionary biologists with practical recommendations on which methods are appropriate to employ given independent and non-independent effect sizes. No method is ideal, and more simulation studies are required to understand how Type 1 and 2 error rates are impacted by complex data structures. Still, limitations of these methods do not justify ignoring publication bias in ecological and evolutionary meta-analyses.
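To make the regression-based approach concrete, here is a hedged Python sketch of classic Egger's regression on simulated effect sizes (this is the simple single-level test the abstract builds on, not the multilevel extension the authors propose, which requires a mixed-model framework; all data below are invented for illustration). The standardized effect size is regressed on precision; a non-zero intercept signals funnel asymmetry consistent with publication bias:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 50
se = rng.uniform(0.05, 0.5, n)            # per-study standard errors

# simulate publication bias: observed effects are inflated in proportion
# to their SE (small, imprecise studies report larger effects)
y = 0.2 + 2.0 * se + rng.normal(0.0, se)  # true underlying effect = 0.2

# classic Egger's test: regress standardized effect (y/se) on precision (1/se);
# the intercept estimates funnel asymmetry, the slope the underlying effect
result = stats.linregress(1.0 / se, y / se)
print(f"asymmetry (intercept): {result.intercept:.2f}, "
      f"effect (slope): {result.slope:.2f}")
```

Because the simulated bias inflates effects in proportion to their standard error, the fitted intercept comes out well above zero while the slope recovers the true effect of about 0.2. Under heterogeneity and non-independent effect sizes, this simple version misbehaves — which is the gap the multilevel meta-regression described above is designed to fill.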
Rapid environmental change is predicted to compromise population survival, and the resulting strong selective pressure can erode genetic variation, making evolutionary rescue unlikely. Non-genetic inheritance may provide a solution to this problem and help explain the current lack of fit between purely genetic evolutionary models and empirical data. We hypothesize that epigenetic modifications can facilitate evolutionary rescue through ‘epigenetic buffering’. By facilitating the inheritance of novel phenotypic variants that are generated by environmental change—a strategy we call ‘heritable bet hedging’—epigenetic modifications could maintain and increase the evolutionary potential of a population. This process may facilitate genetic adaptation by preserving existing genetic variation, releasing cryptic genetic variation and/or facilitating mutations in functional loci. Although we show that examples of non-genetic inheritance are often maladaptive in the short term, accounting for phenotypic variance and non-adaptive plasticity may reveal important evolutionary implications over longer time scales. We also discuss the possibility that maladaptive epigenetic responses may be due to ‘epigenetic traps’, whereby evolutionarily novel factors (e.g. endocrine disruptors) hack into the existing epigenetic machinery. We stress that more ecologically relevant work on transgenerational epigenetic inheritance is required. Researchers conducting studies on transgenerational environmental effects should report measures of phenotypic variance, so that the possibility of both bet hedging and heritable bet hedging can be assessed. Future empirical and theoretical work is required to assess the relative importance of genetic and epigenetic variation, and their interaction, for evolutionary rescue.