2016
DOI: 10.1126/science.aad9163
Response to Comment on “Estimating the reproducibility of psychological science”

Abstract: Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively interpreted, correlational data. Using the Reproducibility Project: Psychology data, both optimistic and pessimistic conclusions about reproducibility are possible, and neither are yet warranted. Across multiple indicators of …

Cited by 164 publications (99 citation statements); references 9 publications.
“…However, some people have speculated that the original authors preregistered concern only because they were aware that their studies were relatively weak based on other factors affecting replication (e.g., small effect sizes, underpowered designs) (59). The OSC authors also argued that authors who were less confident of their study's robustness may have been less likely to endorse the replications (47).…”
Section: Results (confidence: 99%)
“…For instance, a recent critique of the Reproducibility Project alleged that several replication studies differed significantly from the original studies, undercutting any inferences about lack of reproducibility in psychology (56). The allegation that low-fidelity replication attempts undercut the validity of the Reproducibility Project launched a debate about the role of contextual factors in several replication failures, both in print (47) and in subsequent online commentaries (e.g., refs. 57-59).…”
Confidence: 99%
“…This was also noted by both the original RPP team (Open Science Collaboration, 2015; Anderson, 2016) and in a critique of the RPP (Gilbert, King, Pettigrew, & Wilson, 2016). Replication efforts such as the RPP or the Many Labs project remove publication bias and result in a less biased assessment of the true effect size.…”
Section: Discussion (confidence: 98%)
“…Although these studies suggest substantial evidence of false positives in these fields, replications show considerable variability in resulting effect size estimates (Klein et al., 2014; Stanley & Spence, 2014). Therefore, caution is warranted when wishing to draw conclusions on the presence of an effect in individual studies (original or replication; Open Science Collaboration, 2015; Gilbert, King, Pettigrew, & Wilson, 2016; Anderson et al., 2016).…”
Confidence: 99%