2019
DOI: 10.1007/s11229-019-02456-7

Flexible yet fair: blinding analyses in experimental psychology

Abstract: The replicability of findings in experimental psychology can be improved by distinguishing sharply between hypothesis-generating research and hypothesis-testing research. This distinction can be achieved by preregistration, a method that has recently attracted widespread attention. Although preregistration is fair in the sense that it inoculates researchers against hindsight bias and confirmation bias, preregistration does not allow researchers to analyze the data flexibly without the analysis being demoted to…

Cited by 32 publications (34 citation statements)

References 79 publications
“…The second new standard is whether the journal offers open science badges (i.e., optional visible symbols added to articles that meet some criterion, such as posting data or materials online; see Kidwell et al 2016, Rowhani-Farid & Barnett 2018). [Footnote 10:] Some have also warned about author bias and proposed methods of blind data analysis adopted in the hard sciences for countering it (MacCoun & Perlmutter 2015). These methods can be especially useful when replication is not feasible (MacCoun 2018) or when researchers analyze data in ways that go beyond pre-registered plans (Dutilh et al 2019).…”
Section: Responses To Irreplicability
confidence: 99%
“…Romero and Sprenger conclude that the choice of the statistical framework plays an important role in increasing the reliability of published research and favour the Bayesian approach over the frequentist paradigm. Dutilh et al (2021) consider the implications of preregistration of statistical analyses in the face of the replication crisis in the sciences. They argue that while preregistration is a powerful and increasingly popular method to raise the reliability of empirical results, it imposes an unwelcome lack of flexibility in statistical analysis.…”
Section: Formal Epistemology and The Replication Crisis
confidence: 99%
“…Our call for expression of interest received a great variety of promised manuscripts, a variety that is reflected in the published papers. They range from fields far away from our own interests, such as quantum probabilities (de Ronde et al 2021) and evolvable software systems (Primiero et al 2021), to topics closer to our own research in the philosophy of medicine (Lalumera et al 2020), psychology (Dutilh et al 2021), and traditional epistemology (Dunn 2021; Tolly 2021), to finally closely shared interests in formal epistemology (Romero and Sprenger 2021), even within our own department (Merdes et al 2021).…”
confidence: 99%
“…Similarly, it is possible to blind yourself to real effects in the data by having someone relabel the treatment levels so that you can no longer link the relabeled data to the original treatment levels. These and other methods of data blinding are clearly described by Dutilh, Sarafoglou, and Wagenmakers (2019). Q14: What do you know about missing data in the dataset (i.e., overall missingness rate, information about differential dropout)?…”
Section: Number Of Siblings (Covariate)
confidence: 99%
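The relabeling approach described in the quotation above can be sketched in Python. This is a minimal illustration only: the function name, the column names, and the toy dataset are all hypothetical, not the procedure from Dutilh, Sarafoglou, and Wagenmakers (2019).

```python
import random

def blind_labels(records, key_holder_seed):
    """Replace real treatment labels with arbitrary codes.

    A colleague (the 'key holder') keeps the seed and the returned
    mapping secret, so the true labels can be restored only after
    the analysis plan is finalized.
    """
    levels = sorted({r["treatment"] for r in records})
    rng = random.Random(key_holder_seed)
    codes = [f"group_{i}" for i in range(len(levels))]
    rng.shuffle(codes)  # arbitrary assignment of codes to levels
    mapping = dict(zip(levels, codes))
    blinded = [{**r, "treatment": mapping[r["treatment"]]} for r in records]
    return blinded, mapping

# Hypothetical toy dataset: two observations, two treatment levels.
data = [{"treatment": "drug", "score": 5},
        {"treatment": "placebo", "score": 3}]
blinded, key = blind_labels(data, key_holder_seed=42)
```

The analyst works only with `blinded`; the key holder retains `key` to unblind the results once the analysis is fixed.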
“…If you provide an estimate, try to be conservative and pick the lowest sample size of the possible options. If it is impossible to provide an estimate, you could also mask the data (see Dutilh, Sarafoglou, & Wagenmakers, 2019). Please also provide a new expected sample size. Note that this will be the definitive expected sample size for your study and you will use this number to do any power analyses.…”
Section: Number Of Siblings (Covariate)
confidence: 99%