2019
DOI: 10.1177/0149206319843985
Play It Again, Sam! An Analysis of Constructive Replication in the Organizational Sciences

Abstract: We wish to express our gratitude to Fred Oswald and two anonymous reviewers for their insightful comments. Work on the article was supported by the University of Melbourne's Faculty of Business and Economics.

Cited by 137 publications (201 citation statements)
References 97 publications
“…After an initial coding attempt with ill-defined coding criteria had led to too many disagreements (described further in the Appendix), we developed the following strategy (not pre-registered): We searched the full texts of all papers for the string 'replic*' (cf. Makel, Plucker, & Hegarty, 2012; Köhler & Cortina, 2019; Mueller-Langer, Fecher, Harhoff, & Wagner, 2019; Pridemore, Makel, & Plucker, 2018) and, for papers that did contain it, determined whether the coded hypothesis was a close replication with the goal to verify a previously published result. Conceptual replications and internal replications (replication of a study in the same paper) were not counted as replications in this narrow sense, since both are more likely to be motivated by the goal to build on previous work than by scepticism.…”
Section: Measures and Coding Procedures (mentioning)
confidence: 99%
“…One of them is how to define the universe of all possible operationalizations of a concept (classical operationalism limits the meaning of a concept to established operations), which is actually a problem more intractable than it first appears to be. For example, it might not be ideal to include a measure that is known for its poor psychometric qualities in that universe just because of its connection to the concept (Köhler & Cortina, 2021). Or we can always (and often do) imagine that future researchers will come up with a much better, previously unthought of measure of a concept that would clearly win out over its existing alternatives (you may think of Popper's black swan in terms of measurement).…”
Section: On Random Effects (mentioning)
confidence: 99%
“…Don Hambrick, one of the pioneers of UET, recently argued that these conditions present unique opportunities for researchers: to advise leaders on how to confront a shifting environment and to leverage our existing UET research platform to take on broader questions (Hambrick, 2019). Yet fundamental issues are also now being raised regarding the transparency, replicability, and applicability of social science research, including management research (e.g., Köhler & Cortina, in press; Rynes, Colbert, & O’Boyle, 2018). So while the relevance of the UET perspective may be increasing, so are the challenges of executing sound studies—underscoring the importance of addressing the metacritiques we examine in this review.…”
(mentioning)
confidence: 99%