2016
DOI: 10.1002/ev.20215
Editor's Notes: Social Experiments in Practice: Introduction, Framing, and Context

Cited by 3 publications (5 citation statements). References 15 publications.
“…The criticism that social experiments cannot "look inside the black box" to determine which elements of an intervention generate its success ignores design and analysis alternatives capable of revealing which components of multi-faceted interventions lead to their success. For further elaboration of "black box" opening designs and analyses, see the recent special section on this topic, published in the American Journal of Evaluation, volume 36, issue 4 (2015), and large portions of issue 152 of New Directions for Evaluation on Social Experiments in Practice (Peck, 2016b).…”
Section: Concern #11: Experiments Do Not Inform Questions of Program ... (mentioning; confidence: 99%)
“…Randomized controlled trials (RCTs) have assumed a prominent role in social science research over the past two decades, including in education (Spybrook, 2013), and their use and applications are increasing (Peck and Goldstein, 2016). This is especially true in the United States, where federal funding of educational research programs is often contingent on building in a summative program evaluation that assesses program impact using methods that limit the potential for selection bias in results.…”
Section: Introduction (mentioning; confidence: 99%)
“…Social experiments are used widely to address questions of program efficacy. The Randomized Experiment eJournal counts 2,000+ entries since its inception in 2007 (Peck and Goldstein, 2016). In the context of educational research in the United States, between 2002 and 2013, the Institute of Education Sciences funded more than 100 cluster randomized trials (Spybrook, 2013).…”
Section: Introduction (mentioning; confidence: 99%)
“…Particularly, the use of randomized controlled trials (RCTs) comes with many challenges, including: (a) ethical issues, such as the need to serve all population members at risk, which prevents the use of a counterfactual in a program's evaluation; (b) timing issues, such as the urgency to act quickly and flexibly to deliver, and measure the impact of, an intervention in response to an immediate risk or an identified time-sensitive programmatic need, leaving insufficient time to plan and execute an RCT or other rigorous design; (c) constraints on the financial and/or technical resources needed to implement the often resource-intensive RCT; and (d) generalizability issues, such as implementing a rigorous impact evaluation whose results cannot be extended beyond the experimental study context. Issues such as these have led to calls across the education intervention, human services programming, and international development literatures to continue examining the suitability of alternative research designs for investigating the impact of interventions (e.g., Deterding & Solmeyer; Peck & Goldstein; Wynn, Dutta, & Nelson).…”
(mentioning; confidence: 99%)