2017
DOI: 10.1177/0741932517697447
Publishing Single-Case Research Design Studies That Do Not Demonstrate Experimental Control

Abstract: Demonstration of experimental control is considered a hallmark of high-quality single-case research design (SCRD). Studies that fail to demonstrate experimental control may not be published because researchers are unwilling to submit these papers for publication and journals are unlikely to publish negative results (i.e., the file drawer effect). SCRD studies comprise a large proportion of intervention research in special education. Consequently, the existing body of research, comprised mainly of studies that …

Cited by 69 publications (84 citation statements)
References 52 publications (84 reference statements)
“…To illustrate this, we found ages varied across studies and, also, while some participants had a sole diagnosis of ASD, others had concomitant diagnoses (e.g., global developmental delay; Gould et al., 2018). To address this limitation, future meta-analyses of RIRD could include procedures for evaluating the differential effects of RIRD according to participant characteristics (e.g., Ganz et al., 2012), or other potential boundary conditions, such as dosage, differing intervention agents, and implementation fidelity (Tincani & Travers, 2018). One promising technique to evaluate the effects of moderating variables or boundaries of treatment for meta-analyses is to employ multilevel models (Becraft et al., in press; Moeyaert et al., 2018).…”
Section: Figure (mentioning)
confidence: 99%
“…While publication bias is believed to result from researchers' preference for studies that yield positive findings, there are likely other variables that moderate the degree of bias, including whether the study's findings are interesting or novel, are from a large or well-funded study, and whether the study is of high methodological quality (Sutton et al., 2000). This latter concern is potentially problematic for the SCD research community, which has regarded experimental control accomplished through unambiguous and strong treatment effects as an intrinsic feature of high-quality research (Cooper et al., 2020; Kilgus et al., 2016; Tincani & Travers, 2018). For example, in their influential paper on quality standards for SCD, Horner et al. (2005) asserted that three demonstrations of experimental control were an essential feature of a high-quality experiment.…”
(mentioning)
confidence: 99%
“…One way to synthesise results across a number of designs is to report a success rate (e.g., percentage of designs in which a functional relation is demonstrated divided by a total number of designs). Of course, this metric has similar problems as other synthesis methods, including meta-analysis—notably that inclusion of only peer-reviewed articles may increase the likelihood of over-estimating success (e.g., non-effects are less likely to be published; Shadish, Zelinsky, Vevea, & Kratochwill, 2016; Tincani & Travers, 2017). To minimise this risk, include grey literature in systematic syntheses (Ledford, Lane, & Tate, 2018).…”
Section: Synthesising Results Within and Across Studies (mentioning)
confidence: 99%
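The success-rate metric described in the statement above is simple arithmetic: the number of designs demonstrating a functional relation divided by the total number of designs. A minimal sketch in Python, using made-up illustrative data (not drawn from any study cited here):

```python
def success_rate(outcomes):
    """Return the fraction of single-case designs demonstrating a
    functional relation.

    outcomes: list of booleans, one per design
              (True = functional relation demonstrated).
    """
    if not outcomes:
        raise ValueError("no designs to synthesise")
    return sum(outcomes) / len(outcomes)

# Hypothetical synthesis of 8 designs, 6 of which demonstrated a relation.
designs = [True, True, False, True, True, True, False, True]
print(f"Success rate: {success_rate(designs):.0%}")  # Success rate: 75%
```

Note that if only published (peer-reviewed) designs enter the `outcomes` list, the computed rate will over-estimate true success whenever null results sit unpublished in the file drawer, which is exactly the concern the quoted passage raises.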
“…The reason for adopting this method is that it is easy to administer and the data can be easily collated. This method was utilised by Haseeb et al. (2019), McKenney and Reeves (2018), and Tincani and Travers (2018). The unit of analysis and sample of the study are the top and functional managers of flour milling companies in Lagos State, Nigeria.…”
Section: Methods (mentioning)
confidence: 99%