Crossover Designs in Software Engineering Experiments: Benefits and Perils
2016
DOI: 10.1109/tse.2015.2467378

Abstract: In experiments with a crossover design, subjects apply more than one treatment. Crossover designs are widespread in software engineering experimentation: they require fewer subjects and control the variability among subjects. However, some researchers disapprove of crossover designs. The main criticisms are the carryover threat and its troublesome analysis. Carryover is the persistence of the effect of one treatment when another treatment is applied later. It may invalidate the results of an experiment. Additiona…
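For context (this is not part of the paper's abstract), a common way to analyse a simple AB/BA crossover experiment is a linear mixed-effects model with fixed effects for treatment, period, and sequence, plus a random intercept per subject; the sequence term is the usual proxy check for carryover. The sketch below, using hypothetical column names and toy data, shows what such an analysis might look like in Python with statsmodels.

```python
# Minimal sketch (not the paper's method): analysing an AB/BA crossover
# experiment with a linear mixed-effects model. Column names and data
# are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format toy data: one row per subject per period.
data = pd.DataFrame({
    "subject":   [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "sequence":  ["AB", "AB", "BA", "BA", "AB", "AB",
                  "BA", "BA", "AB", "AB", "BA", "BA"],
    "period":    [1, 2, 1, 2, 1, 2, 1, 2, 1, 2, 1, 2],
    "treatment": ["A", "B", "B", "A", "A", "B",
                  "B", "A", "A", "B", "B", "A"],
    "score":     [12.0, 15.0, 14.0, 11.0, 13.0, 16.0,
                  15.0, 12.0, 11.0, 14.0, 16.0, 13.0],
})

# Fixed effects for treatment, period, and sequence; a random intercept
# per subject accounts for the repeated measurements instead of treating
# each observation as independent.
model = smf.mixedlm("score ~ treatment + C(period) + sequence",
                    data, groups=data["subject"])
result = model.fit()
print(result.summary())
```

In this kind of model, a non-negligible sequence effect is often read as a warning sign of carryover, which is one reason an analysis that ignores the design (e.g., a plain between-groups t-test on all observations) can be inconsistent with a crossover experiment.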

Cited by 102 publications (96 citation statements)
References 31 publications
“…In the context of experiments and quasi-experiments, Vegas et al. [46] reviewed 39 papers using crossover designs (which are a form of repeated measures design) and found 58% of the papers did not use an analysis method consistent with the design, which could "compromise the validity of the findings". Papers that used the invalid analysis are valueless scientifically, unless their raw data is available for re-analysis.…”
Section: Problems With Empirical Software Engineering Practice
confidence: 99%
“…It can only ensure that the data and analysis methods are available for inspection and that the results presented in the paper can be derived from the data and analysis procedures. However, if it were adopted, we hope that design and analysis errors, such as those reported by Vegas et al. [46] and Shepperd et al. [39], would be more likely to be uncovered, hopefully prior to publication during the review process or soon after publication, as the full details will be available to all interested readers. Furthermore, it would provide a valuable resource for training novice researchers.…”
Section: Advantages Of Reproducible Research
confidence: 99%
“…The lack of information about the existence of such problems in SE is rather surprising. The SE methodological literature has not widely addressed this topic; only some works [23, 41, 52, 81] have scratched the surface. Researchers may not be aware of the existence of statistical errors, much less their prevalence and potential impact.…”
Section: Discussion
confidence: 99%
“…When we searched for SE papers related to statistical problems, we only found the following results: Dybå et al.'s paper regarding statistical power [23], Miller's paper on meta-analysis [52], and two papers by Kitchenham [41] and Vegas et al. [81] that focused on within-subject designs.…”
Section: Statistical Errors In SE
confidence: 99%