In recent years evaluators of educational and social programs have expanded their methodological repertoire with designs that include the use of both qualitative and quantitative methods. Such practice, however, needs to be grounded in a theory that can meaningfully guide the design and implementation of mixed-method evaluations. In this study, a mixed-method conceptual framework was developed from the theoretical literature and then refined through an analysis of 57 empirical mixed-method evaluations. Five purposes for mixed-method evaluations are identified in this conceptual framework: triangulation, complementarity, development, initiation, and expansion. For each of the five purposes, a recommended design is also presented in terms of seven relevant design characteristics. These design elements encompass issues about methods, the phenomena under investigation, paradigmatic framework, and criteria for implementation. In the empirical review, common misuse of the term triangulation was apparent in evaluations that stated such a purpose but did not employ an appropriate design. In addition, relatively few evaluations in this review integrated the different method types at the level of data analysis. Strategies for integrated data analysis are among the issues identified as priorities for further mixed-method work.
Four integrative data analysis strategies for mixed-method evaluation designs are derived from and illustrated by empirical practice: data transformation, typology development, extreme case analysis, and data consolidation/merging. The appropriateness of these strategies for different kinds of mixed-method intents is then discussed. Where appropriate, such integrative strategies are encouraged as ways to realize the full potential of mixed-methodological approaches.
Current stances on mixing methods and paradigms are described and critiqued. Ideas are offered for advancing mixed-method evaluation beyond age-worn paradigm debates.

Defining and Describing the Paradigm Issue in Mixed-Method Evaluation

This is an era of methodological pluralism in applied social science, including the field of evaluation. Multiple frameworks for inquiry abound. Interpretivist, postpositivist, activist, literary, feminist, and critical frameworks, among others, compete for our attention and allegiance. The dissonance and discord created by such competition (see Lincoln, 1991; Sechrest, 1992) are softened, to a degree, by continuing endeavors to embrace multiple methodologies within the same study or the same inquiry project (among many other works, see Brewer and Hunter, 1989; Bryman, 1988; Cook, 1985; Firestone, 1990; Fishman, 1991; Howe, 1985, 1988; Mark and Shotland, 1987; Maxwell, 1996; Reichardt and Rallis, 1994; Shadish, 1995). The work presented in this volume supports and advances these endeavors. Indeed, its premise is that using multiple and diverse methods is a good idea, but is not automatically good science. Rather, just as survey research, quasi-experimentation, panel studies, and case studies require careful planning and thoughtful decisions, so do mixed-method studies. Lacking justification and planning, mixed-method decisions may not be defensible.

Yet, just what is required for planned, defensible mixed-method decisions in evaluative inquiry? As in other inquiry logics and frameworks, what is required is thoughtful guidance at three important levels of inquiry decision making:

1. The political level, or the level of purpose, which encompasses the broad, value-based questions about the purpose and role of evaluation in society
2. The philosophical level, or the level of paradigm, which incorporates assumptions and stances about the social world and our ability to know it
The field of evaluation has shifted away from parochial debates and toward more ecumenical perspectives that seriously consider the potential of multiple, diverse inquiry methodologies to inform the purpose and practice of evaluative inquiry. Chelimsky (1997) identifies a superordinate goal for evaluation as it moves into the next century and into a more global context: the worldwide evaluation community must help respond to the urgent socioeconomic, political, and infrastructure needs that changing political and policy contexts have newly created. Chelimsky argues that part of our response must involve confronting two long-standing tensions in the evaluation field. The first concerns the different purposes for evaluation and the different methodological emphases they imply; the second concerns different views about the use of evaluation and different conceptions of the evaluator's role. To address our methodological problems, Chelimsky recommends an extended vision of evaluation, more skepticism about our methods, and the initiation of a "constructive dialogue in which we seek to correct weaknesses, not exacerbate them" (p. 23).

Chapter One of this volume builds on a broad, ecumenical view of methods to initiate what we hope will be a "constructive dialogue" about the nature and influence of paradigms in framing mixed-method approaches to evaluation. The key tenet advanced is the need to move beyond debating paradigmatic differences that may well be irreconcilable and to focus instead on joining the critical features of our evaluative claims that represent distinct traditions. Joining such critical features can help to generate more relevant, useful, and discerning inferences. In this chapter, we relate the conceptual ideas advanced in Chapter One to several practical mixed-method design alternatives, accompanied by examples.