Contribution analysis is a structured approach to theory-based impact evaluation, originally developed in Canada in the context of Results-Based Management (RBM), although there have been few examples of contribution analysis in practice since Mayne's original paper (2001). We argue that contribution analysis adds value to other theory-based evaluation approaches by providing a more structured and rigorous approach to participatory evaluation planning, data analysis, and reporting. It can be applied in the context of participatory strategic planning and performance monitoring as well as impact evaluation. Examples are drawn from the performance contexts of RBM in Canada and Outcomes-Based Accountability (OBA) in Scotland. The authors argue that, as a participatory process, contribution analysis strengthens both conceptual and practical understanding of planning and managing for outcomes, and of implementation and change theories, thus helping to build collaborative capacity within and across partner organizations. For public managers, the contribution analysis process has strong appeal and practical value when they face the task of demonstrating the contribution of a single organization to addressing complex social issues while working in partnership with other public agencies under multiple accountabilities.
The measurement and evaluation of research, technology, and development (RT&D) has gone through distinct phases over the past 50 years. Over time, high-level measures such as total expenditures on R&D, overall citations, and patent production have given way to more contextualized metrics that recognize the inherent differences among innovation subject areas and the need to demonstrate mission achievement. This article shows how recently proposed Canadian Academy of Health Sciences (CAHS) metrics were adapted to help frame a case study conducted by the Canadian Cancer Society Research Institute (CCSRI). Early results suggest that the framework provides a useful structure both to display a hierarchy of results focused on mission goals and to build an attributable RT&D and innovation story over time. With this work and other recent developments, evaluation appears poised to go beyond retrospective justification and become a fully legitimate part of strategic learning for RT&D initiatives.
A brief review of evaluation findings in almost any given domain typically reveals that most, and sometimes all, major findings deal with the implementation of initiatives—also known as action theory. Moreover, the findings regarding implementation frequently point to mismatches between the type or level of implementation occurring and the fundamental nature of the initiative. Case examples illustrate that, while not all permutations and combinations of change and action theories can be assessed here, case analysis can yield lessons suggesting that some combinations are essentially toxic, while others provide at least a reasonable chance of success. The implication is that further systematic coding and analysis of change theories, action theories, and in particular their combinations in programs could produce useful insights for both evaluation and public-policy decision making.