Abstract: This methodological review considers science festival evaluation and research studies that have been published in the peer-reviewed literature since 2011, when modern-day science festivals were defined formally. Since that time, the number of science festivals around the world has increased dramatically. The methods and results used to study science festivals are summarized in order to reflect on existing work within this growing sector. The existing literature base is then positioned in relation to recent rec…
“…The studies’ frequent use of evaluation forms is in line with observations of overreliance on audience self-report [21]. However, the higher quality studies used a wider range of evaluation methods.…”
Section: Discussion (mentioning)
confidence: 85%
“…Evaluation of the PE impact on children is often mediated by adults (e.g. teachers and parents), and uses different delivery formats, purposes, venues and times compared to adult-orientated PE events [21]. Studies of mixed populations of families/children and adults were included if the adult data could be extracted for the synthesis.…”
Section: Methods (mentioning)
confidence: 99%
“…With the proliferation of PE activity comes the need to understand how specific types of PE such as festivals work, who they work for and why [18]. Good quality evaluation of science and health-related festivals, with reflection and learning from current evaluation practice, is therefore essential [19–21]. A previous review of science festival evaluation by Peterman and colleagues [21] examined the methods and results reported in published science festival evaluations and research.…”
Section: Introduction (mentioning)
confidence: 99%
“…Good quality evaluation of science and health-related festivals, with reflection and learning from current evaluation practice, is therefore essential [19–21]. A previous review of science festival evaluation by Peterman and colleagues [21] examined the methods and results reported in published science festival evaluations and research. Their review examined the literature from an expert standpoint within the context of visitor studies and informal science learning; however, they did not use systematic review methods, included only evaluations published after 2011, and excluded studies of individual activities within festivals.…”
Section: Introduction (mentioning)
confidence: 99%
“…While guidance is available for researchers evaluating a PE event [22, 23], PE evaluation efforts have been criticised for poor design, execution, and interpretation [20], for example, use of a restricted range of evaluation methods [21], and using evaluation as a token activity to justify funding [24]. The Queen Mary University of London (QMUL) public engagement evaluation toolkit [25, 26] has been developed as an open-access, pragmatic, generic toolkit applicable to diverse forms of academic PE and proposed as a “common ‘evaluation standard’” [27].…”
The evaluation of public engagement health festivals is of growing importance, but there has been no synthesis of its practice to date. We conducted a systematic review of evidence from the evaluation of health-related public engagement festivals published since 2000 to inform future evaluation. Primary study quality was assessed using the Mixed Methods Appraisal Tool. Extracted data were integrated using narrative synthesis, with evaluation methods compared with the Queen Mary University of London public engagement evaluation toolkit. In total, 407 database records were screened; eight studies of varied methodological quality met the inclusion criteria. Evaluations frequently used questionnaires to collect mixed-methods data. Higher quality studies had specific evaluation aims, used a wider variety of evaluation methods, and had independent evaluation teams. Evaluation sample profiles were often gender-biased and not ethnically representative. Patient involvement in event delivery supported learning and engagement. These findings and recommendations can help improve future evaluations. (Research Registry ID reviewregistry1021).
Abstract: Evaluations add important value to science communication, because their results make it possible to design future science communication in a goal-oriented and effective way. At present, however, the evaluation of science communication in Germany still faces challenges. Problems arise even before evaluations begin, owing to a lack of strategic planning of science communication. In addition, evaluations often lack suitable evaluation designs and appropriate data collection methods. Finally, the view of evaluation that prevails in German science communication practice hinders a collective and constructive learning process for science communication. These challenges must be overcome so that evaluation, as a collective process of reflection, can contribute to the constructive further development of science communication.