2009
DOI: 10.1016/j.evalprogplan.2009.01.001

Planning and implementation of a participatory evaluation strategy: A viable approach in the evaluation of community-based participatory programs addressing cancer disparities

Abstract: Community-based participatory research (CBPR) has been posited as a promising methodology for addressing health concerns at the community level, including cancer disparities. However, the major criticism of this approach is the lack of scientifically grounded evaluation methods for assessing the development and implementation of this type of research. This paper describes the process of developing and implementing a participatory evaluation framework within a CBPR program to reduce breast, cervical, and colorectal cance…


Cited by 42 publications (37 citation statements)
References 20 publications
“…25–27 Participants were asked to state their agreement (i.e., “I disagree,” “I am not sure,” “I agree”) with each of four statements, such as “Pap tests are important for a woman your age.” Negatively stated items were reverse coded, and a mean score was calculated, with a higher score indicating a more positive attitude.…”
Section: Methods
confidence: 99%
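The scoring procedure quoted above (code agreement responses, reverse-code negatively worded items, then average) can be sketched as follows. This is an illustrative reconstruction, not the authors' actual code; the 1–3 coding and the item identifiers are assumptions.

```python
# Hypothetical sketch of the attitude scoring described above: responses on a
# 3-point agreement scale are coded 1-3, negatively worded items are reverse
# coded (1<->3, 2 unchanged), and the score is the mean across items.
CODING = {"I disagree": 1, "I am not sure": 2, "I agree": 3}

def attitude_score(responses, negative_items):
    """responses: list of (item_id, answer) pairs; negative_items: set of ids."""
    values = []
    for item, answer in responses:
        value = CODING[answer]
        if item in negative_items:
            value = 4 - value  # reverse code: 1<->3, 2 stays 2
        values.append(value)
    return sum(values) / len(values)

score = attitude_score(
    [("q1", "I agree"), ("q2", "I disagree"),
     ("q3", "I am not sure"), ("q4", "I agree")],
    negative_items={"q2"},
)
print(score)  # -> 2.75 (q2 reverse-coded to 3; mean of [3, 3, 2, 3])
```

A higher mean then indicates a more positive attitude, as the excerpt states.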
“…26,27 Participants responded to each item (e.g., “You can get a health care professional to give you a Pap test”), using a three-point scale ranging from “I am not sure” to “I am sure.” The mean of all items was used as the self-efficacy score.…”
Section: Methods
confidence: 99%
“…We have continued to improve the evaluation process over time, and now use a modified RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance) framework (Glasgow, Klesges, Dzewaltowski, Estabrooks, & Vogt, 2006) to guide our evaluation. A logic model (Fielden et al., 2007; Sanchez, Carrillo, & Wallerstein, 2011; Sandoval et al., 2012; Scarinci, Johnson, Hardy, Marron, & Partridge, 2009) in Figure 1 summarizes the overall plan.

Reach: To evaluate the reach of the CES-P, we monitor (a) the number/types of participants who inquire about the program (via phone, e-mail, and/or information sessions) and their representative organizations; (b) the number/types of participants who apply for the program and their representative organizations; and (c) the types of participants who are selected, including organizations, areas of health interest, experience in CBPR, and previous history/experience of the CBPR partnerships.

Effectiveness: To evaluate the effectiveness of the CES-P, we use standardized evaluation tools for each training session (content, expertise of speakers, usefulness, etc.)

…”
Section: Program Evaluation
confidence: 99%
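The reach monitoring described in the excerpt amounts to tallying participants at each stage (inquired, applied, selected) by the type of organization they represent. A minimal sketch, assuming a simple log of (stage, organization-type) records; the stage and organization names are illustrative only:

```python
# Hypothetical sketch of reach monitoring: count participants per stage,
# broken down by the type of organization they represent.
from collections import Counter

def reach_summary(records):
    """records: iterable of (stage, organization_type) pairs."""
    by_stage = {}
    for stage, org_type in records:
        by_stage.setdefault(stage, Counter())[org_type] += 1
    return by_stage

log = [
    ("inquired", "faith-based"), ("inquired", "clinic"),
    ("applied", "faith-based"),
    ("selected", "faith-based"),
]
summary = reach_summary(log)
print(summary["inquired"]["faith-based"])  # -> 1
```

Comparing counts across stages (inquired vs. applied vs. selected) is what lets the evaluators characterize who the program is reaching, as the excerpt describes.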
“…Community participation is of particular value in an established participatory evaluation framework (Scarinci et al., 2009). Saunders et al. (2005) define the basic elements of a comprehensive evaluation as fidelity, dose delivered, dose received, reach, recruitment, and context, which measure the degree to which a programme has been delivered as intended.…”
Section: CBPR and Related Concepts
confidence: 99%