1970
DOI: 10.1111/j.1939-0025.1970.tb01097.x
Dilemmas in evaluation: Implications for administrators of social action programs.

Abstract: This paper suggests four basic questions that administrators of social action programs should consider in making decisions about program evaluation: Evaluation for what? For whom? By whom? At what cost? In addition, the authors introduce a notion of differential evaluation for different stages of program development.

Cited by 5 publications (2 citation statements)
References 7 publications
“…A primary focus in program evaluation relevance is the immediate utility of the relationship, while research has much less concern for utility, except as a long-term by-product (Guttentag, 1971; Nottingham, 1973; Schulberg, 1972). Some authors adopt a moderate perspective and suggest that experimental methods are simply one of many approaches to program evaluation (Crabbs & Crabbs, 1977; Pine, 1975; Tripodi et al., 1970). Warner (1975a, 1975b) cautions, for example, that sophisticated research and statistical methods are not the only means of program evaluation.…”
Section: Relevance
confidence: 99%
“…Program evaluation utilizes both "hard" and "soft" data (Lorei & Schroeder, 1975). Some of the common methods used in counseling program evaluation procedures to obtain "hard" data include intensive designs (Anton, 1978; Miller & Warner, 1975; Thoresen & Anton, 1973); case studies (Tripodi, Epstein, & MacMurray, 1970; Weiss & Rein, 1970); and epidemiological studies (Tripodi et al., 1970). "Soft" data methods include unobtrusive techniques (Caro, 1971; Cope & Kunce, 1971); satisfaction surveys; status studies; and follow-up questionnaires (Crabbs & Crabbs, 1977; Pine, 1975).…”
confidence: 99%