
PM 536: Program Evaluation

Master of Public Health Program
Department of Preventive Medicine
Keck School of Medicine
University of Southern California

Professor: Thomas Valente, PhD
1000 South Fremont Ave., Bldg. A, Rm. 5133
phone: (626) 457-6678
fax: (626) 457-6699
email: tvalente@usc.edu

Time: Tuesdays, 12:30-4:00 pm
Location: Alhambra campus (HSA) Room 7059

Course Description

The course examines the concepts, tools, data collection and analysis methods, and designs used to evaluate health promotion programs. Examples come from domestic and international substance abuse prevention programs, family planning, and reproductive health programs. The goal of the course is to enable students to conduct competent and interesting evaluations of health-related programs.

The course consists of reading materials, class discussions, computer and data analysis assignments, and two exams. Data analysis will be conducted using the SAS statistical package, which is available on campus computers; data will be provided from the instructor's existing projects.

Learning Objectives: Students who complete this course will be able to:

1. Understand evaluation research and the academic literature it generates.

2. Conduct project evaluations on their own and in collaboration with other individuals and organizations.

3. Know how to design an evaluation including questionnaire construction, sampling frames, data collection, data management and analysis.

4. Understand the relationship between evaluation and program development and modification.

5. Develop a stronger appreciation for the rigors of evaluation research; the difficulties of data analysis and proper impact interpretation; and the opportunities for theory construction within an evaluation framework.

Required Texts

1. Valente, T. W. (2002). Evaluating Health Promotion Programs. Oxford University Press.

2. Supplementary readings to be purchased at the bookstore.

3. Copies of Lecture Overheads to be distributed in class.

4. Lab Assignments to be distributed in class.

Exams & Assignments (Proportion of Grade)

1) 12 Computer Lab Assignments @ 3% each: 36%
   Labs are due at the beginning of the class day indicated. Assignments
   submitted after class has begun will have one point deducted; those
   submitted the day after class, two points; two or more days after
   class, three points.

2) Class study task: 4%

3) 2 Exams @ 30% each: 60%

Each exam will be conducted in-class and will consist of short-answer questions.

Week by Week Outline

Week 1 (8/26/03). Why and What to Evaluate; Needs assessment: The first week of class addresses the purposes of evaluation: why evaluate, and for whom? We then introduce the language, terminology, and some notable frameworks for evaluation. Needs assessments are conducted to determine the nature and scope of the problem a program addresses and to provide information for setting goals and objectives.

Valente, Chapters 1 & 2

Fisher et al., Chapters 1 & 2

Week 2 (9/02/03). Behavior change theory: Behavior change theory provides the basis for program goals and objectives, and indicates what concepts should be changed in order to bring about behavior change. The use of theory for setting goals and objectives is also discussed.

Valente, Chapter 3

TBD

Lab 1: "Formative Research" Due next week.

Week 3 (9/09/03). Formative/Process research: Formative research informs intervention design and planning, drawing on qualitative techniques for evaluation and for researching audience factors. Process research is used to understand how programs are implemented by monitoring their delivery.

Valente, Chapters 4 & 5

Ward, V.M., Bertrand, J. T., & Brown, L. F. (1991). The comparability of focus group and survey results: Three case studies. Evaluation Review, 15, 266-283.

Lab 2: "Intro to SAS/STAT." Short assignment, due next week.
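
Since the labs use SAS, a first session typically amounts to reading a small dataset and inspecting it. The sketch below is illustrative only; the dataset and variable names are hypothetical placeholders, not the actual lab files.

   * Read a small raw dataset in-line and inspect it;
   data demo;
      input id age gender $ score;
      datalines;
   1 24 F 85
   2 31 M 78
   3 28 F 92
   ;
   run;

   * Descriptive statistics for the numeric variables;
   proc means data=demo n mean std min max;
      var age score;
   run;

   * Frequency table for the categorical variable;
   proc freq data=demo;
      tables gender;
   run;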

Week 4 (9/16/03). Study Designs: Week four addresses study designs that are the heart and soul of evaluation research. What are study designs and what do they mean? What is the difference between experimental and quasi-experimental designs?

Valente, Chapter 6

Fisher et al., Chapter 7

Bauman, K. E., Viadro, C. I., & Tsui, A. (1994). On the use of true experimental designs for family planning program evaluation: A commentary on merits, problems and solutions. International Family Planning Perspectives, 20(3), 108-113.

Fisher, A., & Carlaw, R. W. (1983). Family planning field research projects: Balancing internal against external validity. Studies in Family Planning, 14, 3-8.

Lab 3: "Experimental designs (appending/merging data)" due next week.
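
Lab 3 pairs study design with the SAS mechanics of combining files: appending stacks observations, while merging joins variables by a key. A minimal sketch, with hypothetical dataset and variable names:

   * Append: stack treatment-site and comparison-site records
     that share the same variables;
   data allsites;
      set treatsite compsite;
   run;

   * Merge: attach group assignment to survey responses by id;
   proc sort data=survey; by id; run;
   proc sort data=assign; by id; run;

   data merged;
      merge survey(in=insurv) assign(in=inassn);
      by id;
      if insurv and inassn;   * keep ids present in both files;
   run;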

Week 5 (9/23/03). Sample size and power analysis: How does one decide what kind of data, and how much of it, to collect for an evaluation? What sampling strategies are available, and how do you decide which is best? We introduce various sampling strategies and formulas for calculating sample size. What are the steps to adequate sample selection, and how does one then collect "good" evaluation data?

Valente, Chapter 7

Fisher et al., Chapters 8-9

Lab 4: "Power Analysis" due next week.
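
As one concrete example, the standard normal-approximation formula for comparing two proportions, n per group = (z[1-a/2] + z[1-b])^2 [p1(1-p1) + p2(1-p2)] / (p1 - p2)^2, can be computed directly in a DATA step. The target proportions below are made up for illustration, and the formula omits a continuity correction.

   * Sample size per group, two-sided alpha = .05, power = .80;
   data ssize;
      p1 = 0.30;                 * expected proportion, comparison group;
      p2 = 0.40;                 * expected proportion, program group;
      z_alpha = probit(0.975);   * 1.96 for two-sided alpha = .05;
      z_beta  = probit(0.80);    * 0.84 for 80% power;
      n = ceil( (z_alpha + z_beta)**2
                * (p1*(1-p1) + p2*(1-p2)) / (p1 - p2)**2 );
      put 'Required n per group: ' n;
   run;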

Week 6 (9/30/03). Data collection: Here we cover the procedures for questionnaire construction and instrument design. The evidence on what works and what does not is reviewed. We also cover the various types of questions, their advantages and disadvantages.

Valente, Chapter 8

Lab 5: "Data manipulation & Scale creation (summing, norming, true/false scales)"
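
A typical Lab 5 task is collapsing individual items into a summary score. The sketch below sums hypothetical five-point knowledge items and rescales the sum to 0-100; all variable names are placeholders.

   data scaled;
      set survey;
      * Sum five Likert items scored 1-5 (sum ranges 5-25);
      know_sum  = sum(of know1-know5);
      * Norm the sum to a 0-100 scale;
      know_norm = 100 * (know_sum - 5) / 20;
      * True/false items scored 1/0 can simply be summed;
      tf_score  = sum(of tf1-tf10);
   run;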

Week 7 (10/7/03). Scale construction: Scaling provides a way to measure the validity and reliability of measurements and to construct more valid and reliable measures. The procedures for constructing scales are presented.

Valente, Chapter 9

Kumpusalo, E., Neittaanmaki, L., Mattila, K., et al. (1994). Professional identities of young physicians: A Finnish national survey. Medical Anthropology Quarterly, 8, 69-77.

Lab 6: "Factor Analysis"
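
For Lab 6, an exploratory factor analysis followed by a reliability check is the usual workflow. A sketch, assuming ten hypothetical attitude items att1-att10:

   * Exploratory factor analysis with varimax rotation;
   proc factor data=survey method=principal rotate=varimax
               nfactors=2 scree;
      var att1-att10;
   run;

   * Cronbach's alpha for internal-consistency reliability;
   proc corr data=survey alpha nomiss;
      var att1-att10;
   run;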

Week 8 (10/14/03). Statistical analysis: Using statistics to analyze data involves a number of steps and processes. This week covers four basic statistical techniques used most frequently for program evaluation. We cover bivariate and multivariate techniques, emphasizing their use in determining program impact.

Valente, Chapter 10

Midterm Exam

Lab 7: "Measuring Impact"
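
The bivariate/multivariate distinction can be illustrated in two short steps: a t-test comparing program and comparison groups, then a regression that adds covariates. Dataset and variable names below are hypothetical.

   * Bivariate: compare mean outcome by treatment group (0/1);
   proc ttest data=eval;
      class treat;
      var outcome;
   run;

   * Multivariate: outcome on treatment plus background covariates;
   proc reg data=eval;
      model outcome = treat age educ baseline;
   run;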

Week 9 (10/21/03). Measuring program exposure: Program exposure is a key variable needed to measure program impact. This week we present the different ways exposure can be measured; exposure measures are then used to determine impact.

Valente, Chapter 11

Lab 8: "Exposure Scales"
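
Exposure is often operationalized as a dose score built from recall items. A minimal sketch with hypothetical 1/0 recall indicators:

   data expo;
      set survey;
      * Dose: count of channels recalled (0-3);
      dose = sum(exp_radio, exp_poster, exp_talk);
   run;

   proc freq data=expo;
      tables dose;
   run;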

Week 10 (10/28/03). Impact analysis for cross-sectional studies: How the heck do you analyze data to measure impact? It requires talent, a considerable amount of patience, a flair for the dramatic, and undying optimism. These attributes, plus a good deal of luck and time and an absence of catastrophes, will enable you to analyze data to determine program impact. This week covers the steps in the process and the tricks of the trade, using datasets from recent and ongoing evaluation projects and focusing on measuring impact with cross-sectional data.

Valente, Chapter 12

Valente, T. W., Kim, Y. M., Lettenmaier, C., Glass, W., & Dibba, Y. (1994). Radio and the promotion of family planning in The Gambia. International Family Planning Perspectives, 20, 97-100.

Bertrand, J. T., Santiso, R., Linder, S. H., & Pineda, M. A. (1987). Evaluation of a communications program to increase adoption of vasectomy in Guatemala. Studies in Family Planning, 18, 361-370.

Lab 9: "Measuring Impact 1 (NRHP data)"
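
For a binary outcome in cross-sectional data, impact is commonly estimated as the association between exposure and the behavior, adjusting for background characteristics. A sketch with hypothetical names (not the actual NRHP variables):

   * Odds of the behavior by exposure dose, adjusted;
   proc logistic data=expo descending;
      model behavior = dose age educ parity;
   run;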

Week 11 (11/04/03). Impact analysis for panel studies: This week covers impact assessment using panel data, which can be used to construct change scores and to conduct lagged analyses. Panel data analysis techniques can also be applied to cross-sectional data at the community level.

Valente, Chapter 12

Valente, T. W., Poppe, P. R., Alva, M. E., Vera de Briceño, R., & Cases, D. (1995). Street theater as a tool to reduce family planning misinformation. International Quarterly of Community Health Education, 15, 279-290.

Valente, T. W., & Bharath, U. (1999). An evaluation of the use of drama to communicate HIV/AIDS information. AIDS Education and Prevention.

Lab 10: "Measuring impact 2 (India - Nalamdana Street Theater Data)"
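
Panel analysis starts by linking the waves. The sketch below merges hypothetical baseline and follow-up files by respondent id, computes a change score, and fits a lagged model (the time-2 outcome regressed on time-1 measures and exposure); all names are placeholders.

   proc sort data=wave1; by id; run;
   proc sort data=wave2; by id; run;

   data panel;
      merge wave1(rename=(attitude=att_t1 behavior=beh_t1))
            wave2(rename=(attitude=att_t2 behavior=beh_t2));
      by id;
      att_change = att_t2 - att_t1;   * change score;
   run;

   * Lagged analysis: t2 behavior on t1 behavior, t1 attitude, exposure;
   proc reg data=panel;
      model beh_t2 = beh_t1 att_t1 exposure;
   run;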

Week 12 (11/11/03). Advanced statistical analysis: Thus far the course has considered experimental and quasi-experimental research designs for evaluating programs. This week we address other issues and techniques used for evaluation. Path analysis and structural equation modeling provide methodologies for testing measurement and theoretical models simultaneously, and for considering multiple dependent variables at once.

Valente, Chapter 13

Valente, T. W., Paredes, P., & Poppe, P. R. (1998). Matching the message to the process: Behavior change models and the KAP gap. Human Communication Research, 24, 366-385.

Parrott, R., Monahan, J., Ainsworth, S., & Steiner, C. (1998). Communicating to farmers about skin cancer: The behavior adaptation model. Human Communication Research, 24, 386-409.

Lab 11: "Measuring Impact 3"
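
In SAS, path models of this kind can be fit with PROC CALIS. The sketch below specifies a simple two-equation path model (exposure to attitude to behavior, plus a direct path); exact PROC CALIS syntax varies across SAS versions, and all variable and parameter names are hypothetical.

   proc calis data=eval method=ml;
      lineqs
         attitude = b1 exposure + e1,
         behavior = b2 attitude + b3 exposure + e2;
      std
         e1 = ve1,
         e2 = ve2;
   run;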

11/18/03: No class (APHA Annual Meeting).

Week 13 (11/25/03). Advanced statistical analysis, part 2: Time series and event history analysis for program evaluation are presented.

Kincaid, D. L., et al. (1996). Impact of a mass media vasectomy promotion campaign in Brazil. International Family Planning Perspectives, 22, 169-175.

Hausman, A. J., Spivak, H., & Prothrow-Stith, D. (1995). Evaluation of a community-based youth violence prevention project. Journal of Adolescent Health, 17, 353-359.

Lab 12: "Measuring Impact 4"
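
Event history (survival) analysis treats time-to-event as the outcome. A minimal sketch using PROC PHREG, where time is months from campaign launch to adoption and adopt = 0 marks censored cases; the dataset and variable names are hypothetical.

   * Proportional hazards model of time to adoption;
   proc phreg data=events;
      model time*adopt(0) = exposure age educ;
   run;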

Week 14 (12/2/03). Dissemination & Review: The ultimate objective of many evaluations is to improve the delivery of health-related programs. The process of directly improving program procedures and aiding program decision-making is sometimes called feedback; here we choose "feed-forward" as an apt description of the process. How do we use our research results to re-plan and revise our programs? How do we publish the results of our efforts?

Valente, Chapter 14

Fisher et al., Chapters 10 & 11

Valente, T. W., & Saba, W. (1998). Mass media and interpersonal influence in a reproductive health communication campaign in Bolivia. Communication Research, 25, 96-124.

Week 15 (12/09/03). Final Exam.
