FUNDAMENTALS OF PROGRAM EVALUATION Syllabus
Course Learning Objectives
Upon successfully completing this course, students will be able to:
- Describe the purpose of different types of evaluation
- Design a logic model that explains program impact, based on program activities and objectives
- Develop indicators based on the logic model
- Identify sources of data at the program and population level corresponding to different types of evaluation
- Describe the purpose of formative research and identify the most common methods
- Describe the components of a process evaluation and the most common methods used in a process evaluation
- Describe the elements of experimental and quasi-experimental designs, and explain how they address the threats to validity
- Design an evaluation plan
Familiarizes students with different types of program evaluation, including formative research, process evaluations, monitoring of outputs and outcomes, impact assessment, and cost analysis. Students gain practical experience through a series of exercises involving the design of a project logic model, development of indicators, design of a process evaluation, and selection of a study design for an impact evaluation. Covers experimental, quasi-experimental, and non-experimental study designs, including the strengths and limitations of each.
Intended Audience: Masters and Doctoral Students
Additional Faculty Notes:
This course is designed for students interested in acquiring knowledge and skills required for designing and implementing evaluations of program interventions. Program evaluation is an important component of public health, because it allows managers to identify ways to improve their programs mid-course. Also, it allows evaluators (1) to determine if change occurs consistent with the program objectives, and (2) to establish that the program intervention caused the observed change.
This course provides students at the masters level with the basic skillset required to conduct M&E (monitoring and evaluation) in the context of public health/social development programs. It provides doctoral students with this same set of skills, which combined with other coursework in biostatistics, research methods, and data analysis will allow them to conduct more advanced analyses of program impact.
Methods of Assessment
The student's grade will be based on four exercises. If the final grade is borderline between two grades (e.g., an A and a B), class participation will favor awarding the higher grade.
Additional Faculty Notes:
There are no prerequisites to this course.
There is no required textbook. The required reading will consist of articles and manuals described on the course syllabus.
*Note: One classic evaluation textbook is Rossi, P., M. Lipsey, and H. Freeman, Evaluation: A Systematic Approach, 7th edition, 2003. Thousand Oaks, CA: Sage. It is not required reading for this class. Students intending to specialize in program evaluation may want to purchase this textbook and read relevant chapters that parallel the syllabus.
Please see the course Sessions page for a full list of dates and items for this course.
Files from the Online Library
Academic Ethics Code
Students enrolled in the Bloomberg School of Public Health of The Johns Hopkins University assume an obligation to conduct themselves in a manner appropriate to the University's mission as an institution of higher education. A student is obligated to refrain from acts which he or she knows, or under the circumstances has reason to know, impair the academic integrity of the University. Violations of academic integrity include, but are not limited to: cheating; plagiarism; knowingly furnishing false information to any agent of the University for inclusion in the academic record; violation of the rights and welfare of animal or human subjects in research; and misconduct as a member of either School or University committees or recognized groups or organizations.
Welcome to the Fundamentals of Program Evaluation course! As the name implies, this course is designed for students with little or no monitoring and evaluation knowledge and skills. It is meant to serve as a building block for other courses in the department and the school that teach monitoring and evaluation. My philosophy in teaching is that we learn best by doing -- especially when it comes to monitoring and evaluation. Therefore, the class is heavily focused on group work. I like to first teach a concept via lecture and then have students practice that concept with a small group activity. The assignments are meant to reinforce those key concepts and skills.
Dr. Mmari's office hours: Tuesdays, 12:30 - 1:30 pm; Fridays, 12:15 - 1:15 pm
Assignment Descriptions and Guidelines
Each student in the class will prepare an evaluation plan that fulfills the requirements for this class. The plan will contain four parts, each of which represents a separate assignment that counts toward the final grade. The topics to cover in each section of the Evaluation Plan (and the amount each counts toward the final grade) are as follows:
- Description of project, creation of logic model (25%)
- Development of indicators (25%)
- Process evaluation (25%)
- Impact evaluation (25%)
Note: an evaluation plan does not necessarily have to conform to this outline, but we will use this outline as one approach to developing such a plan.
This year, students will be able to choose from three projects to use for all four assignments. Students who already know a project well that has not yet been evaluated are welcome to use it, provided it is not a 'systems level' type of project and it is approved by the instructor. The project can be based internationally or domestically. There should be a specified target population and behavior(s) or disease(s) that the program is trying to change. All assignments must be based on the same project; no switching between projects across the different parts of the assignment.
All assignments should use at least 10-point fonts (12-point is preferred for text, while 10-point may be used for logic models and matrices).
Below are the specific instructions for each assignment. Students are expected to do their own individual work; no two papers should present identical or highly similar descriptions of the methodology.
Each section will be graded only once. Students may incorporate comments on one assignment into the next version of their evaluation plan (e.g., revise the project logic model based on feedback received) for their own benefit, but the grade for each segment of the plan will be specific to the segment/assignment in question. Be sure to spell-check your work before submission. Write your name on each page of your assignment.
For all assignments, please submit via CoursePlus. The deadlines are:
#1 Logic model Wednesday, February 5 (noon)
#2 Indicators Monday, February 17 (noon)
#3 Process evaluation Friday, February 28 (noon)
#4 Impact evaluation Friday, March 15 (noon)
Assignment #1. Describing the Program and Creating a Logic Model
Purpose of the assignment: to develop the skill of presenting a program or project from an evaluation perspective.
Topics to cover:
- Goal and Objectives of the program (project)
- Description of the program/project intended to achieve the objectives
- Logic model that shows how the program/project is expected to achieve the objectives.
Page limits: One page for intervention description (parts A and B); one page for project logic model (part C).
Instructions and tips:
A. Goal and Objectives:
The task is to learn how to phrase the intended goal and objectives of the program/project in your plan
- Write one goal that the program/project seeks to achieve;
- Write 3 SMART objectives that target the outcomes the project seeks to address
B. Description of the intervention:
This section explains to the reader the different activities that will be carried out with the aim of achieving the program objectives. This information is important for understanding the program and (later) designing the process evaluation. At least three components should be presented as part of the description:
- Need – what’s the public health problem that the program/project is targeting?
- Target population – which individuals, groups, organizations are being targeted for the program/project?
- Activities – what will the project do to reach objectives?
C. Project Logic Model:
The project logic model is the most challenging aspect of this assignment. The following are instructions for completing this part of the assignment:
- On a single page, draw a project logic model to illustrate how the program is expected to achieve its long-term objective(s).
- Draw the model in terms of inputs, activities/processes, outputs, outcomes, and impact
- Don’t include any narrative with the logic model (i.e., paragraphs that explain the model). The figure should be self-explanatory.
- You may discuss this exercise with others in the class; however, the final product must be your own work. (Two identical logic models would be suspect.)
- A computer-generated figure is preferred but not required. You may submit a hand-drawn figure, but the wording must be legible, and must be scanned into electronic format. Please see one of the TAs if you are having difficulty (but please not on the day that the assignment is due).
- Be as clear and concise as possible. (Test your diagram: is it easy to understand the main ideas at a glance?)
Assignment #2. Selecting Indicators
Purpose of the assignment: to translate the inputs, processes, outputs, outcomes, and impact constructs from the project logic model (Assignment #1) into measurable indicators.
Number of indicators: ten indicators, two for each type (input, process, output, outcome, impact)
- Using the project logic model created in the first assignment, translate each input, process, output, outcome(s), and impact into measurable indicators.
- Present the indicators in a table with a self-explanatory title and four columns with the following information:
- Type of indicator (input, process, output, outcome and impact)
- Specific indicator
- Operational definition of any variable or concept that needs further clarification
- Source of data
Example table:

| Type of indicator | Specific indicator | Operational definition | Source of data |
|---|---|---|---|
| Input | Existence of funding | | Contract, budget, grant |
| Process | No. of radio spots produced per month | | Key informant interviews |
| Output | No. of outlets that sell/provide condoms to youth in catchment area | | |
| Outcome | Percent of commercial sex workers who used a condom at last sex | (# of commercial sex workers who used a condom at last sex / total # of commercial sex workers) x 100 | Survey among commercial sex workers |
| Impact | Neonatal mortality rate | (# of neonatal deaths, i.e., deaths of live births within the first 28 completed days of life / # of live births) x 1,000 | |
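The outcome and impact indicators above are simple ratios. As an illustrative sketch (the function names and sample numbers are hypothetical, not drawn from any real dataset), they could be computed as:

```python
def percent_condom_use_last_sex(n_used_condom, n_total):
    """Outcome indicator: percent of commercial sex workers
    who used a condom at last sex (numerator / denominator x 100)."""
    return 100 * n_used_condom / n_total

def neonatal_mortality_rate(n_neonatal_deaths, n_live_births):
    """Impact indicator: neonatal deaths (deaths of live births within
    the first 28 completed days of life) per 1,000 live births."""
    return 1000 * n_neonatal_deaths / n_live_births

# Hypothetical survey and vital-statistics numbers, for illustration only
print(percent_condom_use_last_sex(180, 240))  # 75.0 (percent)
print(neonatal_mortality_rate(12, 800))       # 15.0 (per 1,000 live births)
```

Note the different multipliers: percentages are expressed per 100, while the neonatal mortality rate is conventionally expressed per 1,000 live births.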
- The indicator is the concise description of what you intend to measure. The operational definition of the variable or concept explains how you will measure it. Let’s use the concept of stigma. The indicator could be:
- Percent of adults 15-49 years old that stigmatize people living with HIV/AIDS
This indicator clearly explains the concept we are trying to measure, but it doesn’t indicate HOW we are going to measure it. Thus, it is necessary to add an operational definition:
- Stigma will be measured using 4 indicators from the DHS survey; respondents that express negative attitudes toward PLHA on at least two of the four indicators will be classified as stigmatizing people living with HIV/AIDS.
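As a sketch of how an operational definition like this might be applied to individual survey responses (the function name, data encoding, and example values are illustrative assumptions, not actual DHS variables):

```python
def stigmatizes(responses, threshold=2):
    """Classify a respondent as stigmatizing people living with HIV/AIDS
    if they express negative attitudes on at least `threshold` of the
    four indicator items.

    `responses` is a list of four booleans: True = negative attitude
    expressed on that item.
    """
    return sum(responses) >= threshold

# Illustrative respondents (hypothetical data)
print(stigmatizes([True, True, False, False]))   # True  (negative on 2 of 4)
print(stigmatizes([True, False, False, False]))  # False (negative on 1 of 4)
```

The indicator itself would then be the percent of respondents aged 15-49 for whom this classification is True.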
Another example (indicator):
- Percent of young women 15-24 that work outside the home.
- “working outside the home” will be measured as “earning any type of financial remuneration in the past month in return for products or services provided outside the home.”
Note: different studies may have different operational definitions for the same concept, such as “working outside the home.” What is important is your ability to state clearly how you plan to measure the concept in the context of this assignment. The “right” and “wrong” answers relate to the clarity and logic of your explanation, not to some “universally correct answer.” (Note: in fact, some concepts do come to have universally accepted definitions, but this is not the issue in this assignment.)
- Indicators should be succinct, yet detailed in terms of age, sex, time period, and any other relevant elements (for example, "among males 15-24 within the past 12 months").
- The indicator should not specify the intended direction of change ("an increase in X; a decrease in Y"). Rather, it should measure the concept or factor you expect to change; for example: the percent of males 15-24 that do X.
- Correct: the percent of males 15-24 that used a condom at last casual sex.
- Incorrect: an increase in the percent of males 15-24 that used a condom at last casual sex.
- Regarding the data source:
- Two indicators may have the same data source.
- Only cite the DHS survey as the data source if you are sure that the indicator you want to measure is included in the standard DHS questionnaire. For example, if the DHS questionnaire does not routinely include questions about exposure to radio spots on knowledge of avian influenza, then it is not an appropriate source of data for this assignment. Instead, you should describe the type of data collection needed to produce this indicator.
- Be sure to indicate the data collection method or instrument used to obtain the information, not the secondary source where one can find it. For example, what type of data collection is needed to obtain a measure of mortality from avian influenza? (Don't just cite Ministry of Health statistics; where does the Ministry of Health get this information?)
- If no data source exists, indicate what type of data source will be needed; for example: household survey that includes questions on tobacco use among household members.
Assignment #3. Process Evaluation
Purpose of this assignment: to familiarize students with the importance of writing a process evaluation plan
Part A: Evaluation Matrix (one page)
- Your process evaluation design should measure the concepts of fidelity (e.g., the extent to which the activities/curriculum were implemented as planned); participant experience (e.g., satisfaction with activities/services); reach (e.g., the proportion of participants exposed to intervention activities); recruitment (e.g., the procedures used to recruit participants); and context (e.g., any barriers or events that influenced the intervention).
- Present your process evaluation plan in a table with the following columns:
- Question (only one question per component)
- Data sources – may provide more than one
- Analysis (how the data will be summarized)
Example (JUST AN EXAMPLE, DO NOT USE THIS FOR YOUR ASSIGNMENT!):
| Component | Question | Data sources | Analysis |
|---|---|---|---|
| Fidelity | To what extent was the curriculum implemented as planned? | Observations of education sessions; program log books; key informant interviews | Score based on checklist; summary description of themes |
| Participant experience | Were all the participants satisfied with the education sessions? | Questionnaires with participants; focus groups with participants | Themes identified; response frequencies summarized |
| Reach | What proportion of students participated in education sessions? | Survey among students; school roster | Response frequencies summarized; proportion calculated as # of students who attended sessions / total number of students |
| Recruitment | What types of advertisements were used to recruit individuals? | Key informant interviews; program log books; surveys among participants | Themes identified; response frequencies summarized |
| Context | What types of factors may have influenced the implementation of the project? | Key informant interviews; focus groups among participants | |
Part B: Description of Process Evaluation Design Strategy (one page)
- Since your key evaluation questions are already presented in the matrix, this section of the assignment should describe the purpose of the evaluation and the study design used to address your process evaluation questions.
- The design strategy should match the matrix, and should allow you to elaborate on the methods and data sources.
- Topics to cover for description:
- Purpose of process evaluation
- Methods to address key questions (i.e., how many focus groups and with whom; how many key informant interviews and how you will identify the informants); can be organized by component
Assignment #4. Impact Evaluation
Purpose of this assignment: to test your understanding of issues related to study design, including threats to validity, in relation to the evaluation of an actual field program.
Topics to cover:
- Purpose of impact evaluation (in general terms)
- Objectives of the selected program (reiterate from previous assignments)
- Proposed study design
- Name of the study design and briefly describe how you will apply it, including details on “who, what, when, where, and why.”
- Threats to validity (how does the proposed design control for selection, testing, and history? Explain your answer with at least one sentence per “threat.”)
- What are the advantages and shortcomings of this design in terms of methodology, feasibility, and ethical concerns?
Word limit: 1000 words or less
- Select the strongest design possible given the circumstances of the program.
- Regarding threats to validity: use your limited space to address the three that are asked for: selection, testing, and history. Others may apply but you don’t need to discuss them in this assignment.
Disability Support Services
If you are a student with a documented disability who requires an academic accommodation, please contact Betty H. Addison in the Office of Student Life Services: firstname.lastname@example.org, 410-955-3034, or 2017 E. Monument Street.