ANALYSING EVALUATION RESULTS - Class Notes
The more often measurements are taken, the less guesswork there is about what happened
between measurement intervals. More data points enable managers to track trends and
understand project, programme and policy dynamics. The more time that passes between
measurements, the greater the chance that events and changes in the system will be missed.
As time passes, the attribution of causality may also be lost, and questions about whether
indicators got better or worse become difficult to answer. For example, when a year has passed
without taking measurements, it becomes difficult to tell whether an indicator followed a
straight-line progression or moved in a wave.
Evaluation Goals
When analyzing data (whether from questionnaires, interviews, focus groups, or any other
instruments), it is necessary to start from a review of the evaluation goals, i.e., the reason for
undertaking the evaluation in the first place. Initial expectations about the results should be
made explicit: when the goals are known, the results are far less likely to come as a surprise.
Knowing the goals also helps organize the data and focus the analysis. Below are some examples:
If the goal was to improve the programme by identifying its strengths and weaknesses, the
data can be organized into programme strengths, weaknesses and suggestions to improve
it.
If the goal was to fully understand how the programme works, the data could be organized
in the chronological order in which clients go through the programme.
If an outcomes-based evaluation is being conducted, data can be categorized according to
the indicators for each outcome, as sketched below.
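To make the outcomes-based example concrete, a minimal Python sketch follows; the outcome indicators and values are hypothetical, invented only for illustration.

# Group collected responses by the outcome indicator they measure,
# so each indicator can later be analyzed against its outcome.
from collections import defaultdict

responses = [
    {"indicator": "attendance_rate", "value": 0.92},
    {"indicator": "test_score_gain", "value": 12},
    {"indicator": "attendance_rate", "value": 0.88},
]

by_indicator = defaultdict(list)
for record in responses:
    by_indicator[record["indicator"]].append(record["value"])

for indicator, values in by_indicator.items():
    print(indicator, values)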
Type of Analysis
There are many different types of analyses that vary in complexity. To help determine the type of
analysis to choose, the following can be considered:
Data Cleaning
The purpose of data cleaning is to determine whether the responses to questions and processes are
accurate and of the required quality. Data related to specific project objectives will need to be
examined separately. Ideally, there should be more than one data source for each objective; this
helps to verify and validate the correctness of the data. Questions about whether any information is
missing, misrepresented or inconsistent have to be answered.
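As one illustration of such checks, the following Python sketch flags missing and out-of-range responses in a small, invented survey extract; it assumes the pandas library, though any tabular tool would serve.

# Flag missing and inconsistent values before any analysis begins.
import pandas as pd

df = pd.DataFrame({
    "age": [34, None, 27, 29, 210],       # None is missing; 210 is implausible
    "satisfaction": [4, 5, 7, 3, 2],      # 7 falls outside the 1-5 scale
})

missing = df.isna().sum()                              # missing values per column
implausible_age = df[(df["age"] < 0) | (df["age"] > 120)]
out_of_range = df[~df["satisfaction"].between(1, 5)]

print("Missing per column:\n", missing)
print("Implausible ages:\n", implausible_age)
print("Out-of-range satisfaction scores:\n", out_of_range)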
Quantitative Analysis
This is the analysis of numbers. Quantitative data are best presented in the form of figures, such
as graphs and charts. Sources of quantitative data include surveys, sign-in sheets, event forms,
interviews, census data, budget data, etc. Quantitative data analysis can consist of simple
calculations yielding factual information on attendance, usage, changes in performance, or changes
in knowledge or attitudes.
Descriptive Statistics
The process of quantitative data analysis can begin with descriptive analyses of data. These
analyses are called descriptive because they allow summarization of large amounts of information.
Descriptive statistics include frequencies (counts), percentages, measures of central tendency (e.g.,
mean, median, and mode), measures of variability (e.g., range and standard deviation), and ranks.
In many cases, descriptive statistics will be sufficient to answer most stakeholders’ questions.
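A brief sketch of these descriptive measures, using only Python's standard library; the scores are invented for illustration.

from statistics import mean, median, mode, stdev
from collections import Counter

scores = [55, 60, 60, 72, 80, 85, 90]

print("n:", len(scores))
print("frequencies:", Counter(scores))            # counts per score
print("mean:", mean(scores))
print("median:", median(scores))
print("mode:", mode(scores))
print("range:", max(scores) - min(scores))
print("standard deviation:", round(stdev(scores), 2))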
Inferential Statistics
After conducting descriptive analyses, more complex inferential analyses may also be conducted.
These analyses include testing for significant differences. For example, one can test whether
participants in the programme scored higher on an examination than individuals in a control group.
These analyses can also be broken down by participant characteristics such as gender; a question
like “Were female participants more likely to answer the questions correctly than male
participants?” can be answered with inferential statistics.
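A sketch of such a test follows, using a two-sample t-test from SciPy (an assumed dependency); the programme and control scores are invented for illustration.

from scipy import stats

programme = [78, 85, 90, 72, 88, 81, 79]   # examination scores, participants
control = [70, 68, 75, 72, 66, 74, 71]     # examination scores, control group

t_stat, p_value = stats.ttest_ind(programme, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A p-value below a chosen threshold (commonly 0.05) suggests the difference
# in mean scores is unlikely to be due to chance alone.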
Qualitative Analysis
This is the analysis of words and pictures. Qualitative data are best presented in narrative form.
Sources of qualitative data include observation notes, records, document reviews, content analysis
(e.g., of videos or youth media), interviews and focus group notes.
Qualitative data sets are typically large, complicated, and often messy to organize and analyze. In
qualitative analysis, there are fewer rules and standard procedures to guide the process than in
quantitative analysis. However, good qualitative analysis, like good quantitative analysis, must be
systematic.
The qualitative analysis most likely to be conducted for an evaluation is what is called content
analysis. Content analysis involves systematically analyzing the content of data, breaking it into
meaningful pieces, and organizing those pieces in a way that allows their characteristics and
meaning to be better understood.
Qualitative data analysis can include identifying themes in the data (a process called coding).
Themes can be framed around the key evaluation questions or other sources.
This analysis can also include creating a story from the data that uses descriptive details of
behaviours and representative quotes from those who were interviewed.
Qualitative analysis generally involves three steps:
1. Data reduction: This step involves selecting, focusing, condensing, and transforming data.
The process should be guided by thinking about which data best answer the evaluation
questions.
2. Data display: This involves creating an organized, compressed way of arranging data (such
as through a diagram, chart, matrix, or text). The display should help facilitate identifying
themes, patterns, and connections that help answer the evaluation questions. This step
usually involves coding, where passages of text (or parts of images, etc.) that have the same
message or are connected in some way are marked, and an accompanying explanation of
what the selected passages have in common is written (see the sketch after this list).
3. Drawing conclusions and verification: During this last step, revisit the data many times to
verify, test, or confirm the themes and patterns you have identified.
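As a minimal illustration of data reduction, coding, and display, the Python sketch below takes a crude keyword-based first pass at coding passages into themes; the themes, keywords and passages are all hypothetical, and real coding is an interpretive judgment that such matching can only start, not replace.

from collections import defaultdict

# Hypothetical coding frame: theme -> keywords that suggest it.
theme_keywords = {
    "access": ["transport", "distance", "schedule"],
    "staff_support": ["mentor", "tutor", "staff"],
}

passages = [
    "The tutor helped me catch up after I missed a week.",
    "Transport to the venue was the hardest part.",
    "Staff always followed up when I was absent.",
]

# Data display: an organized arrangement of theme -> supporting passages.
coded = defaultdict(list)
for passage in passages:
    text = passage.lower()
    for theme, keywords in theme_keywords.items():
        if any(kw in text for kw in keywords):
            coded[theme].append(passage)

for theme, quotes in coded.items():
    print(theme, "->", quotes)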
Qualitative analysis is a cyclical and iterative process, with many rounds of investigating evidence,
modifying hypotheses, and revisiting the data in a new light. Data will need to be re-examined
repeatedly as new questions and connections emerge and as a more thorough understanding of the
information collected is gained.
Throughout the process of examining and re-examining data, consideration should be given to the
extent to which emerging patterns suggest that additional data may need to be collected.
Interpreting Results
Interpreting the data can be done in the context of answering questions such as: What implications
do the results have for identifying how the programme can improve?
When findings are prepared, both positive and unexpected or negative results should be included.
Positive results show where the programme’s strengths are, motivate staff and other programme
stakeholders, and identify programme areas that might be expanded.
Unexpected or negative results are crucial to framing recommendations and modifying practices.
They can also be part of an argument for expanded funding or programming (e.g., increases in
staffing or expansion of facilities).
Other questions to consider throughout analysis and interpretation include:
Are the goals for data analysis realistic, given the resources available to the programme
by way of budget and staff commitments?
Are all possible data sources being drawn upon in order to develop findings?
Is there an effort to identify programme weaknesses and strengths?
Are findings and recommendations framed in such a way that they can be useful for
programme improvement?
Have efforts been made to involve programme stakeholders?
REPORTING EVALUATION RESULTS
Introduction
Evaluation findings have to be reported in a way that will be useful to programme stakeholders,
both primary and secondary. The following will need to be taken into consideration when
preparing the evaluation report:
1. The level and scope of content depend on whom the report is intended for, e.g., clients,
sponsors (or financiers), employees, customers, the public, etc.
2. Be sure the project team has a chance to carefully review and discuss the report before
releasing it to external parties. Translate recommendations into action plans, including
who is going to do what and by when.
It is important for the evaluator to deliberately identify those who can make use of the evaluation
results. These include people in the supervision team, administration, other programme participants,
advisory groups and sponsors of the programme, both private and public. It should be noted that
members of the public are important stakeholders if the programme is supported by public funds.
Civil society, lobbyists and aides to public policy and decision makers are also key people to
keep informed of the results.
The evaluation report should contain an executive summary; description of the organization and
the project/programme under evaluation; explanation of the evaluation goals, data collection
methods, and analysis procedures; listing of conclusions and recommendations; and relevant
attachments, e.g., evaluation questionnaires, interview guides, etc.
The following are major points to be considered in writing the evaluation report:
(iv) Method
This section might include: the population studied; the survey instrument; data collection
procedures; and data analysis. Usually this is a brief section. It could also include the
evaluation design.
(v) Results
Display and discuss what you found. You may want to include "typical" quotes from respondents.
Put most tables in an appendix so that the body of the report is not clogged with detail.
(vi) Conclusions
Based upon the evidence collected, what conclusions can be drawn from the data? This
section contains the evaluator’s judgments about the results of the programme. The results should
speak to the concerns raised by the identified problem. Focus on the objectives
of the evaluation. Deal with unanticipated results if they are significant. Evidence must be
included in the report to support the conclusions. It is important to let team members read and
critique the conclusions, i.e., colleagues involved in the evaluation should agree on the conclusions.
Evaluation results are generally communicated either orally or in a written report. Whichever
method is chosen, there are several factors that should be considered before the report is
prepared. Owing to limitations of time and other resources, the report should be precise and must
not include irrelevant material. The conclusions and recommendations should enable key decision
makers to decide without having to rely on other references. It is therefore important
to have a good understanding of reporting.
General guidelines that should be considered when reporting evaluation results are indicated
below:
(i) Audience
There is a need to learn as much as possible about the audience. When reporting, it is
important to assume that the reader does not have the background of the project; hence the
need to briefly describe it within the report. The education level and cultural background of
those interested in the report are also important. Further, the audience’s professional
background or occupation is key to enhancing the understandability of the report.
(ii) Credibility
Data should be objective and should tell what was found.
(iii) Presentation
It is necessary to use good speaking and writing skills. The report should be organized in a
format that is easy to follow.
(iv) Data
Present data in a way that the audience can understand what was found.
Reporting with formal documents is the most widely used means to communicate evaluation
results. Other methods of communicating evaluation results in writing include the use of
newspapers and magazines. For example, one might utilize a newspaper column to communicate
to the general public the success of a local programme. An exhibit using posters, with written
information and pictures revealing before and after conditions, is also an excellent way to report
evaluation results.
When you report evaluation results orally, "do what comes naturally". Be at ease. This implies that
you know your audience. Practice your presentation and make it interesting by doing a variety of
things other than merely talking. For example, use slides, transparencies, and role playing. Get
your audience involved by letting them ask questions and predict results. Oral reporting may be
either face-to-face or presented on radio and TV. Regardless of where the oral report is presented,
be certain you are prepared. The following suggestions are presented regarding the oral report.