Evaluation Meta WHH - Inception Report Final (14 06 2017)
to Welthungerhilfe
Independent Consultant
Bielefeld, 19.05.2017
Table of Contents
1. Introduction 3
1.1 Background 3
1.2 Purpose 3
1.3 Evaluation subject / scope 4
1.4 Feedback on the Terms of Reference 5
2. Methodology 5
2.1 Methodological concept 5
2.2 Methods 6
2.2.1 Desk review / text analysis 6
2.2.2 Content analysis 7
2.2.3 Mail survey 7
2.2.4 Interviews 7
2.2.5 Learning event 8
2.3 Methodological limitations 8
3 The work plan 9
4 Roles and responsibilities 9
Annex 1: Evaluation Matrix 11
Annex 2: Assessment Checklist Evaluation Report Quality 22
Annex 3: Individual Checklist Evaluation Report Quality 28
Annex 4: Questionnaire for Mail Survey 30
1. Introduction
1.1 Background
With the Welthungerhilfe decentralisation process, Country Offices (COs) are
strengthened and empowered to improve programme quality and to intensify relations
with national stakeholders and donors on the ground.

Consequently, M&E responsibilities were shifted from Welthungerhilfe Head Office to
the COs. This resulted in an increasing number of project evaluations being
commissioned and managed directly by the COs. The Head Office Monitoring,
Evaluation, Learning and Accountability (MELA) team is to focus on the
commissioning and management of strategic evaluations that respond to the
decision-making information needs mainly of the Executive Director Programmes and
the Board of Directors. However, during a transitional phase (2015-2017), MELA
continues to commission project evaluations in exceptional cases.
With the ongoing decentralisation of the project evaluation processes, the roles and
responsibilities of the staff responsible for evaluation at Head Office (HO) level have
changed. Staff are, inter alia, expected to develop standards for evaluations, to
provide supporting material and advisory services related to evaluations to COs and
projects, and to monitor the quality of evaluations. This is to ensure that
Welthungerhilfe evaluations adhere to international standards for quality evaluations.
In 2016, Welthungerhilfe conducted a total of 30 project evaluations. Out of these 30,
22 were commissioned and managed by the COs (decentralised evaluations), while 8
were commissioned and managed by the MELA HO team (centralised evaluations).
Meta-evaluations, i.e. evaluations of the project evaluations, have been carried out by
Welthungerhilfe since 2015 to gain insights into the quality of evaluation reports and to
identify recurring patterns in findings and recommendations. International evaluation
standards encourage meta-evaluations as an integrated element of a quality
evaluation system.
1.2 Purpose
The overall objective of this meta-evaluation and -analysis is to provide a learning
opportunity for the MELA team and MELA focal points in the COs to improve the
Welthungerhilfe evaluation system, instruments and practices.
There are several specific objectives of the evaluation and analysis:
- To provide structured external feedback on the quality of evaluation reports as
  well as the quality of the evaluation process.
- To contribute to improving the advisory services and information offered by the
  MELA headquarters team to the MELA focal points and other project staff
  involved in commissioning and managing evaluations.
- To provide information on good practices in evaluation that will help MELA focal
  points to improve their evaluation practices.
- To document recurring evaluation findings and recommendations from
  evaluation reports in order to facilitate organisational learning and strategic
  decision-making.
- To reflect on opportunities and limitations of meta-evaluations and -analyses as
  part of the Welthungerhilfe evaluation system. This will enable MELA to rethink
  the existing evaluation system and its instruments with regard to the extent to
  which it serves the information and learning needs of the organisation.
1 The German Evaluation Association DeGEval has defined four evaluation standards: Accuracy, Utility,
Feasibility and Fairness. Each of the four standards is detailed and described by a subset of criteria.
1.4 Feedback on the Terms of Reference
The ToR provide clear objectives for the meta-evaluation and -analysis. Overall, the
evaluation questions provide a sound basis for designing the evaluation process.
However, some evaluation questions cannot be answered with the available
information and with the limited resources of the evaluation. The evaluation matrix (see
annex 1) therefore gives some suggestions for adapting the ToR. As the ToR do not
address the question of gender in the evaluation, the evaluator suggests adding this
aspect to the analysis.
The objectives and questions cover a wide range of topics. This requires a broad
investigation approach and means that not all topics can be explored in great detail.
It will, however, be possible to provide information on all topics and to produce all
stated deliverables.
2. Methodology
In the following, the proposed methodology is outlined, following the logical chain of
the evaluation process. After introducing the general underlying methodological
concept for the meta-evaluation and -analysis, the methods and major activities to be
used during the inception, exploratory and synthesis phases are presented.
An evaluation matrix detailing the methods to be used for answering the evaluation
questions is attached (see annex 1).
At the end of the meta-evaluation and -analysis, the evaluator will provide
Welthungerhilfe with all raw data that is not related to individuals (e.g. the matrix with
all recommendations grouped by thematic areas). This will enable Welthungerhilfe to
carry out further analyses on topics of specific interest that are not covered in the
framework of this meta-evaluation and -analysis.
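The grouping of recommendations by thematic areas mentioned above can be sketched as a simple keyword-based coding step. The thematic areas and keywords below are purely illustrative assumptions; the actual coding frame would be developed from the 30 evaluation reports themselves.

```python
from collections import defaultdict

# Hypothetical thematic areas and keywords -- illustrative only; the real
# coding frame would be derived from the evaluation reports.
THEMES = {
    "M&E": ["monitoring", "indicator", "baseline"],
    "Sustainability": ["exit strategy", "handover", "sustainability"],
    "Partnership": ["partner", "stakeholder"],
}

def group_recommendations(recommendations):
    """Assign each recommendation text to every thematic area whose
    keywords it mentions; texts matching no theme go to 'Other'."""
    matrix = defaultdict(list)
    for text in recommendations:
        lowered = text.lower()
        hits = [theme for theme, words in THEMES.items()
                if any(word in lowered for word in words)]
        for theme in hits or ["Other"]:
            matrix[theme].append(text)
    return dict(matrix)
```

A recommendation can land in several thematic areas at once, which mirrors the intended raw-data matrix of recommendations grouped by theme.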
2.2 Methods
In phase 1 (inception) of the meta-evaluation and -analysis, the evaluator prepared for
the assignment by reviewing the existing documents, clarifying the ToR and discussing
different options for designing the process during a briefing meeting with the MELA HO
evaluation officers, and by elaborating the inception report that presents the approach
and methodology for the meta-evaluation and -analysis in more detail.
Phase 2 will focus on data collection and exploration. During this phase, the major data
collection tools – desk review / text analysis, content analysis, interviews, mail survey
– will be implemented.
In phase 3 (synthesis phase), the evaluator will triangulate and analyse the results of
the different assessments and bring them together in a comprehensive evaluation
report. The report presents findings, conclusions and recommendations. It contains as
an annex a management response. A feedback loop is foreseen until the final version
is accepted by Welthungerhilfe. The synthesis phase will also be used to prepare the
additional deliverables. “Good Evaluation Practices” will document good evaluation
practice examples from the 30 evaluation reports as an orientation for MELA focal
points. “Scenarios for a Meta-Evaluation/-Analysis” will present findings and
recommendations directed to the question of how meta-evaluations and -analysis
could complement the Welthungerhilfe evaluation system.
Phase 4 addresses the dissemination of results and the learning the meta-evaluation
and -analysis can provide for the organisation. It includes a debriefing meeting with
MELA HO staff and a learning event for Welthungerhilfe HO staff.
The major methods used are presented in more detail in the following sub-chapters.
2.2.4 Interviews
The more quantitative methods described above will be complemented by interviews
with selected relevant stakeholders (e.g. MELA team, MELA focal points, Country
Directors, Board of Directors / Executive Director, evaluators). Interviews are
particularly relevant for providing insights on the quality of the evaluation process and
on learning through evaluations at different levels. They will also be a major source
for the methodological reflection on the meta-evaluation and -analysis as part of the
Welthungerhilfe system. As stakeholder groups are very diverse, as well as the topics
they can respond to, the different questionnaires for guiding the semi-structured
interviews will be drafted during the evaluation process for each stakeholder group.
2.2.5 Learning event
To make sure that the results of the meta-evaluation and -analysis are used for
organisational learning, a learning event at Welthungerhilfe HO will be conducted. The
first part of the learning event will consist of a presentation and discussion of results.
The second part will be designed using elements of the Open Space Method. Open
Space is a creative method for working with larger groups. Participants will decide by
themselves which topics they want to continue discussing in small working groups.
3 The work plan
After the approval of the inception report, the evaluator will continue to implement the
work plan.
[Work plan table (calendar weeks 18-28): desk review, selected interviews, draft report]
- Providing the contact details of the persons to be contacted during the
  evaluation
- Providing timely feedback on the deliverables
- Informing the stakeholders about the meta-evaluation and inviting them to
  participate in the survey, the interviews and the learning event
- Preparing the technical requirements and the facilitation materials for the
  learning event
Annex 1: Evaluation Matrix
Columns: Key areas | Evaluation questions and sub-questions | Assessment criteria / fields of observation (FoO) | Methods of data collection and analysis | Sources of information

(continued from the previous row) …taking into account the framework conditions (e.g. resources for the evaluation, answering the specific evaluation questions, characteristics of the project or the target group)
Key area: Accuracy
Question 6b.) The report describes the data sources, the rationale for their selection, and their limitations
FoO: Methodology
- The report describes the data sources
- The report describes the rationale for the selection of the data sources
Note: It is recommended to rather assess if the limitations of the chosen methods are discussed (not the limitations of the sources).
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 6c.) The report includes a discussion on how the mix of data sources was used to obtain a diversity of perspectives, ensure data accuracy and overcome data limits
FoO: Methodology
- A triangulation of methods is applied (using different methods to obtain findings)
- A triangulation of data is applied (using different sources, including the views of different stakeholders)
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 6d.) The report describes the sampling frame, rationale for selection, mechanics of selection, numbers of selected potential subjects, and limitations of the sample
FoO: Methodology
- The report describes the sample frame
- The report describes the rationale for selecting the sample
- The report describes the numbers of the selected sample
- The report describes the limitations of the sample
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 6e.) The report presents evidence that adequate measures were taken to ensure data quality, including evidence supporting the reliability and validity of data collection tools (e.g. interview protocols, observation tools, etc.)
Note: This question is rather vague. To complement the sub-questions on evaluation methodology, the evaluator proposes to use another question instead: The report explains to what extent existing data (e.g. from baselines, partners, the monitoring system) was included in the assessment.
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 7.) To what extent do the evaluation reports give a complete description of the stakeholders' participation process in the evaluation, including the rationale for selecting the particular level and activities for participation?
FoO: Methodology
- The report explains how the evaluation design addressed the participation of stakeholders in the evaluation process
- The evaluation report states at least one reason for the level of participation selected
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 8.) To what extent do findings respond directly to the evaluation criteria and questions detailed in the objectives section of the report and are based on evidence derived from data collection and analysis methods described in the methodology section of the report? Specifically…
FoO: Findings / DAC evaluation criteria
Note: This question is too broad and very difficult to operationalise. The evaluator proposes to specify the question much more, also in order to be able to cover the different evaluation criteria, which form a core part of the assessment:
Relevance:
- In the relevance chapter, the report discusses to what extent the activities and outputs of the programme are consistent with the overall goal
Note: It would be good to have a second and third criterion on relevance, but the explanations of what should be covered under relevance are too different in the template explanations of centralised and decentralised evaluations to come up with a strong second criterion that could be valid for both.
Effectiveness:
- The evaluation report presents findings on the extent to which objectives are being achieved / have been achieved
- The evaluation report refers to the logframe indicators to assess the effectiveness of the project, or explains why the indicators were not used
- The report explains which factors influenced the achievement or non-achievement of the objectives
Efficiency:
- The report allocates costs to outputs for at least one example
- The report discusses whether outputs or activities could have been implemented with fewer resources
- The report discusses the advantages and disadvantages of different options for the use of resources or explains why different options cannot be considered
Impact:
- The report reflects on the existence of evidence for impact
- The report assesses the plausibility of the project contributing to long-term change
- The report reflects on unintended (positive or negative) outcomes
Sustainability:
- The report reflects on the existence and quality of exit strategies to increase sustainability
- The report reflects on the likely challenges for sustainability
- The report reflects on the likely degree of sustainability for at least two activities / outputs / outcomes
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy
Question 9.) To what extent do conclusions present reasonable judgements based on findings and substantiated by evidence, and provide insights pertinent to the purpose of the evaluation?
FoO: Quality of the triad findings – conclusions – recommendations
- The report differentiates between findings/analysis, conclusions and recommendations
- The report bases at least 80% of its conclusions on findings
- At least two of the conclusions relate to the objectives of the evaluation as stated in the chapter on the evaluation purpose
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Accuracy / Utility
Question 10.) To what extent do recommendations clearly identify the target group for each recommendation; clearly state priorities; are actionable and reflect an understanding of the commissioning organization and potential constraints to follow-up?
FoO: Recommendations
- The report states recommendations, i.e. it does provide advice on what should be done to improve project performance / the achievement of objectives
- There is a clear link of recommendations to findings / conclusions
- The recommendations are targeted at different actors
- The recommendations are SMART (i.e. address specific actors, prioritised, realistic)
- The number of recommendations is adequate (not too broad and not too detailed; the adequate number to be specified with MELA before using the assessment checklist)
Note: Whether the recommendations are feasible and reflect an understanding of the commissioning organization and its potential constraints to follow-up will be explored through the mail survey and interviews.
Methods: Text analysis; use of assessment checklist; mail survey; interviews
Sources: Evaluation reports; CD, MELA focal points, HoP
Key area: Fairness
Question 11.) To what extent do the reports generally document the impartial and unbiased position of the evaluator?
Note: Should be removed, as this is impossible to judge from the reports. It could, however, be rephrased and put under process quality as: Do the evaluated perceive the evaluators as impartial and unbiased?
Methods: Mail survey; interviews
Sources: HoP, to a lesser degree CD and MELA focal points
Key area: Accuracy
Question 12a.) Do the reports comply with the Welthungerhilfe reporting requirements as laid down in the (standard) ToRs?
FoO: Overall reporting quality / completeness
Note: As the assessment checklist has been developed largely along the standard ToR, this question will be answered by aggregating the results of the specific questions. The classification of reports as excellent / good / rather satisfying, etc. will have to be discussed with MELA.
Methods: Text analysis; use of assessment checklist
Sources: Evaluation reports
Key area: Process quality / Feasibility
Question 12b.) Do the existing Welthungerhilfe formats, templates and materials suffice to ensure good quality evaluation reports? What should be included in the future?
FoO: MELA support quality and requirements
Note: This sub-question is almost identical with b10 under process quality. Remove it here and add "What should be included in the future?" to b10.
Methods: See b10
Sources: See b10
Key area: Accuracy
Question 13.) Are there major differences with regard to the reporting quality between project evaluations commissioned by the MELA team and Country Offices?
FoO: Comparison of centralised and decentralised evaluations
Methods: Text analysis; use of assessment checklist; data comparison between centralised and decentralised evaluations
Sources: Evaluation reports
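The data comparison named for question 13 could be as simple as averaging total checklist scores per commissioning mode. The data layout below (report IDs mapped to mode and score) is a hypothetical illustration, not the actual data structure that will be used.

```python
from statistics import mean

def compare_groups(scores):
    """Average total checklist scores (max 60 points) per commissioning mode.

    `scores` maps report IDs to (mode, points) tuples, where mode is
    'centralised' or 'decentralised' -- an assumed, illustrative layout.
    """
    by_mode = {}
    for mode, points in scores.values():
        by_mode.setdefault(mode, []).append(points)
    # Round to one decimal for a readable side-by-side comparison
    return {mode: round(mean(vals), 1) for mode, vals in by_mode.items()}
```

With only 8 centralised and 22 decentralised reports, such averages should be read as descriptive, not as statistically significant differences.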
Key area: Accuracy
Question 14.) What are the main influencing factors on the reporting quality of evaluation reports?
FoO: Factors influencing reporting quality
Note: It will not be possible to explore this question based on empirical evidence, as many factors that could potentially influence the quality of reports (e.g. experience and technical expertise of evaluators, resources dedicated to the evaluation) cannot be collected in the framework of the meta-evaluation. Therefore, the focus will be on exploring the opinions / impressions of stakeholders.
Methods: Mail survey; interviews
Sources: All stakeholders
b.) Process quality (Utility, Feasibility, Fairness)
Key area: Fairness
Question 1.) To what extent were obligations of the formal parties to the evaluations (what is to be done, how, by whom, when) agreed to in writing, so that these parties are obliged to adhere to all conditions of the agreement or to renegotiate it?
FoO: Clarity of division of tasks and responsibilities
Methods: Mail survey; interviews
Sources: MELA; HoP? or CD? (to be clarified)
Key area: Utility
Question 2.) Did the evaluations ensure that interests of the persons or groups involved in or affected by the evaluation were identified, so that their interests can be clarified and taken into consideration when designing the evaluation?
FoO: Involvement of different stakeholders in the preparation of the evaluation and its design
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Utility
Question 3.) To what extent did the evaluations ensure that the evaluation is guided by both the clarified purposes of the evaluation and the information needs of its intended users?
FoO: Results orientation of the evaluation
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Utility
Question 4.) To what extent were the person(s) conducting an evaluation trustworthy as well as methodologically and professionally competent, so that the evaluation findings achieve maximum credibility and acceptance?
FoO: Competences of evaluators (technical and methodological)
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Utility
Question 5.) To what extent were the evaluations initiated and completed in a timely fashion, so that their findings can inform pending decision and improvement processes?
FoO: Timing of the evaluation (too late to take into account for strategic decisions or the design of a successor project; too early to measure what needed to be measured)
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Feasibility
Question 6.) Is the time dedicated to the evaluations sufficient for the evaluand?
FoO: Efficiency (enough time to carry out a thorough and professional evaluation; adequate cost/benefit relation)
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Utility
Question 7.) To what extent were the evaluations planned, conducted and reported in ways that encouraged the acceptance and ultimately the utilisation of the evaluation findings?
FoO: Acceptance of evaluation findings / willingness to utilise findings
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Key area: Fairness
Question 8.) Did all stakeholders, to the extent possible, have access to the evaluation findings?
FoO: Transparency / dissemination of evaluation results
Methods: Mail survey; interviews
Sources: MELA, HoP, CD
Question 9.) Are there major differences with regard to the process quality between project evaluations commissioned by the MELA team and Country Offices?
FoO: Comparison of centralised and decentralised evaluations
Methods: Data comparison between centralised and decentralised evaluations (summary of results from mail survey and interviews)
Sources: MELA, HoP, CD
Question 10.) Do the existing Welthungerhilfe formats, templates, materials and advisory services suffice to ensure good quality evaluation processes?
FoO: Quality / completeness / utility of supporting material and advisory services
Note: Here, complete with: What should be included in the future?
Methods: Mail survey; interviews
Sources: All stakeholders
Question 11.) What are the main influencing factors on the quality of the evaluation process?
FoO: Factors influencing process quality
Note: It will not be possible to explore this question based on empirical evidence, as many factors that could potentially influence the quality of the process (extent to which all stakeholders had access to findings, timeliness for decision-making) cannot be collected in the framework of the meta-evaluation. Therefore, the focus will be on exploring the opinions / impressions of stakeholders.
Methods: Mail survey; interviews
Sources: All stakeholders
c.) Meta-Analysis: Patterns of recurring findings and recommendations
Question 12.) Which patterns of recurring findings and recommendations could be relevant beyond the project context and thus bear a learning potential for Welthungerhilfe as an organisation and can serve as a basis for (strategic) decision-making?
FoO: Contents of recommendations; strategic learning / decision-making interests
Methods: Text analysis; content analysis (following the Mayring method)
Sources: Evaluation reports
d.) Methodological Reflection: Meta-Evaluation / -Analysis as part of the Welthungerhilfe evaluation system
Question 13.) Can the meta-evaluation (conceptualised as is; with the current reporting quality and quantity) generate reliable data on trends and patterns that allow for organisational learning and (strategic) decision-making? If not, which preconditions would be needed to allow for data on trends and patterns?
FoO: Résumé, suggestions / recommendations for future meta-evaluations; critical discussion / reflection on the process and the results, potentials and limitations
Methods: Reflection on the meta-evaluation process and results
Sources: Evaluator, MELA HQ
Annex 2: Assessment Checklist Evaluation Report Quality
(based on the DeGEval standards, the Welthungerhilfe ToR of April 2017, and the old
Welthungerhilfe assessment checklist)
Section methodology:
- Fully fulfilled / to a great extent: 15-16 points
- Mainly fulfilled: 13-14 points
- Rather unsatisfying: 11-12 points
- Unsatisfying: fewer than 11 points
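Mechanically, this banding maps a methodology-section score (16 achievable points across the yes/no items) to a verbal rating. A minimal sketch, assuming the section maximum of 16 points:

```python
def classify_methodology_score(points: int) -> str:
    """Map a methodology-section score (0-16 points) to the verbal
    rating bands defined in the assessment checklist."""
    if not 0 <= points <= 16:
        raise ValueError("methodology section scores range from 0 to 16")
    if points >= 15:
        return "Fully fulfilled / to a great extent"
    if points >= 13:
        return "Mainly fulfilled"
    if points >= 11:
        return "Rather unsatisfying"
    return "Unsatisfying"
```

Analogous bands for the other checklist sections (and for the 60-point total) would still have to be agreed with MELA, as noted under question 12a.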
2.1 An inception report or minutes of the kick-off meeting replacing the inception report outline the methodology to be applied (additional)
  2.1.1 The report indicates that an inception report / minutes of the kick-off meeting has documented the methodology to be applied (yes: 1, no: 0)
  Maximum points: 1
2.2 The report describes the data collection methods and analysis, and the rationale for selecting them (EQ 6a)
  2.3.1 The report describes the data collection methods (yes: 1, no: 0)
  2.3.2 The report describes the rationale for selecting the data collection methods (yes: 1, no: 0)
  2.3.3 It is plausible that the data collection methods have been chosen taking into account the framework conditions (e.g. resources for the evaluation, answering the specific evaluation questions, characteristics of the project or the target group) (yes: 1, no: 0)
  2.3.4 The report describes the limitations of the chosen methods (yes: 1, no: 0)
  Maximum points: 4
2.3 The report describes the data sources and the rationale for their selection (EQ 6b)
  2.4.1 The report describes the data sources (yes: 1, no: 0)
  2.4.2 The report describes the rationale for the selection of the data sources (yes: 1, no: 0)
  Maximum points: 2
2.4 The report includes a discussion on how the mix of data sources was used to obtain a diversity of perspectives, ensure data accuracy and overcome data limits (EQ 6c)
  2.5.1 The report applies a triangulation of methods (using at least two different methods to obtain findings) (yes: 1, no: 0)
  2.5.2 The report applies a triangulation of data (using at least two different sources and including the views of different stakeholder groups) (yes: 1, no: 0)
  Maximum points: 2
2.5 The report describes the sampling for the evaluation process (EQ 6d)
  2.6.1 The report describes the sample frame (yes: 1, no: 0)
  2.6.2 The report describes the rationale for selecting the sample (yes: 1, no: 0)
  2.6.3 The report describes the numbers of the selected sample (yes: 1, no: 0)
  2.6.4 The report describes the limitations of the sample (yes: 1, no: 0)
  Maximum points: 4
2.6 The report explains whether the evaluation has avoided duplication in data collection by relying as far as possible on existing data
  2.7.1 The report explains to what extent existing data (e.g. from baselines, partners, the monitoring system) was included in the assessment (yes: 1, no: 0)
  Maximum points: 1
2.7 The report explains whether the evaluation was designed as a participatory process (EQ 7)
  2.8.1 The report explains how the evaluation design addressed the participation of stakeholders in the evaluation process (yes: 1, no: 0)
  2.8.2 The evaluation report states at least one reason for the level of participation selected (yes: 1, no: 0)
  Maximum points: 2
3 Analysis along DAC criteria (EQ 8) (maximum points achievable: 19)
(3.5, continued)
  3.5.2 The report reflects on the likely challenges for sustainability (yes: 1, no: 0)
  3.5.3 The report reflects on the likely degree of sustainability for at least two activities / outputs / outcomes (yes: 1, no: 0)
  Maximum points: 3
3.6 The report bases findings on evidence
  3.6.1 The report relates findings to evidence derived from data collection and analysis methods (yes: 1, no: 0)
4 Quality of conclusions and recommendations (maximum points achievable: 16)
  4.3.2 There is a clear link of recommendations to findings / conclusions
  4.3.3 The recommendations are targeted at different actors
  4.3.4 The recommendations are SMART (i.e. address specific actors, prioritised, realistic)
  4.3.5 The number of recommendations is adequate (not less than 5, not more than 20)
Total report quality: maximum points achievable: 60
Annex 3: Individual Checklist Evaluation Report Quality
Columns: Criterion / Quality indicator | Grading / Rating | Comments
Project Title:
Project Number:
Evaluator(s):
Responsible for WHH:
Date:
1 Overall quality (X of 9)
1.1 The structure of the report is clear and coherent
1.2 The executive summary is a stand-alone section presenting the main information of the evaluation
1.3 The length of the report is adequate to cover the major aspects of the evaluation and at the same time be economic to read
1.4 The report includes an assessment of how the project addresses gender issues and how women / men benefit from project interventions
Annex 4: Questionnaire for Mail Survey
Type of evaluation:
Final evaluation: □
Mid-term evaluation: □
Other: □ If other, please specify:
Not sure: □
2.) The recommendations of the evaluation have been useful for the project.
I agree I rather agree I rather disagree I disagree
3.) The recommendations have been feasible (e.g. reflected a good understanding
of Welthungerhilfe and its potentials and constraints to follow-up on
recommendations).
I agree I rather agree I rather disagree I disagree
4.) Some of the recommendations could be useful beyond the project for
Welthungerhilfe as an organisation (e.g. for the strategy, for similar projects).
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify:
6.) How did you use the recommendations? (Several answers possible)
2.) The evaluator(s) were technically competent for the evaluation topic.
I agree I rather agree I rather disagree I disagree
3.) The evaluator(s) were methodologically competent for carrying out the
evaluation.
I agree I rather agree I rather disagree I disagree
6.) The evaluation was carried out at the right time to fulfil its objectives.
I agree I rather agree I rather disagree I disagree
7.) Key stakeholders (e.g. partners, donors, target groups) were adequately involved
in preparing for the evaluation.
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify (e.g. how, why not):
8.) Key stakeholders (e.g. partners, donors, target groups) were adequately involved
in the entire evaluation process.
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify (e.g. how, why not):
9.) The Terms of Reference / the contract with the evaluator(s) clearly stated the
responsibilities and the obligations of all parties involved (who was to do what
and when).
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify (e.g. how, why not):
10.) The support provided by headquarters and/or the country office throughout the
evaluation process was adequate.
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify (e.g. how, why not):
11.) Welthungerhilfe formats, templates, materials and advisory services have been
helpful in promoting good quality evaluation processes.
I agree I rather agree I rather disagree I disagree
Please, give reasons for your answer / specify (e.g. how, why not):
12.) Is there anything you would like to see included in the format,
templates, materials and advisory services in the future?
13.) What did you do to provide access to the evaluation findings for the different
stakeholders? (e.g. dissemination of report to whom, inclusion of partners in
debriefing workshop)
14.) The evaluation was conducted in a way that encouraged acceptance and
utilisation of the evaluation findings.
I agree I rather agree I rather disagree I disagree
15.) In your opinion, what are the main factors influencing the quality of an
evaluation?
Which recommendations would you give to your colleagues who are preparing for an
evaluation of their projects? What has to be taken into account when