EU TPM
August 2020
ICE on behalf of the European Commission
Tel: +32.2.792.49.05
Fax: +32.2.792.49.06
www.ice-org.eu
The content of this publication is the sole responsibility of ICE EEIG and can in no way be
taken to reflect the views of the European Union
Good practice in conducting Third
Party Monitoring in conflict settings
A research-based handbook for donors and practitioners
Table of Contents
Abbreviations .......................................................................................................................................... 6
2 Context .......................................................................................................................................... 10
6 Implementation phase .................................................................................................................. 31
ANNEXES................................................................................................................................................ 44
Abbreviations
1 Executive Summary
1.1 Main findings
On one hand, Third Party Monitoring (TPM) can be described in simple terms as the conducting of monitoring by a third party – i.e. neither the donor nor the implementer. Alternatively, it can be seen as a complex, fast-growing field characterised by a wide variety of forms, all shaped by highly diverse donor needs, reflecting myriad local contexts. That said, it can still be distilled into two main types: the monitoring of the views of people, which is seen as the more commonplace, and the monitoring of assets. TPM is not only used by donors; it is used by some larger INGOs in high-risk areas where they themselves lack access.
Two main models of TPM are explained in the report. The 'conventional model' sees an international TPM implementer working as the main interlocutor with the donor. A second 'alternative model' sees donors work directly with local organisations. Those who have been involved in various TPM projects tend to believe that while it is possible to think of a 'core' methodology, each TPM programme's design tends to be highly distinctive, because of the diverse requirements (technical, social, political) placed on the monitoring.
Donors' motivations for undertaking TPM centre on a lack of access, underpinned by a desire (i) to be accountable (to communities and taxpayers), (ii) to optimise performance and (iii) to mitigate financial and other risk. There is a school of thought that it should be used only in this 'last resort' sense.
There are clear benefits of TPM that go beyond providing access, in part because TPM contracts tend to incentivise the private sector to demonstrate considerable levels of innovation, including through technology. The main benefits are considered to include: the power of high-calibre, independent data; the reduction of project and fiduciary risk; and the ability of TPM to enable improvement in programme design and refinement. A certain degree of scepticism exists about TPM, driven by its cost and by concern around the risk of disempowering project managers.
Donors and TPM implementers tend to believe that they enjoy good relations, but they face a range of difficult challenges. The core challenge is that implementing partners (IPs) will naturally be sensitive to the accuracy of the TPM findings, since (i) they will see the findings as potentially harming their prospects of future funding and (ii) TPM implementers are unlikely to have the same level of understanding of the context and challenges faced by the implementing partner in the area visited.
There is a sense that TPM is on the cusp of an acceleration in its evolution through two specific
forms of technology. Respondents noted that aerial imaging data (through satellites and drones) and
‘big data’ could expand the range of data insights that TPM provides while reducing costs, but that
these methods are only now crystallising as users test and validate their utility. Aerial imagery in
particular is expected to provide a lens on visible phenomena on the ground that can be taken as
proxies for behaviours within communities. Dashboards are fairly widely used in addition to
reporting methods, but have seen mixed results in terms of their usefulness.
While most donors acknowledge and appreciate the quality of insights that commercial TPM
providers can bring, they remain highly sensitive to cost, and perceive that TPM implementers can
improve in thinking creatively around this issue.
1.2 Recommendations
General advice to donors
Prepare to play an active role in driving such contracts, ensuring that there is frequent interaction and a level of trust between all actors.
Strive for mutual benefit from the data – meaning putting more focus on helping IPs to feel they benefit, and encouraging TPM implementers to prioritise the needs of the IPs as much as the needs of the client.
Favour up-front planning, including maximum engagement with TPM implementers before issuing ToRs, including 'Early Market Engagement' (EME) if in line with the donor's procurement policies.
Choose TPM implementers who have the attitude, chemistry and local connectivity to succeed, taking the time to understand as well as possible the internal dynamics and collective experience of the team under consideration, and their interpersonal skills and vision, and being wary of TPM implementers referred to as 'body-shops', who offer a collection of individual experts but may not be set up to work together effectively.
Create exemplar reports (i.e., the deliverable that will be expected of the TPM implementer) before locking down the Terms of Reference (ToR); this has been found to be extremely useful in enabling donors to think through in detail what they do and do not need, and so increases the utility of the ToR.
Donors are often obliged to operate in especially fragile and/or sensitive contexts. This raises a number of challenges, including the need to operate with a variety of atypical interlocutors, ranging from UN peacekeeping operations, to niche state security actors (such as intelligence services), senior and/or niche members of military and police institutions, Civil Society Organisations (CSOs), politically motivated groups, and armed Non-State Actors (NSAs). Such organisations have highly distinct internal dynamics (DCAF, 2015) and may have little or no incentive, motivation or capacity to be involved in the donor's TPM/MEL requirements. In such settings, donors take care to adapt to the circumstances, including by: being alert to anything that hinders the delivery of aid, pushing for compliance with humanitarian norms, and remaining conscious of unique risks (such as the safety of partners) and of the heightened risk of misappropriation of funds. In such settings donors are advised to:
Be realistic in terms of TPM expectations: Since motivation or capacity to engage with TPM can be low among some of these actors, donors have to be willing to lower their expectations. This can mean (i) choosing a narrower range or smaller number of indicators for the partner to follow – as few as one indicator or measure – or (ii) putting a greater focus on simple, categorical output indicators over complex or qualitative outcome indicators.
Push for buy-in and access, at a high level: Donors may face resistance in deploying the
standard MEL techniques, or access may be denied to locations or respondents. In such
cases, donors may want to place TPM on the agenda at high-level discussions, perhaps to
ensure that senior leadership commits to a certain basic level of collaboration from the
outset. While such partners may be unaccustomed to TPM, MEL or even evidence gathering,
this does not preclude the possibility that they might be persuaded to play a role. It may be
worth asking them to attend and/or become involved in fieldwork (plausibly to provide
logistical support), and/or to play a role in the dissemination of results. It may also be worth
seeking ‘champions’ within the organisation upon whom success might hinge.
Change the language: Terms such as 'TPM' or 'MEL' may work well in a donor context, but they may be unappealing to other kinds of actor. Instead, more basic language might be used, such as 'evidence' or 'proof'.
Choose the right representative: Some such actors will listen more closely to personnel on
the donor side whose background mirrors their own. It can be prudent for donors to ensure,
for example, that senior donor military personnel make requests around TPM to senior
military partner personnel.
Consider aerial technology: Such fragile areas are the environments in which technology
seems to blossom for TPM. Satellite imagery can be used in numerous ways as a proxy for
indicators that might otherwise be captured by monitors. For example, one donor is using satellite imagery to monitor access to a hospital in such a setting, taking this as a proxy indicator for the flow of its aid to that hospital.
Donors and IPs undertaking shorter-term interventions face a unique set of challenges in terms of MEL, including that outcomes may only be measurable after the intervention in question has ended. In these situations, donors should respond by:
Increasing the use of qualitative indicators: By its nature, qualitative research is better placed to pick up on early reactions to an intervention; although the smaller sample sizes used mean a lack of robustness relative to quantitative methods, the richer, more insightful data, and the ability to use specific sampling, allow for the production of earlier glimpses of real effect.
Budgeting to return: Donors are encouraged to consider planning to return to certain locations for one or more subsequent visits in order to undertake further monitoring. Such visits may seek to (i) evidence whether adequate progress has been made (where poor performance or difficult conditions were previously noted) and/or (ii) identify outcomes and/or impacts that are attributable to the intervention. Such visits might be conducted by the Third Party Monitor or, if access has become viable for the donor, by the donor's own programme management personnel.
Donors attempting to realise political benefits face the challenges of trying to measure something
that is of vital importance, yet somehow ethereal and often under-appreciated by the TPM
implementers and implementing partners alike. In these settings donors are encouraged to:
Be overt about these ambitions; communicate why this is important to all other actors, for example through workshops, and explain that politics equates to power dynamics and so is not an 'ugly' issue to monitor.
Push the TPM implementers to adapt MEL tools to embrace this issue, i.e. to build the political aspect into Theories of Change and logframes.
Cost remains a key barrier for donors. TPM implementers should dare to push for a clear budget, but then strive to offer a range of costings that do not simply hit the ceiling budget but also include lower-cost options, and above all make very clear the value and reasoning behind the different options. Ensuring clarity on what is not included in a proposal is also key, so as to avoid unnecessary frustration during the inception phase.
Mind-set – a culture shift is needed by TPM implementers. For these contracts to work, TPM implementers must consider implementing partners a 'client' as important as the donors. TPM implementers must acknowledge that, ultimately, the donor–implementing partner relationship is usually more profound and longer-term than the donor–TPM implementer relationship, as there are only a limited number of UN bodies and INGOs.
Team cohesion: TPM implementers must put more effort into choosing teams that are genuinely cohesive. Various methods may work, such as choosing Team Leaders first and encouraging them to develop a vision and build teams from people whom they know will work well together towards that vision.
2 Context
2.1 Background of this paper
Development, stabilisation and humanitarian professionals operate in increasingly conflict-affected environments. This places limits on donors' ability to safely visit projects, meaning that visibility of projects in many vital locations is diminishing. As a result, the importance of Third Party Monitoring (TPM) is growing (Herbert, 2013), as donors require confidence that their investments are producing the desired outputs, yielding the desired outcomes and impact, and adhering to humanitarian norms. The Unit in the EU Commission's Service for Foreign Policy Instruments (FPI) responsible for the management of the Instrument contributing to Stability and Peace (IcSP) commissioned this handbook to add to the body of knowledge on this subject, and to help those who need to rely on TPM to understand the reality of the field, and to engage in it with confidence.

Other key reading on TPM:
EU Commission Results Orientated Monitoring Handbook
DFID Monitoring in Remote Areas
USAID TPM document (available upon request)
Third Party Monitoring in Volatile Environments - Do the Benefits Outweigh the Risks?
Instruction note for ECHO staff on Remote Management
Listening to communities in insecure environments
Technologies for monitoring in insecure environments
Between a rock and a hard place: Monitoring aid implementation in situations of conflict

2.2 Objectives of this paper
The principal strategic aim of this paper is to help donors who lead TPM to do so more efficiently and effectively. TPM involves a range of actors working in highly challenging settings, and this naturally places considerable pressure on those involved and on the processes they use to collaborate. FPI sought to facilitate this process by bringing to light the principal issues, challenges and constraints faced by all those who are involved in TPM.

The specific aims of this paper are to:
Analyse what constitutes 'good' or 'best' practice in TPM, as well as identifying practices to avoid
Investigate how to identify, procure and work successfully with a TPM implementer
Understand how TPM Implementers operate – enabling donors to work with TPM providers in a manner that is as convenient as possible to TPM implementers
Assess what practices can and should be put in place that will facilitate a healthy cooperation between TPM implementers and Implementing Partners (IPs)
Assess risk and technical issues connected to the practice of TPM
Explore how to build trust and further the prospects for long-term collaboration between actors
Highlight the current and potential role that technology plays in TPM.
2.4 Methodology
There were two components to this research: one secondary and one primary. The first phase was the desk research, the results of which are summarised in Annex B. This phase sought to understand what research had already been done on this subject.
The second phase was primary research, which took the form of qualitative Key Informant Interviews (KIIs). A total of 35 interviews were undertaken, which are categorised in the following table¹:
2.5 Limitations
Two principal limitations to this research are noted:
A number of IPs were invited by the author to take part in this research; however, none was able to do so. This is a limitation, since it is logical that the opinions of those whose organisations are scrutinised by TPM implementers are a key part of the full story of how best TPM should be undertaken. In an attempt to offset this, the TPM implementers who were interviewed were asked to provide their thoughts on the views of IPs, at least as they see them. While TPM implementers were willing and able to do this, it cannot be assumed that the most important views of IPs on this matter are well expressed in this report. This introduces a risk that this report does not adequately speak to the views of those who are in the field, and their experience of being monitored in this way. Further research in this area would therefore be welcomed.
Owing to COVID-19, which took hold at the same time as fieldwork began, all interviews were undertaken remotely. Had this not been the case, IcSP may have opted to hold one or more in-person meetings and/or workshops in Brussels and/or other locations where large numbers of stakeholders are present.
¹ I still hope to interview the Danish MOFA and at least one IP, so this total could still go up.
2.6 Notes to the reader
In order to make the report easy to read, the authors have adopted the following key
acronyms for the different actors in the process:
o Implementing Partners ‘IPs’
o Lead Agencies ‘LAs’
o Data Collectors ‘DCs’
This paper uses the term MEL, which may be known by other terms (M&E – 'Monitoring & Evaluation', AMEL – 'Accountability, Monitoring, Evaluation and Learning', and DMEL – 'Design, Monitoring, Evaluation and Learning'). These terms are not synonymous, but for the purpose of this report they are effectively interchangeable with the term MEL.
All the quotations in this report were provided by the interviewees.
"TPM is most successful when you find a way to make the information it creates genuinely useful to all parties" – Academic
3 Considering undertaking TPM
TPM is not as complicated as its name might suggest. Simply, it is
MEL done by a third party. However, there are many forms of TPM
and you should situate yourself and your intervention in the broad
spectrum of TPM before starting to consider taking action.
3.1 Why do TPM?
"Before writing the ToRs, sit down and discuss as a team for hours what you really want. The rest all flows from there." – Donor

At the most basic level, the principal reason why a donor may want to undertake TPM is a simple lack of access (Sagmeister et al, 2016). In its guidance note on the subject, USAID for example writes:
'Third-party monitors are contracted by USAID to act as our eyes and ears when we cannot ourselves access activities'.
Interviews explored the reasons why donors have initiated TPM projects – beyond this simple starting point of access. The main reasons given are summarised and shown in Figure 1. While they may be seen as interconnected, some explanation of each is provided:

Risk: All development and humanitarian projects carry risk, typically managed through a risk register. The kinds of risks that are addressed by TPM include inadequate design and implementation, misappropriation, unintended consequences, and conflict sensitivity. To the Project Manager and those focused on risk management, TPM can become the principal tool to help mitigate risk (Kelly et al, 2017).
Finance: Invariably, the budgets of programmes considered for TPM are significant, and donors will
be keen to be sure that they are delivering reasonable value for money. With limited or no visibility,
TPM is sometimes undertaken in order to provide a more robust assessment of the financial delivery
of the project. This may extend into the undertaking of VFM analysis, a specific field of MEL, which
assesses the ‘4Es’ of economy, efficiency, effectiveness, and equity.
Accountability: Donors strive to be accountable to communities and taxpayers (Chaudhri et al, 2017). IPs invariably have their own MEL systems, but funding and access to qualified local staff for these can be limited, and the donor cannot easily rely on these IP systems delivering perfectly accurate insights. If a donor then has neither access nor TPM, and rests only on the data from an IP, this obligation to communities and taxpayers is placed at some risk. Independently produced and robust evidence is then helpful in asserting the actual impact of an intervention. It was also emphasised that it is good practice to link such data to IPs' own MEL systems.
Performance: Donors work with IPs to help them to meet the outputs and outcomes committed to in their proposals. The oversight of these projects is challenging for the IP as well as the donor, and the contexts in which TPM is undertaken are typically more challenging than those of a typical project. Donors and IPs have a shared interest in objectively understanding progress, so that any necessary improvements can be made.
These four needs are of markedly different importance to different contexts, and, as a result, TPM
interventions take very different shapes (van Beijnum et al, 2018). This makes it challenging for
donors to create ‘libraries’ of templates and tools for the various TPM projects that they manage.
This diversity may also contribute to the difficulty that donors face in collectively understanding and
undertaking TPM.
IPs perceive that different donors have different ways of setting about TPM, and that this reflects the differing contexts and pressures that each faces. Some donors favour a more scientific and methodical approach, while others attach more weight to agility and flexibility. Practitioners see arguments for both, and favour a case-by-case consideration of the right approach for each TPM programme.
USAID: 'Third Party Monitoring (TPM) is the systematic and intentional collection of performance monitoring and/or contextual data by a partner that is not USAID or an implementing partner directly involved in the work.'
DFID: 'The practice of contracting a third party (neither a donor nor implementer) to collect or verify monitoring data. It is increasingly used to overcome the challenges of monitoring in remote or restrictive environments.'
For the purpose of this paper, and leaning on the inputs summarised in Figure 1, we will use the
following definition, which combines aspects of the above and other thoughts provided by the
experts interviewed:
‘TPM is the use of an independent organisation – typically in areas where a funder does
not have access - to collect monitoring data in order to reduce risk and maximise
performance and accountability’.
PUTTING TPM INTO CONTEXT
How to 'understand' TPM in the context of …?

vs. MEL generally?
MEL is a broader term that describes any monitoring, evaluation or learning undertaken by any actor. Therefore, we might say that TPM is a subset of MEL. TPM is MEL which is undertaken neither by the donor nor the implementer, and so the independence of the data is a distinguishing factor. That is not to say that TPM data is necessarily 'better' – IPs will usually know their environments better, and it is the responsibility of the TPM provider to ensure that it has ample contextual understanding. Learning could be described as reflection which builds institutional memory and so facilitates better decision making. Donors increasingly expect IPs to have Learning systems. TPM then can and should feed into overall Learning.

vs. Evaluation?
TPM is essentially a process of monitoring, not evaluation. The key distinction is that monitoring (and so TPM) is designed to be on-going. That said, the two do in practice overlap; this is seen as inevitable and healthy. Evaluation tends to draw 'summative' conclusions over a longer period of time. Monitoring meanwhile aims to be 'formative', i.e. to help to form or shape a programme while it is happening. Evaluation is normally less resource-intense and so less costly, but it only offers feedback from one moment in time and – in the case of end-line evaluations (the more common type) – too late to adjust a project to enhance performance. TPM and evaluation are, however, compatible – for large portfolios/programmes it is considered normal to do both TPM and evaluation.
“Monitoring should be separate from evaluation because the skill sets are very different; if you bundle
them together, it means that compromises will need to be made” Implementer
Donors and TPM implementers who were interviewed agreed that the principal driver for using TPM is access, i.e. that risks to physical safety and/or travel restrictions (linked to security) often create a situation where conventional monitoring is not possible. While access is considered the most obvious and often the fundamental reason to use TPM, it is not the only reason, according to those interviewed. Respondents also put forward the view that TPM should be considered when:
The IP has expressed difficulty in undertaking MEL or TPM
Other niche scenarios arise – such as threats to the supply chain (specific, for example, to medicines), where there is reason to believe that the 'cold chain' may be hard to maintain.
Benefits of TPM
In addition to the visibility afforded by TPM where access is otherwise not possible, respondents
pointed to the following core additional benefits of TPM.
When the real need can be met in another way – donors were able to give examples of when the need addressed by the conventional form of TPM (i.e. sending monitors into the field) can be met in other ways, such as through the use of technology. An example was given of using satellites to show traffic around hospitals to evidence the functionality of a funded facility.
3.4 General advice on leading TPM
Overarching advice to those commissioning and working on TPM is shown below.
People: TPM projects more commonly set out to engage with ‘people’ through qualitative and
quantitative research with:
Indirect beneficiaries – those who benefit indirectly
Programme management – the key staff from the implementing partner
Other stakeholders (such as community leaders, thematic experts) through ‘Key Informant
Interviews’.
TPM practitioners consider that the use of TPM to monitor the opinions of people is significantly
more common than the use of TPM to monitor assets. For this reason the remainder of this report
defaults to discussion of TPM of people, with reference made to assets where applicable.
Assets: In the sorts of fragile and conflict-affected places where access may be impossible and so
TPM more likely to be useful, medium value Non-Food Items (NFIs) or high-value infrastructural
assets may be distributed, and the donor may want to be confident that these remain in the
possession of the intended party/ies. In addition to tangible assets, TPM can set out to look at
intangible assets such as the systems and processes of the IP itself. One relatively common form of
TPM is to assess the strength of IPs’ MEL systems – the logic being that if the donor can be confident
that the IP has the ability to report data well, then there is less need for a high-cost, long-term version of TPM.
Figure 4: The 'International' model
Experts (often consultants who may be expert in writing, or in thematic areas)
A Project Manager
A Project Coordinator or Officer
Cross-cutting experts (such as on Social Inclusion and/or Gender/vulnerability, Finance, Conflict Sensitivity).
here to contribute to the localization and participatory development agenda.
Respondents were able to compare and contrast the relative benefits of both these approaches, and
these perceptions are summarised in the below table:
International model
Pros: Depth of experience, and so credibility; calibre of work, especially writing and insightfulness; ability to work internationally
Cons: High cost; risk of insights from the field being lost during report writing
Local model
Pros: Lower cost, even higher value; local insights (less filtering of the data)
Cons: Weaker reporting skills; more risk to the donor of being pulled into data processing; need to ensure financial viability
TPM Implementers were also asked to critique themselves, reflecting on where they feel they can improve in terms of their service provision to donors.
Focussing on monitors and their wellbeing; giving them a clearer stake in the process
Shifting to see the IP as just as key a ‘client’ as the donor, adopting more of a coaching attitude,
and avoiding coming across as patronising
Playing an active role in shaping the ToRs
Working with donors to co-imagine the end deliverable at the beginning of the process
Helping donors to make the most of Inception phase
TPM Implementers were keen to note that their work is tightly interlaced with the normal work of MEL and the MEL systems of IPs. They are invariably closely scrutinising logframes and Theories of Change. For this reason, overleaf, the reader will find a one-page refresher on Monitoring, Evaluation and Learning, in order to put issues into context.
The field of 'Monitoring, Evaluation and Learning' is not without its complexities, but it may still be paraphrased as a simple question: 'how is our intervention doing?'. In essence, MEL is a set of perspectives or tools that development and humanitarian actors have developed and refined over a period of decades to answer this simple question. The field of MEL is constantly evolving; in the 70s and 80s many spoke of 'evaluation' only, then monitoring took shape – underlining the importance of an ongoing understanding of an intervention's progress. More recently, 'Learning' has emerged. Here is a brief description of what these three inter-linked elements are and how they differ from each other:
Others feel that accountability is another aspect of the work; i.e. that those undertaking MEL are well placed to then use that information to ensure that an organisation remains accountable to its beneficiaries and taxpayers. Hence the alternative acronym, 'MEAL', that you might be aware of.
Most donors use 'Theories of Change' and 'Logframes' to underpin their MEL work. The former presents the 'vision' of how the intervention will meet its aims, by considering (i) the problems faced, (ii) the outcomes desired, and (iii) the outputs and activities that will be needed to bring about this change. Logframes then detail these elements very specifically, converting the intended aims into 'indicators' which, following 'SMART' rules and stipulating targets and the 'means of verification', provide a solid process that is maintained through each financial year (a minimal illustrative sketch of one such indicator follows the definitions below).
Inputs: The resources to hand to undertake activities (principally: funds and time)
Activity: Work done to produce each output (some may serve multiple outputs)
Output: The ‘deliverable’ (a product or service, such as a training session or a document)
Outcome: The short- to medium-term change intended (also thought of as 'behaviour change') – which the intervention aims to play a significant role in realising.
Impact: The long-term change intended – acknowledging that other interventions / social
phenomena will play a role in achieving these
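To make the above concrete, the sketch below shows one way a single logframe indicator might be represented and checked as structured data. It is a minimal illustration only, not a template from this handbook: all field names, values and thresholds are assumptions, and real indicators, targets and means of verification should come from the programme's own logframe.

```python
# Minimal sketch (illustrative assumptions only, not a prescribed schema):
# a single SMART logframe indicator with a target and means of verification.
from dataclasses import dataclass

@dataclass
class Indicator:
    statement: str              # the SMART indicator text
    level: str                  # "output", "outcome" or "impact"
    baseline: float             # value at the start of the intervention
    target: float               # value to be reached by the target date
    target_date: str            # deadline that makes the indicator time-bound
    means_of_verification: str  # where the evidence will come from

example = Indicator(
    statement="Share of surveyed households reporting safe access to the clinic",
    level="outcome",
    baseline=0.40,
    target=0.70,
    target_date="2021-12-31",
    means_of_verification="Quarterly TPM household survey",
)

def on_track(indicator: Indicator, latest_value: float) -> bool:
    """Simple progress check a monitoring team might run against new TPM data."""
    return latest_value >= indicator.target

print(on_track(example, 0.55))  # prints False: the latest measurement is below target
```

Keeping indicators in a structured form like this makes it easier for the donor, LA and IP to agree on exactly what will be measured and verified before fieldwork begins.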
4 Tendering and awarding a contract
To the extent that tendering procedures allow, TPM implementers
should be brought into the process as early and as closely as possible,
as it helps them to put forward methodologies that will work and be
affordable.
4.1 Criteria for selecting a TPM partner
Donors and TPM implementers were asked to say what they think are the key criteria for choosing a TPM implementer. Responses are divided into 'conventional' and 'emergent'.
o Too often, the objectives are insufficiently thought-through. Implementers are keen to get
under the skin of donors’ real needs, but often sense that there is ‘hidden meaning’ beneath
donors’ stated ‘objectives’. On the one hand the word ‘objectives’ itself has different
underpinnings (‘drivers’, ‘triggers’, ‘influences’, ‘impulses’, ‘rationale’ etc.). On the other, it
can be possible to think in terms of short vs. long-term thinking, and strategic vs. tactical
thinking. TPM implementers want donors to express the fullness of their intention, to go into
as much detail as possible. One simple solution is to split the objectives into ‘strategic’
objectives and ‘technical’ objectives. The box below demonstrates the different kinds of
objective.
Non-objectives
o It can be very useful and revealing to TPM implementers to know what a donor does not want to know, because they feel they already know it. For example, it could be that the donor feels there is no need to explore the sustainability of a project, or its conflict sensitivity, if this information is gathered in other ways.
Criteria
o Most donors will explain the % of importance that is applied to each aspect of the proposal, for each of the technical and financial sections. If a little detail can be given on the reasoning behind this allocation of weighting, this will be appreciated by TPM implementers.
Conflict of Interest (COI)
o The ToR should instruct the respondent to be clear on which projects, if any, they are already undertaking – either as a TPM implementer or as a delivery implementer – in the relevant country/region.
Issues open to debate
o It is tempting for a donor to write a ToR in such a way as to give the impression that everything about the intended TPM is clear in terms of how it should be done. However, TPM implementers appreciate that these projects are complex, and it is very hard to be clear on all matters at the point of writing the ToRs. They would welcome some indication of where there is uncertainty, or whether there is flexibility to be creative.

Caution around Conflicts of Interest (COI)
Some of the large international firms act both as IPs and as third party TPM Implementers. This can create a Conflict of Interest, and you should be sure to include in your selection process a thorough understanding of whether the candidate organisations undertake implementation in the area in question, or have any other links that make them inappropriate for the work.
A key finding from this research is that TPM implementers very much welcome opportunities to interact directly with donors on the intended work, even to be part of the discussion underpinning the creation of the ToRs, whether in person or remotely. TPM implementers expressed a clear preference for an invitation to a session at which the project can be discussed openly – potentially with all interested and/or short-listed bidders present.
What is it? A physical or online meeting attended by those interested in bidding for the work.
When does it happen? Usually in advance of the formal ToRs being published, if allowed by tendering procedures.
What is the format? Typically donors will start by presenting draft ToRs. These may or may not be shared in advance. If an Expression of Interest (EoI) was issued, it may be sensible to present the basic ToRs that were shared at that point.
Who is involved? Any TPM implementer – depending on how large a field the donor wants to invite. Donors may choose to invite only those who have passed a 'first round' of consideration, perhaps through the submission of an EoI that looks at basic criteria such as track record, financial health and presence on the ground.
Why is it useful? It provides a chance for donors and TPM implementers to genuinely discuss the issues at hand. It can greatly increase the chances of donors feeling that the formal, final ToRs are well constructed. It gives TPM implementers a chance to ask questions that can help them to reduce their costs.
Any risks? TPM Implementers feel these meetings can be less useful or less appealing to them when the format tries to oblige or coerce the attendees to share their ideas, and so lose their competitive edge. Large LAs have often established leadership positions by developing intellectual property around TPM which they want to protect.
4.4 Handling questions
Both donors and TPM implementers understand that – no matter how thorough the ToRs are –
bidders will want to ask questions during the bidding process. TPM Implementers perceive that
donors commissioning TPM generally do invite the asking of questions, which is welcomed. Without being asked to comment on the specific processes and requirements that individual donors have in place for tendering and bidding TPM projects, TPM implementers were asked for ideas on how the process of asking and answering questions could be improved, and offered these suggestions:
A commitment to a transparent process in which responses to all questions are shown to all
bidders
Donors allowing at least a 2-week window in which questions are asked and answered
Donors aiming to respond to each question within 3 working days
Consideration of a web-based multi-user interface to (i) allow for clear communication of the
answers and reduce reliance on emails (which can cause confusion around the most recent
response to each question) and (ii) reduce the risk of donors receiving the same question
from multiple bidders
Allowing follow-up questions (within the pre-agreed timeframe).
5 Inception phase
Nearly all TPM projects have to adapt considerably from their initial
design – as these are highly complex environments. Piloting the
methodology is vital, and if the donor team is going to invest heavily
at any point in the process, it should be here.
5.1 The need for Inception phases
Most donors and TPM implementers are used to the idea of pilot or inception phases, and these are
seen by TPM implementers and donors as vital for TPM projects also. Respondents felt that inception
phases are vital because:
The basic context in which the TPM operates is so volatile that even the best-informed bidder will not be able to put forward a planned methodology in which they have full confidence;
It is plausible that bidders did not genuinely have enough capacity to think through every aspect of their proposal, and they will need more time to test their own assertions in the proposal; and
TPM implementers admit that they may sometimes put innovative ideas into their proposals that are included to attract donors' attention and help them win – these cannot always be fully thought-through or fully costed, and this needs to be acknowledged and explored together with the donor.

Good documentation
For a TPM implementer to be ready to go into the field, they need to have in place a set of documents that will ensure that everyone – and the monitors especially – has the same thorough understanding of their roles and responsibilities. It can be tempting for an implementer to let these documents evolve during the inception phase. Experience shows it is better to push for TPM partners to prepare a full 'V1' set of these documents before fieldwork, and then to formally review them after a reasonable number of visits is undertaken. Also, since these documents can be so time-consuming to make, it is a good idea to have them in place during the relative calm before implementation.
Methodology manual
Monitor manuals
Health, Safety & Security manual
Monitor training plan

TPM implementers understand that donors need to ensure that the procurement process is fair, which means in effect 'keeping bidders at arm's length' during the process. A side-effect of this approach, however, is that there tends to be no substantive direct contact between donor and bidder at any point in the process (unless there is an EME). In turn, this can result in dissonance or even misunderstanding between the 'winning' bidder and donor, especially if the donor lacks capacity in the weeks after the bid is won.
TPM implementers then hope to have the opportunity for a highly collaborative approach to inception, during which they and the donor acknowledge that they have (typically) arrived at this point with limited contact, and need to spend time together and have open, candid discussions about the proposal.
In the TPM field, unlike some other industries (such as advertising) where large contracts are tendered for, bidding companies are not paid for their time in preparing the proposal. As a result, they have no choice but to absorb the cost of bidding. They do not have limitless capacity, and cannot be expected to put forward a 'perfect' methodology. Therefore, at the point at which a tender is awarded, it is obvious that the winning company itself may well be aware that it might have gone further, been clearer, or costed more precisely some aspects of its bid.
Stage 1 – Provisional award: One of the sensitivities of this part of the process is a possible tension between (i) the 'winning' bidder being announced and (ii) a likely need to make concrete changes to the methodology that may result in additional costs. Specifically, this situation can result in the LA being asked to provide further quotations for costs in what has by then become a non-competitive setting. For this reason, it may be the case that donors will want to retain some control in the process by awarding only a 'provisional winner' and announcing a final decision only once a full dialogue has been undertaken. This allows both parties an opportunity to make sure that they are a good fit, and to 'iron out creases' in the putative methodology. Such a phase is also useful as it allows for appeal against the decision (if the donor operates such a system). In any case, at this point, IPs should be made aware of the provisional award, and ideally they would be informed ahead of any public announcement, as a courtesy.
Stage 2 – Cost clarification, then formal awarding: As described above, it may be in the interest of the donor, having nominated the 'provisional' winner, to enter into detailed discussions about the proposal and the costs, inviting reflections from the winning bidder about anything that they were unsure about. The donor may want to bring in MEL experts to scrutinise bids, possibly to help the donor team to ask questions to make sure that every aspect of the financial side of the project has been well thought-through.
Stage 3 – Methodology development: Once the financials are complete, the donor has confidence that there are no surprises in terms of finances, and the formal award has been made, the donor and LA should come together and discuss the methodology in detail. If viable, follow-up technical meetings may be agreed to finalise these. A suite of tools should be developed; this may take some weeks to prepare. An initial communications plan may also be drafted during this time. It is also sensible to consider a detailed documentation of the steps of fieldwork, including precisely who is expected to do what, and when.
Stage 4 – IP Engagement: Once the 'pilot methodology' is readied, it should be shared with the IPs. This is a crucial moment in the chronology of the project – it is the first time that the donor will present to the IPs the real nature of the project. IPs may have many useful ideas about the proposed methodology – they may well want to share these. This engagement should be done in such a way as to send a signal to the IPs that the donor treats both parties equally and requires a proactive and cohesive approach. This stage may include one-to-one meetings, a workshop or a combination of both, whether in person or online.
Stage 5 – Fieldwork begins: With the methodology now endorsed by the IPs, and the expectations on them clear, a first visit to the field should be undertaken. This would ideally be undertaken with more experienced monitors (who are briefed to feed back comprehensively) and with plenty of time to reflect afterwards on potential improvements. Once this is done, a second and, if needed, third wave of piloting is recommended, testing also a frequency of fieldwork that is likely to reflect the normal speed of operation.
Stage 6 – Finalising of methodology: Once the team believes that ample fieldwork has been undertaken, all tools should be revised and formalised with donor approval. A formal 'methodology' handbook should be approved and signed off by the donor. This should include clarity on all important processes, such as scoring and definitions (see Implementation section).
Through all of the above, TPM practitioners emphasise that this Inception phase provides a crucial first opportunity for the coalescing of the 'full team' of the donor, the LA and the IP. In this sense it is vital that the donor sets the right tone and makes time available, including from senior colleagues. Respondents also emphasised the value of a clear communications plan between the actors, because:
IPs are busy and it may help them to be presented with a very clear plan, which aims to
minimise the logistical burden on them
TPM reports are often sensitive in nature
It can happen that the DC finds an urgent problem (see Section 5 for more on 'Red Flags'), and clear communication is needed for such situations
In the more fragile settings, where monitor security is a concern, clear communications plans
are essential to make it clear who will play what role in an emergency
That said, not all projects are thought to require such a plan, and it may be a good strategy to wait and see whether one is even needed. Experience shows, however, that TPM of larger programmes, of more sensitive work, or of work where more challenging information may be expected to be shared (especially between LA and IP) may benefit particularly. A good communications plan should:
Be as concise as is reasonable
Be scenario-specific
Account for all scenarios that relate to safety
Involve senior decision makers.
"TPM will only work if the donor sets the right tone – explains with conviction why this is important, and what everyone needs to do and not do to make it work" – Implementer
6 Implementation phase
During implementation, frequency of communication between all parties is crucial; try to emphasise the need for open, frequent communication around a clear fieldwork plan, and to instil a common appreciation of the main aim – performance improvement to the benefit of all.
6.1 The typical process
The implementation of TPM typically involves months or years of interaction between the donor, LA,
DC and IPs. As described in Section 2, there are many types of TPM. In this section, we will look at
how implementation may look for two of the principal forms of TPM; that undertaken with
beneficiaries, and of MEL systems.
Figure 10: Typical TPM implementation process; with beneficiaries, and of MEL systems
6.2 Methodological options – pros and cons of each
Donors and TPM implementers both understand that primary (qualitative and quantitative) research
is often a necessary aspect of TPM. The table below expands on perceptions of each and their utility
in the field of TPM.
Figure 12: Ways to strengthen relationships between actors
Financial matters often run at the heart of this relationship. The LA will have won the project either
at a margin that it is pleased with or, perhaps in order to gain prominence in the field, at a lower
margin, and so may be sensitive to any new work or requests that are not funded. It is essential then
that the donors and LA talk openly about finances, and although the profitability of the LA is no
concern of the donor, it will be beneficial if both parties can achieve an open and frank discussion on
the underlying state of finances. Ideally, LAs are invited to explain what their cost drivers are, and a conversation is kept ongoing around how both parties are seeing the financial side. The degree to which budgets are flexible – i.e. the donor's tolerance for the LA to make unilateral decisions – should also be made clear.
6.3.4 Relationship between the LA and the IP
There is a clear potential for tension between the LA and the IP, as the latter is being evaluated by
the former. The donor can assist this relationship by:
Communicating clearly and as soon as possible about the purpose of the TPM programme, and how the information will be used, ideally as soon as the programme is decided upon. This may involve the use of a workshop with other IPs.
Ensuring that any scoring (see Section 6), e.g. any RAG-ratings, is well defined and justified, and that the tools that the programme utilises are well thought-through.
Setting up processes and meetings during the course of the TPM programme in which the donor actively plays a role in emphasising the importance of TPM and the creation of a culture of feedback and learning, to improve performance.

MEET THE MONITOR – TYPICAL DAY OF A THIRD-PARTY MONITOR INSIDE SYRIA
7am – I woke up and checked Twitter and Signal to see what's happening around my area, to see if it's safe to head out. Yes, there have been a few strikes over night, but nothing on the route I plan to take.
8am – I called my supervisor, as agreed. We compared notes about the strikes, we'd heard slightly different things, and we agreed to message a few more people before we decide whether it's safe for me to head out and do the visit or not.
9am – I should have left already but needed that time to make sure things are safe – I feel they are and so does my line manager and male colleague.
9.30am – Say goodbye to family and head to the destination. I remind myself how to delete all my data if I get stopped at a checkpoint.

6.3.5 Relationship between the IPs and the DC
Relationships between the IP and the DC centre around the visit itself. Often the IP's staff at the location to be visited will be very busy, and may struggle to guarantee being free during the visit itself. The donor can help by stressing the importance of the IP's management asking the local team to be free, and urging them to make sure that, to the extent possible, all key stakeholders are present and that key activities can be observed.
Monitors need to understand how to monitor every aspect of the process, and what they must do in certain settings, such as if they are asked who the donor is, or if they are questioned at a checkpoint. Technology can be used to help monitors – for example, there is now software that can hide apps if monitors are stopped by the authorities or threatened.
No matter how experienced they are, monitors will crave high-quality briefings and trainings. These are essential, and donors may want to ensure that they are well executed. Trainings typically take two forms – initial 'generalised' training on the project and safety protocols, and 'visit-specific' training on the IP and their work, the tools, and what to expect on the day. Donors can help by reviewing the materials, or even joining some trainings to make sure they are up to standard.
Respondents were asked about accountability to communities and came up with the following
points.
1. Above all, think in terms of monitor safety, especially in conflict-affected areas, making sure that
sensitive data, such as the locations of targeted hospitals are never placed online unless in a
highly secure, regimented way.
2. Prioritise gender and social inclusion when discussing the methodology (including sampling) with partners. It is challenging to reach vulnerable groups, but this has to be treated by all as a priority.
3. Ensuring that consent is gained and signed for, for any interview or action, is essential. If monitors use and carry devices, this can be done on screen or using the audio function.
4. Ensure that your TPM provider and your own team are aligned and robustly adopting best
practice with regards to conflict sensitivity. This should focus on monitors being alert to conflict
risks in the community, but also includes being alert to the conflict risks presented by the TPM
itself.
5. Ensuring that any questionnaires are as short as possible is key. It can be useful for the donor team to go through the questionnaire piloting process, putting themselves in the shoes of a beneficiary or community member.
6. Keeping sample sizes to a practical minimum is important because communities often feel
inundated by such interviews. Take advice from experts on sampling, but also use your own
instinct – statistical reliability is important, but so is the impact of your TPM on communities and
the time they dedicate to helping you.
7. Providing feedback on findings to communities where possible – this may or may not be wanted or sensible (depending on the topic), and where it is wanted, it may require a little creativity, such as finding a pre-existing meeting where the community comes together at which to share findings verbally. A brave donor may consider in particular sharing learnings and decisions around what it can do better in a given area. An abridged version of a report can also be considered.
8. Liaising with other donors to reduce the overall burden on communities is also worthy of
consideration. It is unlikely that cooperation of this kind will eliminate the need to go to a
particular community, but sharing data, and so knowing some general facts may allow you to
shorten questionnaire lengths, or to understand the attribution of your intervention.
9. Ensuring that GDPR practices are followed; put someone in charge – inside the TPM implementer
– of rigorously following GDPR.
10. Ensure that data is end-to-end encrypted wherever necessary. In some settings malicious actors may benefit from accessing data such as the location of projects. Most TPM providers are not using end-to-end encryption; while this is justifiable in certain settings, in others it can create unreasonable risk (a minimal illustrative sketch follows this list).
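As a minimal sketch of the kind of protection point 10 refers to, the example below encrypts a survey record on the monitor's device before it is stored or uploaded, using the Python 'cryptography' package's Fernet recipe. This is an assumption-laden illustration rather than a recommended product: a genuine end-to-end scheme also has to handle key distribution and the transport channel, and the identifiers shown are invented.

```python
# Minimal sketch (assumptions, not the handbook's method): client-side
# encryption of a survey record before upload, using symmetric Fernet keys.
# A full end-to-end design would also cover key exchange and transport security.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()    # in practice, provisioned securely, never hard-coded
cipher = Fernet(key)

record = {                     # entirely hypothetical survey record
    "site_id": "HOSP-014",
    "visit_date": "2020-06-03",
    "respondent_consented": True,
    "notes": "Cold chain intact; generator fuel running low.",
}

token = cipher.encrypt(json.dumps(record).encode("utf-8"))
# `token` is what gets stored or transmitted; location details and notes are
# unreadable without the key held by the TPM implementer.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == record
```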
6.6 Technology
Technology can be used in a range of ways to assist TPM.
Much of the cost and time involved in larger-scale TPM relates to the deployment of monitors to engage with beneficiaries in remote areas. A company called GeoPoll has established direct relations with telecoms providers, allowing the pre-targeting of community members, which may be useful for broad-scale surveys at a fraction of the cost and time required by conventional surveys, albeit with reduced reliability of sampling quality. Meanwhile, the establishment of call centres for phone surveys is now far cheaper than it was ten years ago, it was said.
Big data
'Big Data' may be defined as the secondary analysis of very large datasets. At present, its usage in the field of TPM appears negligible, held back by factors such as interoperability (Price, 2018). However,
the influence of Big Data on MEL and TPM may be significant. This ‘real-world’ data, for example
from mobile phone usage, can provide telling insights into the actual social outcomes of
interventions. For example, it is easy to imagine phone usage being taken as a proxy in a given
geographical area for the impact of an infrastructure project. For now, donors may instead want to
encourage LAs to make sure that they make reasonable effort to triangulate their findings with other
available data sources.
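Purely as an illustration of the proxy idea described above, the sketch below compares hypothetical, aggregated phone-activity figures for two areas before and after an infrastructure project. The data, area names and figures are all invented; such a comparison would be one triangulation signal among others, not evidence of impact on its own.

```python
# Minimal sketch with invented data: aggregated call volumes per area, before
# and after an intervention, used only to flag where qualitative follow-up
# might be worthwhile.
import pandas as pd

calls = pd.DataFrame({
    "area":  ["A", "A", "B", "B"],
    "phase": ["before", "after", "before", "after"],
    "daily_call_volume": [1200, 1900, 1150, 1180],   # hypothetical aggregates
})

pivot = calls.pivot(index="area", columns="phase", values="daily_call_volume")
pivot["pct_change"] = (pivot["after"] - pivot["before"]) / pivot["before"] * 100
print(pivot.round(1))
# Area A (project site) shows a ~58% rise; area B (comparison) is broadly flat.
# In practice this would prompt further monitoring, not a conclusion about impact.
```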
Aerial options
Tracking devices
For verification activities, in particular with regards to high-value assets, the attachment of Radio
Frequency Identification (RFID) or similar devices can be considered in place of the deployment of
monitors.
7 Analysis and reporting
Reporting should ideally have been tried and tested before and during Inception. Beyond the basic considerations of OECD-DAC criteria and Red Amber Green (RAG) ratings, it is key to think about the contextualising and sharing of findings, i.e. to consider the use of dashboards, and how potentially to collaborate and share with other donors.
7.1 Options for reporting
"Getting data is only half the battle; the challenge is getting actionable intelligence" – Implementer

While reporting will be very much situation-specific, respondents were able to express the following general advice:
Think carefully about the balance of frequency vs. depth; many reports are too thick and too old
Try to imagine or even draft the report as early in the process as possible – it can be highly illuminating to put oneself through the process and can greatly enhance the ToR
Speak to colleagues in other regions first to see if they have templates that you can use, or that help you to understand better what you really want the reports to say
Do include photos as these can drive credibility and interest, and make all the difference in
driving traction of the report with key stakeholders
Think about the comparability of your data – are there other data sets that you would want
to compare to? How about disaggregation of data? Is showing age and gender breakdown
enough, or can more be done?
Consider asking your agency for a ‘video-presentation’ so you can share the findings in a
more visceral way
Try to retain consistency in the report writer used, insisting on one person for regional work,
for example.
Respondents said that the key to using RAG-ratings is that:
Annex 3 includes a structure for a standard TPM report which blends RAG rating and OECD-DAC
criteria.
OECD-DAC CRITERIA
These six criteria have become a backbone of evaluation and are often used in TPM, despite it being a form of monitoring (rather than evaluation), because they nonetheless provide a straightforward and meaningful perspective on what needs to change.
While these six are an established and widely-used set of lenses on performance, donors typically
feel free to adapt these also to the circumstances. For example, value for money may be included.
Or, depending on the project, it may be considered useful to include as standard ‘cross-cutting’
elements such as gender and social inclusion or conflict sensitivity.
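As a small illustration of how RAG ratings and the OECD-DAC criteria can be blended in reporting, the sketch below converts per-criterion monitoring scores into Red/Amber/Green ratings. The scoring scale, thresholds and example values are assumptions made for illustration; actual definitions should be agreed and documented with the donor during inception.

```python
# Minimal sketch (thresholds and scores are invented): a RAG rating per
# OECD-DAC criterion for one monitored site.
CRITERIA = ["relevance", "coherence", "effectiveness",
            "efficiency", "impact", "sustainability"]

def rag(score: float) -> str:
    """Convert a 0-10 monitoring score into a Red/Amber/Green rating."""
    if score >= 7:
        return "Green"
    if score >= 4:
        return "Amber"
    return "Red"

# Hypothetical scores from a single site visit.
site_scores = {"relevance": 8, "coherence": 7, "effectiveness": 5,
               "efficiency": 6, "impact": 4, "sustainability": 3}

for criterion in CRITERIA:
    print(f"{criterion:15s} {rag(site_scores[criterion])}")
```

Whatever scale is chosen, the key point made by respondents is that the definitions behind each colour must be explicit and applied consistently across visits.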
Figure 14: Reporting dashboard example
1. Awareness only – In an effort to drive some efficiencies, donors share, in a systematic way, whom they plan to monitor, when and where. No explicit intention to share data is presumed.
ANNEXES
ANNEX 1 – Bibliography
Chaudhri, S., Cordes, K., & Miller, N. (2017). Humanitarian Programming and Monitoring in
Inaccessible Conflict Settings: A Literature Review. Health Cluster. World Health Organisation.
https://www.researchgate.net/profile/Nathan_Miller5/publication/316693277_Humanitarian_programming_and_monitoring_in_inaccessible_conflict_settings_A_literature_review/links/590ceccfaca2722d185c150b/Humanitarian-programming-and-monitoring-in-inaccessible-conflict-settings-A-literature-review.pdf
Corlazzoli, V. (2014). ICTs for Monitoring and Evaluation of Peacebuilding Programmes. DFID:
Department for International Development. https://www.sfcg.org/wp-
content/uploads/2014/05/CCVRI-SSP-_ICT-and-ME-_Final.pdf
DCAF (2015) ‘Armed Non-State Actors: Current Trends & Future Challenges DCAF & Geneva Call’;
available online at: https://www.dcaf.ch/sites/default/files/publications/documents/ANSA_Final.pdf
Dette, R., Steets, J. & Sagmeister, E. (2016) Technologies for Monitoring in Insecure Environments.
Secure Access in Volatile Environments (SAVE) Toolkit. Global Public Policy Institute.
http://www.gppi.net/fileadmin/user_upload/media/pub/2016/SAVE__2016__Toolkit_on_Technologies_for_Monitoring_in_Insecure_Environments.pdf
Herbert, S. (2013). Remote management of projects in fragile states (GSDRC Helpdesk Research
Report 908) Birmingham, UK: GSDRC, University of Birmingham.
https://assets.publishing.service.gov.uk/media/57a089ffed915d3cfd00052a/hdq908.pdf
Kelly, L & Gaarder, M (2017) World Bank. https://ieg.worldbankgroup.org/blog/third-party-
monitoring-volatile-environments
Price, R (2018) ‘Approaches to Remote Monitoring in Fragile States’; available online at
https://gsdrc.org/wp-content/uploads/2018/01/1420-Remote-monitoring-in-fragile-states.pdf
Rivas, A., Guillemois, D., Rzeszut, K., and Lineker, B. (2015). Cross Cutting Evaluation of DFID’s
Approach to Remote Management in Somalia and North-East Kenya, Evaluation Report, DFID:
Department for International Development, London: Integrity Research and Consultancy.
https://www.gov.uk/government/publications/cross-cutting-evaluation-of-dfids-approach-to-
remote-management-in-somalia-and-north-east-kenya
Sagmeister, E. & Steets, J. with Derzsi-Horváth, A., & Hennion, C. (2016). The use of third-party
monitoring in insecure contexts: Lessons from Afghanistan, Somalia and Syria. Resource Paper from
the Secure Access in Volatile Environments (SAVE) research programme.
http://www.gppi.net/fileadmin/user_upload/media/pub/2016/SAVE__2016__The_use_of_thirdparty_monitoring_in_insecure_contexts.pdf
Taptue, A.M. & Hoogeveen, J. (2017, November 2). Project monitoring in fragile places does not
have to be expensive. World Bank blog post. https://blogs.worldbank.org/nasikiliza/project-
monitoring-in-fragile-places-does-not-have-to-be-expensive
United Nations, Third Party And Collaborative Monitoring (2015),
https://www.alnap.org/system/files/content/resource/files/main/third-party-and-collaborative-
monitoring-pv1.pdf
van Beijnum, M, van den Berg, W and van Veen, E (2018). Between a rock and a hard place;
Monitoring aid implementation in situations of conflict. Clingendael Institute.
https://www.clingendael.org/pub/2018/between-a-rock-and-a-hard-place/
ANNEX 2 – List of contributors
Name Organisation
Bruno Kessler Altai
Dhanya Williams Altai
Eric Davin Altai
Jeremie Toubkiss Altai
Justine Rubira Altai
Kamran Parwana Altai
Giorgio Saad Aktek
Dominic d'Angelo BDO UK
André Kahlmeyer CNC
Ribotaan Roy Coffey
Clare Winton DFID
Cyril Perus ECHO
Pedro Luis Rojo Garcia ECHO
Olivier Rousselle ECHO
Justin Ormand Ecorys
Cecile Delhez EU DEVCO
Milena Isakovic Suni EU DEVCO
Marcia Kammitsi EU FPI
Lea Tries EU FPI
David Bouanchaud EU FPI
Aminata Mar Thiem EU FPI
Marie-Luise Schwarzenberg EU FPI
Cedric Pierard EU FPI
Janine Abou Azzam EU FPI
Helga Pender EU FPI
Andy McLean First Call Partners
Michael Shaw Independent Consultant
An Hutton Independent Consultant
Kathryn Rzeszut Integrity International
Tom Gillhespy ITAD
Lameck Odallo Kimetrica
Philibert de Mercy Masae
Gilles Morain Masae
Gretchen Severson Burnham Global
Johnny Heald ORB
Sonya Schmidt Palladium
Victor Henriette Particip
Cecile Collin Particip
Marwa Bouka RMTeam
Bassam Al-Kuwatli RMTeam
Dr Althea-Maria Rivas University of Sussex
Travis Mayo USAID's Bureau for Policy, Planning and Learning
ANNEX 3 – Exemplar report structure
The structure below is one that is suitable for a TPM report describing the findings from a visit to an
implementation location in order to assess performance.
1 Context
Purpose: To provide the reader with the background they need to make best use of the report.
Content: This section should provide the necessary background on: (i) the rationale for the TPM itself, (ii) the environment in and around the location (maps may be useful here), including the security situation and role of any key actors, (iii) the aims of the IP's work, (iv) any challenges that the IP has experienced, (v) a summary of any existing MEL data, and (vi) the objectives of the visit, assuming already agreed.

3 Main findings
Purpose: To provide the detail on what happened during the visit.
Content: Typically this will follow the OECD-DAC structure (relevance, coherence, effectiveness, efficiency, impact and sustainability), adapted as the donor sees fit.

4 Annexes
Purpose: To supply any other pertinent additional information.
Content: This should include the tool/s (discussion guide, questionnaire etc.) used and any other materials that may be pertinent, such as (i) relevant news article/s, (ii) names of interviewees, if consent was given, and (iii) photographic evidence supporting contentions in the report.