Results-Based Management (RBM) Guiding Principles
INDEX
1. PREFACE
2. BRIEF HISTORICAL BACKGROUND
3. WHAT IS RBM?
4. WHAT IS A RESULT?
5. HOW TO FORMULATE AN EXPECTED RESULT?
6. WHAT IS THE RELATIONSHIP BETWEEN INTERVENTIONS, OUTPUTS AND RESULTS?
7. MONITORING OF IMPLEMENTATION
8. ANNEXES
This document is based on the UNESCO Results-Based Programming, Management and Monitoring (RBM) Guiding Principles, UNESCO Paris, Bureau of Strategic Planning, January 2008, and translated into Russian by the UNESCO Cluster Office in Almaty.
1. PREFACE
It is said that if you do not know where you are going, any road will take you there. This lack of
direction is what results-based management (RBM) is supposed to avoid. It is about choosing a
direction and destination first, deciding on the route and intermediary stops required to get there,
checking progress against a map and making course adjustments as required in order to realise the
desired objectives.
For many years, the community of international organizations has been working to deliver services and activities and to achieve results in the most effective way. Traditionally, the emphasis was on managing inputs and activities, and it has not always been possible to demonstrate results in a credible way and to the full satisfaction of taxpayers, donors and other stakeholders. Their concerns are straightforward and legitimate: they want to know what use their resources are being put to and what difference these resources are making to the lives of people. Along these lines, RBM was specifically highlighted in the 2005 Paris Declaration on Aid Effectiveness as part of the efforts to work together in a participatory approach to strengthen country capacities and to promote accountability of all major stakeholders in the pursuit of results.
It is often argued that complex processes such as development are about social transformation: processes that are inherently uncertain, difficult and not totally controllable, and for which, therefore, no one can be held fully responsible. Nonetheless, these difficult questions require appropriate responses from the professional community and, in particular, from multilateral organizations, which must be able to report properly to stakeholders, learn from experience, identify good practices and understand where the areas for improvement lie.
The RBM system aims at responding to these concerns by setting out clear expected results for programme activities, by establishing performance indicators to monitor and assess progress towards achieving those results, and by enhancing the accountability of the organization as a whole and of the persons in charge. It helps to answer the “so what” question, recognizing that successful implementation of programmes is not necessarily equivalent to actual improvements in the development situation.
This paper is intended to assist in understanding and using the basic concepts and principles of
results-based management.
General information on the RBM concept in this document is based on materials of several UN agencies. Bearing in mind that different actors use different RBM terminology, reflecting each one's specific context, it is important to ensure that the definitions are clear to all involved in results-based management within an organization. In spite of the differences in terminology, the chain itself proceeds from activities to results and impacts.
2. BRIEF HISTORICAL BACKGROUND
As such, the concept of RBM is not really new: its origins date back to the 1950s. In his book “The Practice of Management”, Peter Drucker introduced for the first time the concept of “Management by Objectives” (MBO). As we will see further on, its principles are very much in line with the RBM approach.
MBO was first adopted by the private sector and then evolved into the Logical Framework
(Logframe) for the public sector. Originally developed by the United States Department of Defense,
and adopted by the United States Agency for International Development (USAID) in the late 1960s,
the logframe is an analytical tool used to plan, monitor, and evaluate projects. It derives its name
from the logical linkages set out by the planners to connect a project’s means with its ends.
During the 1990s, the public sector was undergoing extensive reforms in response to economic, social and political pressures. Public deficits, structural problems, growing competitiveness and globalization, shrinking public confidence in government, and growing demands for better and more responsive services as well as for more accountability were all contributing factors. In the process, the logical framework approach was gradually introduced into the public sector in many countries (mainly member States of the Organisation for Economic Co-operation and Development (OECD)). During the same decade it evolved into RBM as an aspect of the New Public Management, a label used to describe a management culture that emphasizes the centrality of the citizen or customer as well as the need for accountability for results.
This was followed by the establishment of RBM in international organizations. Most of the United
Nations system organizations were facing similar challenges and pressures from Member States to
reform their management systems and to become more effective, transparent, accountable and
results-oriented. A changeover to a results-based culture is however a lengthy and difficult process
that calls for the introduction of new attitudes and practices as well as for sustainable capacity-
building of staff.
3. WHAT IS RBM?
Results-based management (RBM) can mean different things to different people and organizations. A simple explanation is that RBM is a broad management strategy aimed at changing the way institutions operate, by improving performance, programmatic focus and delivery. It reflects the way an organization applies processes and resources to interventions targeted at commonly agreed results.
To maximize relevance, the RBM approach must be applied, without exception, to all organizational units and programmes. Each is expected to define the anticipated results of its own work, which in aggregate contribute to the achievement of the overall or high-level expected outcomes of the organization as a whole, irrespective of the scale, volume or complexity involved.
RBM seeks to overcome what is commonly called the “activity trap”, i.e. getting so involved in the
nitty-gritty of day-to-day activities that the ultimate purpose or objectives are being forgotten. This
problem is pervasive in many organizations: project/programme managers frequently describe the
expected results of their project/programme as “We provide policy advice to partners”, “We train
journalists for the promotion of freedom of expression”, “We do research in the field of fresh water
management”, etc., focusing more on the type of activities undertaken rather than on the ultimate
changes that these activities are supposed to induce, e.g. in relation to a certain group of
beneficiaries.
An emphasis on results requires more than the adoption of new administrative and operational systems; above all, it needs a performance-oriented management culture that supports and encourages the use of new management approaches. While, from an institutional point of view, the primary purpose of the RBM approach is to generate and use performance information for accountability reporting to external stakeholders and for decision-making, the first beneficiaries are the managers themselves. They will have much more control over the activities they are responsible for, be in a better position to take well-informed decisions, and be able to learn from their successes and failures and to share this experience with their colleagues and all other stakeholders.
The formulation of expected results is part of an iterative process along with the definition of a
strategy for a particular challenge or task. The two concepts – strategy and expected results - are
closely linked, and both have to be adjusted throughout a programming process so as to obtain the
best possible solution.
In general, organizational RBM practices can be cast in twelve processes or phases, of which the
first seven relate to results-oriented planning.
1) Analyzing the problems to be addressed and determining their causes and effects;
4) Identifying performance indicators for each expected result, specifying exactly what is to
be measured along a scale or dimension;
5) Setting targets and benchmarks for each indicator, specifying the expected or planned
levels of result to be achieved by specific dates;
6) Developing a strategy by providing the conceptual framework for how expected results
shall be realized, identifying main modalities of action reflective of constraints and
opportunities and related implementation schedule;
7) Balancing expected results and the strategy foreseen with the resources available;
9) Reporting and self-evaluating: comparing actual results with the targets and reporting on results achieved, the resources involved and any discrepancies between the “expected” and the “achieved” results;
10) Integrating lessons learned and findings of self-evaluations: interpreting the information coming from the monitoring systems and finding possible explanations for any discrepancies between the “expected” and the “achieved”;
11) Disseminating and discussing results and lessons learned in a transparent and iterative
way.
12) Using performance information coming from performance monitoring and evaluation
sources for internal management learning and decision-making as well as for external
reporting to stakeholders on results achieved.
4. WHAT IS A RESULT?
A result is the “raison d’être” of an intervention. A result can be defined as a describable and
measurable change in state due to a cause and effect relationship induced by that intervention.
Expected results are answers to problems identified and focus on changes that an intervention is
expected to bring about. A result is achieved when the outputs produced further the purpose of the
intervention.
It often relates to the use of outputs by intended beneficiaries and is therefore usually not under the full control of the implementation team.
Formulating expected results from the beneficiaries’ perspective will facilitate focusing on the
changes expected rather than on what is planned to be done or the outputs to be produced. This is
particularly important at the country level, where UNESCO seeks to respond to the national
development priorities of a country. Participation is key for improving the quality, effectiveness and
sustainability of interventions. When defining an intervention and related expected results one
should therefore ask:
Who participated in the definition of the expected results?
Were key project stakeholders and beneficiaries involved in defining the scope of the project
and key intervention strategies?
Is there ownership and commitment from project stakeholders to work together to achieve
identified expected results?
5. HOW TO FORMULATE AN EXPECTED RESULT?
Use “change” language instead of “action” language
The expected result statement should express a concrete, visible, measurable change in state or a
situation. It should focus on what is to be different rather than what is to be done and should express
it as concretely as possible. Completed activities are not results, results are the actual benefits or
effects of completed activities.
“Action” language can often be interpreted in many ways, whereas “change” language sets precise criteria for success.
Although the nature, scope and form of expected results differ considerably, an expected result
should meet the following criteria (be “SMART”):
Specific: It has to be exact, distinct and clearly stated. Vague language or generalities are not
results. It should identify the nature of expected changes, the target, the region, etc. It should
be as detailed as possible without being wordy.
Measurable: It has to be measurable in some way, involving qualitative and/or quantitative
characteristics.
Achievable: It has to be achievable with the human and financial resources available
(‘realistic’).
Relevant: It has to respond to specific and recognized needs or challenges and to be within
mandate.
Time-bound: It has to be achieved in a stated time-frame or planning period.
Once a draft expected result statement has been formulated, it is useful to test its formulation against the SMART criteria. This process enhances the understanding of what is being pursued and helps refine an expected result in terms of its achievability and meaningfulness.
Example: if we consider a work plan to be undertaken in a specific country that includes the
expected results statement “Quality of primary education improved”, the application of the SMART
questioning could be as follows:
1. Is it “Specific”?
What does “quality” actually mean in this context? What does an “improvement” of quality in
primary education amount to concretely? Who are the relevant stakeholders involved? Are we
working on a global level, or are we focusing on a particular region or country?
In responding to the need of being specific, a possible expected result formulation could finally
be:
“Competent authorities in Country X adopted the new education plan reviewed on the basis of
international best practices and teachers and school personnel implement it.”
2. Is it “Measurable”?
Can I find manageable performance indicators that can tell about the level of achievement?
Possible Performance Indicators could be:
- % of teachers following the curriculum developed on the basis of the new education plan
(baseline 0%, benchmark 60%)
- % of schools using quality teaching material (baseline 10%, benchmark 90%)
3. Is it “Achievable”?
Do I have enough resources available to attain the expected result? I need to consider both
financial and human resources. If the answer is negative, I have to either reconsider and adjust
the scope of the project or mobilize additional resources.
4. Is it “Relevant”?
Is the expected result coherent with the upstream programming elements it is based on, and does it fall within the organization's mandate and domains?
If the answer is negative, I should drop the activity.
5. Is it “Time-bound”?
The expected result should be achievable within the timeframe of the relevant programming period.
Improving the results formulation: the SMART process
Competent authorities in Country X adopted the new education plan reviewed on the basis of international
best practices and teachers and school personnel implement it
Performance indicators:
– % of teachers following the curriculum developed on the basis of the new education plan (baseline 0%,
benchmark 60%)
– % of schools using quality teaching material (baseline 10%, benchmark 90%)
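The baseline/benchmark pairs above lend themselves to a simple progress calculation. The following sketch is illustrative only: the function and figures are not part of the RBM guidelines, and it merely formalises the idea that progress is the share of the planned change (baseline to benchmark) achieved so far.

```python
def progress_toward_benchmark(baseline: float, benchmark: float, current: float) -> float:
    """Return the share of the planned change (baseline -> benchmark) achieved so far."""
    planned_change = benchmark - baseline
    if planned_change == 0:
        raise ValueError("benchmark must differ from baseline")
    return (current - baseline) / planned_change

# Indicator: % of teachers following the new curriculum (baseline 0%, benchmark 60%).
# Suppose a monitoring visit finds 45% of teachers following it:
share = progress_toward_benchmark(baseline=0, benchmark=60, current=45)
print(f"{share:.0%} of the planned change achieved")  # 75%
```

The same calculation applies to the second indicator: a school-materials reading of 50% against a baseline of 10% and a benchmark of 90% would correspond to half of the planned change.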
Once an intervention is formulated, it can be useful to check and improve its design against yet another concept, namely the balance between three variables: Results (a describable and measurable change in state derived from a cause-and-effect relationship), Reach (the breadth and depth of influence over which the intervention aims to spread its resources) and Resources (the human, organisational, intellectual and physical/material inputs that are directly or indirectly invested in the intervention).
Unrealistic project plans often suffer from a mismatch among these three key variables. It is
generally useful to check the design of a project by verifying the three Rs by moving back and forth
along the project structure and by ensuring that the logical links between the resources, results and
the reach are respected.
It is rather difficult to construct a results-based design in one sitting. Designs usually come together
progressively and assumptions and risks have to be checked carefully and constantly along the way.
6. WHAT IS THE RELATIONSHIP BETWEEN INTERVENTIONS,
OUTPUTS AND RESULTS?
Interventions, outputs and results are often confused. Interventions describe what we do in order to
produce the changes expected. The completion of interventions leads to the production of outputs.
Results are finally the effects of outputs on a group of beneficiaries. For example, the
implementation of training workshops (activity) will lead to trainees with new skills or abilities
(outputs). The expected result identifies the behavioral change among the people that were trained
leading to an improvement in the performance of, say, an institution the trainees are working in,
which is the ultimate purpose of the activity.
If we move our focus from what we do to what we want the beneficiaries to do after they have been
reached by our intervention, we may realize that additional types of activities could be necessary to
make sure we will be able to achieve the expected results.
The following examples may help in understanding the relationship between interventions, outputs and results, but they should not be seen as a universally applicable template, as every intervention is different.
UNESCO Prize for Tolerance

Interventions/Activities:
• Selection and information of the jury.
• Preparation of brochures and information material.
• Development and organization of an information campaign.
• Advertising the prize.
• Development of partnerships for the identification of a list of candidates.
• Organization of the award ceremony.
• Organization of press conferences.
• Follow-up and assessment of media coverage.

Outputs:
• Jury nominated and agreeable to main stakeholders.
• Brochures, leaflets and videos produced and disseminated.
• Information campaign implemented.
• List of candidates completed and agreeable to main stakeholders.
• Prize-winner nominated.
• Press conference organized and attended by identified journalists.
• Media coverage of the event.

Expected result:
• Concept of tolerance spread among the general public in a country/region/globally.
Increasing access to quality basic education for children through community learning centres

Interventions/Activities:
• Discussions with local authorities.
• Assessing the feasibility of a community-based learning centre in community X.
• Preliminary discussions with local stakeholders.
• Sensitization seminars for local leaders and community members.
• Curriculum development for community-based learning centres.
• Selection of training personnel among the local community.
• Adapting training material.
• Training of personnel.
• Production of information material for local authorities.
• Meetings with local authorities for the replication of such centres.
• Provision of technical assistance to local authorities for replicating such centres in other communities.

Outputs:
• Agreement in principle by local authorities.
• Feasibility study completed and gender analysis produced and disseminated.
• Agreement in principle by community leaders.
• Community centre proposal completed and submitted to local authorities and community leaders.
• Personnel selected.
• Curriculum and training material for community-based learning centres developed.
• Managers and teachers have the necessary skills to carry out their functions.
• Brochures and videos developed and disseminated.
• Local leaders and community members informed, sensitized and convinced.

Expected results:
• The centre is operational and is an integral part of community life.
• Steps are taken by local authorities to replicate this initiative in other communities.
- The nature of expected results: obviously, the nature, magnitude and meaning of “expected results” cannot be the same across the different levels. Nevertheless, it is crucial that all these results build a chain of meaningful achievements, bridging the gap between the mandate and strategic objectives of the organisation and what it actually achieves in its daily operations.
- Reconciling global and local dimensions: RBM stresses results and greater focus; this should be done without sacrificing the organisation's global mandate and its commitment to decentralisation and responsiveness to country needs and priorities: a good balance has to be found between global and field-oriented approaches.
7. MONITORING OF IMPLEMENTATION
Monitoring can be described as “a continuing function that uses systematic collection of data on
specified indicators to provide management and the main stakeholders of an ongoing (…)
intervention with indications of the extent of progress and achievement of objectives and progress in
the use of allocated funds” (Source: OECD RBM Glossary).
The function of a monitoring system is to compare “the planned” with “the actual”. A complete monitoring system needs to provide information about the use of resources, the activities implemented, the outputs produced and the results achieved. What we are focusing on here is a results-based monitoring system: at the planning stage, the officer in charge has to translate the objectives of the intervention into expected results and related performance indicators, and to set a baseline and targets for each of them. During implementation, he or she needs to routinely collect data on these indicators, compare actual levels of the performance indicators with the targets, report progress and take corrective measures whenever required.
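The planned-versus-actual comparison described above can be sketched in a few lines. This is an illustrative sketch only: the indicator names, target figures and the 90% tolerance threshold are assumptions introduced here, not part of the RBM guidelines.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    target: float  # planned level for the reporting date
    actual: float  # level observed through routine data collection

def monitoring_report(indicators: list[Indicator], tolerance: float = 0.9) -> list[str]:
    """Compare 'the planned' with 'the actual' and flag indicators that may need corrective measures."""
    report = []
    for ind in indicators:
        status = "on track" if ind.actual >= ind.target * tolerance else "corrective action needed"
        report.append(f"{ind.name}: actual {ind.actual} vs target {ind.target} -> {status}")
    return report

report = monitoring_report([
    Indicator("% of teachers following the new curriculum", target=60, actual=45),
    Indicator("% of schools using quality teaching material", target=90, actual=88),
])
for line in report:
    print(line)
```

The point of the sketch is only that monitoring is a routine comparison, not an evaluation: it signals where corrective action may be needed, leaving the explanation of discrepancies to reporting and self-evaluation.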
As a general rule, no extra resources (neither human nor financial) will be allocated for monitoring tasks; the responsible person therefore has to ensure that these tasks can be undertaken within the budget foreseen (as a rule of thumb, about 5% of the resources should be set aside for this purpose).
Performance indicators are what allow us to track progress and assess the effectiveness of our intervention, i.e. whether it was capable of producing the intended results. Indicators support effectiveness throughout the processes of planning, implementation, monitoring, reporting and evaluation.
Indicators may be used at any point along the chain of inputs, interventions, outputs and results. A results-based monitoring system, however, does not focus on compliance with the rate of expenditure or the implementation plan (answering the question “have we done it?”), but on the actual benefits that our interventions were able to bring to the targeted populations (answering the question “we have done it, so what?”). Performance indicators are aimed at giving indications of change caused or induced by the intervention. This core purpose does not require sophisticated statistical tools, but reliable signals that tell, directly or indirectly, about the real facts on which one undertakes to have leverage. A fair balance is to be sought between the cost - both in terms of time and money - of collecting the information required and its capacity to reflect the desired changes. Even a carefully selected, clearly defined indicator is of little use unless it is actually put to use. A critical test of an indicator is how practical it is to monitor: thinking about an indicator is one thing, actually finding, recording and presenting the data is another. Indicators need to be approached as a practical tool, not merely as a conceptual exercise.
Performance indicators are signposts of change. They enable us to verify the changes the
interventions we are dealing with seek to achieve. The purpose of indicators is to support effective
programme planning, management and reporting. Indicators not only make it possible to
demonstrate results, but they can also help produce results by providing a reference point for
monitoring, decision-making, stakeholder consultations and evaluation.
We should bear in mind, however, that indicators are only intended to indicate, and not to provide
scientific “proof” or detailed explanations about change. In addition, we should avoid the temptation
to transform the measurement of change into a major exercise with a burdensome workload.
Measuring change should not take precedence over programme activities that generate the changes
to be measured.
The critical issue in selecting good indicators is credibility, not the number of indicators, nor the
volume of data or precision in measurement. The challenge is to meaningfully capture key changes
by combining what is substantively relevant with what is practically feasible to monitor.
At the end of the day, it is better to have indicators that provide approximate answers to some
important questions than to have exact answers to many unimportant questions.
Selecting substantively valid and practically feasible performance indicators presupposes an in-depth understanding of the situation and of the mechanisms underlying change. Therefore, the use of pre-designed or standardised performance indicators is not recommended, as they often do not address the specificities of the situation in which the intervention is carried out. Performance indicators have to be designed on the basis of the ambition of an intervention, its scope and the environment in which it is implemented.
Failing to design good indicators often means that the results are not clearly defined or that they are
too wide-ranging. The process of selecting indicators can help identify the core issues of the
intervention and translate often intangible concepts into more concrete and observable elements.
A result and its indicator should not be mixed up. The result is the achievement. Indicators should
tell about the achievement.
The term “capacity” in this framework refers to the abilities, skills, understandings, attitudes, values,
relationships, knowledge, conditions and behaviours that enable organizations, groups and
individuals in a society to generate benefits and achieve their objectives over time. Capacity also
reflects the abilities of these actors to meet the needs and demands of the stakeholders for whom
they were established or to whom they are accountable. These attributes cover formal, technical and
organizational abilities and structures and also the more human, personal characteristics that allow
people to make progress.
Indicators may express qualitative and/or quantitative information. Quantitative indicators are numerical; qualitative indicators use categories of classification, based on individual perceptions.
The concept of quantitative versus qualitative indicators has been a subject of frequent discussion
over the past few years. The common belief is that quantitative indicators are measurements that
stick to cold and hard facts and rigid numbers and there is no question about their validity, truth and
objectivity while qualitative indicators are seen as subjective, unreliable and difficult to verify. No
one type of indicator or observation is inherently better than another; its suitability depends on how
it relates to the result it intends to describe. There should be a shift away from the assumption that indicators must be quantitative rather than qualitative; the point is to select the type of indicator that is most appropriate for the result being measured. If a qualitative indicator is determined to be most appropriate, one should clearly define each term used in the measure, document all definitions, and find possible ways (such as rating scales) to minimize subjectivity.
For example, if the result under consideration is in the area of improving the functioning of the
government, in particular concerning its preparation to respond to local needs, we could measure the
degree of results achievement through indicators measuring the change in levels of end-user
approval (or client satisfaction).
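Rating scales of the kind mentioned above make qualitative perceptions reportable as numbers. The sketch below is a hypothetical illustration: the five categories and the “satisfied or better” cut-off are assumptions made here for the example, not prescribed by the RBM guidelines.

```python
# A hypothetical five-point rating scale for end-user survey responses.
SCALE = {
    "very dissatisfied": 1,
    "dissatisfied": 2,
    "neutral": 3,
    "satisfied": 4,
    "very satisfied": 5,
}

def percent_satisfied(responses: list[str]) -> float:
    """Share of respondents rating the service 'satisfied' or better (score >= 4)."""
    scores = [SCALE[r] for r in responses]
    return sum(1 for s in scores if s >= 4) / len(scores)

survey = ["satisfied", "neutral", "very satisfied", "dissatisfied"]
print(f"{percent_satisfied(survey):.0%} of end-users satisfied")  # 50%
```

Documenting the scale and the cut-off alongside the indicator is what keeps such a measure verifiable and comparable across services, locations and reporting periods.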
A possible indicator could therefore be the level of end-user satisfaction with specific services, measured for instance through periodic client surveys.
Qualitative indicators are particularly helpful, for example, when the actions involve capacity development for service delivery. The perceptions of end-users regarding service delivery get straight to the issue of whether the services are wanted, useful and effectively delivered. The satisfaction of end-users (or clients) has the advantage of some comparability: results may be compared and data disaggregated by kind of service, location, time, etc.
This approach is not without its problems, however. The only way of getting this information may be through a survey that may prove too costly, clients may not always be easy to identify, and their perceptions of satisfaction with services are subject to influences other than the service itself.
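As a hedged illustration of the rating-scale approach mentioned above (not part of the original guidelines), a few lines of Python can turn qualitative satisfaction ratings into a trackable percentage; the five-point scale labels and the "satisfied or better" cutoff are assumptions for the example:

```python
# Sketch: converting a qualitative rating scale into a satisfaction indicator.
# The scale and the cutoff are illustrative assumptions, not UNESCO definitions.
SCALE = {"very dissatisfied": 1, "dissatisfied": 2, "neutral": 3,
         "satisfied": 4, "very satisfied": 5}

def satisfaction_indicator(responses):
    """Percentage of end-users rating the service 'satisfied' or better."""
    scores = [SCALE[r] for r in responses]
    satisfied = sum(1 for s in scores if s >= 4)
    return round(100 * satisfied / len(scores), 1)

responses = ["satisfied", "neutral", "very satisfied", "dissatisfied", "satisfied"]
print(satisfaction_indicator(responses))  # 60.0
```

Documenting the scale alongside the result, as the text recommends, is what makes such a figure comparable across services, locations and time periods.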
Several types of performance indicators can be used to assess progress towards the achievement of results:
a) Direct Statistical Indicators
Direct statistical indicators show progress when results are cast in terms of readily quantifiable short-term changes. For example, if the result is "Nominations of cultural and natural properties from regions or categories of heritage currently under-represented or non-represented on the World Heritage List increased", it should not be difficult to secure direct quantifiable data about the number of new nominations over the time period of a biennium (or less). Care must be taken to ensure that the time span of the result lends itself to the collection of such data for use in review.
b) Proxy Indicators
Proxy indicators are normally quantitative, but do not directly relate to the result. A proxy is used to show progress when getting the full data is too time-consuming, or when the timeliness of complete data would fall outside the need for review. However, there must be a prima facie connection between the proxy and the result. For example, if the result is "Public recognition improved of the importance of the mathematical, physical, and chemical sciences for life and societal development", a good proxy indicator might be improved media coverage of these issues.
c) Narrative Indicators
When the results are not easily quantifiable (changing attitudes, building capacities, etc.) over the
time period of the biennium, and the number of recipients is not too big, a non-statistical approach
can be envisaged to develop an indication of “progress”. Narrative indicators largely focus on the
“process of change”.
This technique works especially well in instances where capacity building, training, conferences,
network development and workshops are the planned interventions. However, when dealing with
stakeholders, care needs to be taken to avoid a focus simply on “satisfaction”. Rather, the focus
should be on what happened (or at least on what the recipients have planned to do) as a result of the
intervention/participation. For example, if the expected result is "National capacities in educational planning and management strengthened", then a valid narrative indicator might be based on a follow-up questionnaire circulated among those who participated in training, conferences or other activities, asking what they did (or what they plan to do) in their countries.
There are a number of risks in defining and using performance indicators.
In the following table, possible performance indicators are added to the previously introduced examples, completing the RBM planning and monitoring cycle.
Intervention: UNESCO Prize for Tolerance
Outputs:
• Jury nominated and agreeable to main stakeholders.
• Brochures, leaflets and videos produced and disseminated.
• Information campaign implemented.
• List of candidates completed and agreeable to main stakeholders.
• Prize-winner nominated.
• Press conference organized and attended by identified journalists.
• Media coverage of the event.
Expected result: Concept of tolerance spread among the general public in a country/region/globally.
Performance indicator: Media coverage of the prize.
8. ANNEXES
RBM GLOSSARY
DEFINITIONS
SELF-EVALUATION: An evaluation by those who are entrusted with the design and/or planning and/or delivery of a programme, project or activity.
FUNCTIONS: The range of functions that the organization performs.
REACH: The people, groups or organizations who will benefit directly or indirectly from, or who will be affected by, the results of the intervention.
TARGETS: Quantitative and qualitative levels of performance indicators that an intervention is meant to achieve at a given point in time (e.g. at the end of the biennium).
RBM MODULES
Module 1: Background and RBM Fundamentals
What is RBM?
a) A Management Tool
b) A frame of mind
c) RBM = Really Boring Monologue
d) A waste of time
(Diagram: the RBM cycle — Implementation Monitoring (activities undertaken, outputs delivered, indicator data collected and analyzed) → Reporting and Review → Continuous Learning.)
Presentation is based on UNIFEM materials
MfDR in the Management Cycle
(Diagram: Analysis & Needs Assessment → Strategic Planning → Operational Planning → Implementation Monitoring → Reporting and Review → Continuous Learning.)
Learning in MfDR: Using Performance Information
• Often the last consideration, but should be the first
• The need for this information should drive the MfDR process:
– Management
– Decision-making
– Resource Allocation
– Learning
– Reporting and Accountability
Module 3: Results Definition and the Results Chain
Module Objectives
• To understand the different levels of results and the linkages between them
• To understand and practice how to create results chains
• To learn how to represent complex results chains in a logic model or results tree
The Results Chain
• A series of expected achievements, "linked" by causality
• A continuum from inputs/resources to final impact, divided into segments/links: Inputs → Activities → Outputs → Outcomes → Impacts
• Expressed horizontally or vertically
• Each link in the chain is characterized by:
– Increased importance of achievement with respect to the program goal
– Decreased control, accountability, and attribution
Example: Training in RBM → Use of RBM by trainees → More successful programming
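The training example above can be expressed as a minimal results-chain structure. This Python sketch is an illustration only (not UNIFEM code); the assignment of the three statements to chain levels follows the example as shown:

```python
# Sketch: the RBM results chain as an ordered list of levels; control and
# attribution decrease as we move from inputs toward impacts.
CHAIN_LEVELS = ["Inputs", "Activities", "Outputs", "Outcomes", "Impacts"]

# The training example, keyed by chain level (illustrative mapping).
example = {
    "Activities": "Training in RBM",
    "Outcomes": "Use of RBM by trainees",
    "Impacts": "More successful programming",
}

def chain_summary(entries):
    """Join the filled-in links of the chain in causal order."""
    return " -> ".join(entries[lvl] for lvl in CHAIN_LEVELS if lvl in entries)

print(chain_summary(example))
# Training in RBM -> Use of RBM by trainees -> More successful programming
```

Ordering the links explicitly mirrors the slide's point that causality runs one way along the chain.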
Capacity Building with a Single Organization
• Outputs: Knowledge, Skills, Systems, Processes, Policies
• Outcomes: Organizational capacity improved; performance improved
• Impacts: Results of improved performance
Example: Enhanced ability to manage program → Improved management of program → Improved program results for women
Wording should show change:
• Improved
• Increased (e.g. "Increased technical capacity of …")
• Enhanced
• Greater
• Higher
• Diminished
• Presence / absence
• Increased/improved ________ in program
Should not use "through", "for (in order to)" or "by (how)", i.e. no "causality" in the statement.
WHAT IS AN INDICATOR?
An instrument to measure evidence of progress towards a result, or that a result has been achieved:
• Establishes the level of performance necessary to achieve results
• Specifies the elements necessary to establish whether expected results were achieved
Indicators
• Performance against results is measured through the use of indicators
• Indicators play an important role by establishing the status of expected results
• Indicators tell us how we will know when we have been successful in progressing towards results
Performance and Results Levels
• Stated Goal/Impact → Indicators
• Outcomes → Indicators
• Outputs → Indicators
Usefulness of Indicators
• Tell us how we will recognize success
• Force us to clarify what we mean by our objectives/expected results
• Provide a measurable basis for monitoring and evaluation
Quantitative vs. Qualitative Indicators
Quantitative:
• % of participants who are employed
• # of women in decision-making positions
• % of women parliamentarians
Qualitative:
• Congruence of policy changes with advocacy messages
• Level of commitment to CEDAW
• Quality of GEL formulation
Qualitative indicators require further measurement criteria. For example, "congruence of policy changes with advocacy messages" can be observed at different levels:
– Reflection of key words in the policy document
– Inclusion of key messages with different wording
– Key messages taken into account but not fully adopted
All of these are different levels of congruence.
Measures, Indicators, Targets
Measures (the "what"): quantitative or qualitative result attributes that must be measured in order to determine the performance of a program or initiative.
Indicators (the "how"): the quantification or qualification of a performance measure; a statistic or parameter that, tracked over time, provides information on trends in the condition of a phenomenon.
Targets or standards (the "how much"): specific quantitative or qualitative goals against which actual outputs or outcomes will be compared. Targets imply a desired goal that may be more ambitious than a standard. Targets are not appropriate for all types of indicators.
Indicator Examples
"80% of NGO recommendations adopted" combines all three: a measure (adoption of NGO recommendations), an indicator (% of recommendations adopted) and a target (80%).
• Measure: satisfaction of citizens with the complaints mechanism
• Indicator: % of citizens who are satisfied with the complaints mechanism
• Target: 50% of citizens are very satisfied with the complaints mechanism
Examples
• Measure: female ownership of land
• Indicator: % of land owned by females
• Target: proportion of land owned by women is 40%
• Measure: knowledge of NGO workers on advocacy campaign tactics
• Indicator: level of knowledge of NGO staff of 3 key advocacy tactics
• Target: 90% of NGO trainees can fully describe the 3 key advocacy tactics
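These measure/indicator/target triples lend themselves to a simple automated check. The following Python sketch is illustrative only; the numeric values are taken from the examples above, and the "measured" figures are hypothetical:

```python
# Sketch: comparing an indicator's measured value against its target.
def target_met(measured, target, higher_is_better=True):
    """Return True if the measured value reaches the target."""
    return measured >= target if higher_is_better else measured <= target

# Hypothetical measured values against the targets from the examples above.
print(target_met(measured=35.0, target=40.0))  # False: land-ownership target missed
print(target_met(measured=92.0, target=90.0))  # True: trainee-knowledge target met
```

A baseline can be compared the same way, which is what turns a target into evidence of change rather than a snapshot.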
Guidelines for Developing Indicators
• Brainstorm all possible indicators and then select the best
• Limit the number of indicators (2 to 3 per result)
• Measure realization and enjoyment of rights
• Always sex-disaggregate
• Develop in a participatory fashion
• Indicators must be relevant to the needs of the user
Criteria for Selecting the Best Indicators
• Does the indicator measure the result (accuracy, certainty and exactness)?
• Is it a consistent measure over time?
• When the result changes, will the indicator be sensitive to those changes?
• Will it be easy to collect and analyze the information?
• Can you use the data the indicator provides for decision-making and learning?
• Can you afford to collect the information?
Baseline, Targets, Performance
The concept of a performance indicator includes the notions of commitment, baseline, targets and performance.
Strengths of the LFA
• (Almost) everything you need to know about the project is in one place
• Indicators line up easily with results
• Assumptions at each stage of the causal logic are clear
• Risks are identified
Weaknesses of the LFA
• Causal logic is hard to follow in the LFA
• The first column is a bit redundant
• Hard to represent outputs that are inputs to other outputs
• Contains some but not all of the data collection information
Performance Measurement
PMF Data Gathering Questions
• From what source will the data be collected?
• What methodology will be used? (see handout in binder)
• How often will the data be collected?
An Example of a PMF: Planning for Monitoring
The PMF table takes its first two columns from the LFA and adds planning detail:
• Expected Results (from the LFA)
• Indicators (from the LFA)
• Means of measurement/verification — where is the data?
• Collection methods, with indicative timeframe and frequency — how are you going to get it, and when?
• Baseline, with indicative timeframe — what is the starting point?
• Responsibilities — who is going to get it?
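The PMF columns above map naturally onto a small record type. This dataclass sketch paraphrases the column names from the table; it is not an official schema, and the sample values are hypothetical:

```python
# Sketch: one row of a Performance Measurement Framework (PMF).
from dataclasses import dataclass

@dataclass
class PMFRow:
    expected_result: str        # from the LFA
    indicator: str              # from the LFA
    means_of_verification: str  # where is the data?
    collection_method: str      # how are you going to get it, and when?
    baseline: str               # what is the starting point?
    responsibility: str         # who is going to get it?

# Hypothetical example row, echoing an expected result used earlier in the text.
row = PMFRow(
    expected_result="National capacities in educational planning strengthened",
    indicator="% of trainees applying new planning tools",
    means_of_verification="Follow-up questionnaire",
    collection_method="Survey, six months after training",
    baseline="0% (pre-training)",
    responsibility="Programme officer",
)
print(row.indicator)
```

Keeping one such record per indicator is what lets monitoring answer the four planning questions (where, how, when, who) row by row.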
Module 8: Reporting and Learning
Performance Information
• Performance measurement should yield good information that can be used during project implementation to adjust strategies
• If performance information is not being used actively, then RBM is not being applied!
Why Reporting?
• Links evaluation to learning
• Feeds back into programming
Risks and Assumptions
Advancing Implementation of Gender Equality Legislation in Employment and the Public Workplace
Overall risks:
• Political instability in Southeast Europe and Ukraine might prevent attention to gender equality in general, and to GEL-WRI implementation in particular. Immediate/short-term political priorities do not include gender equality and women's rights issues in employment and the workplace.
• There is a lack of reliable statistics to measure changes/progress/regress that could better inform decision-making for improved implementation of GEL-WRI.
Outcome 1: Improved CSO advocacy and outreach services for GEL-WRI implementation
Risks:
• The political climate in Southeast Europe might not be able to fully support a space for sustained NGO advocacy.
• Access to relevant mechanisms is limited for some groups of women (ethnic groups, rural women, low-skilled/low-educated women, poor women).
Output 1.1: Increased advocacy skills of 100 CSOs and 25 lawyers on GEL-WRI implementation
Activities:
1.1.1 Design of training, based on needs assessment, for 100 CSOs and 25 lawyers on advocacy skills for GEL-WRI implementation
1.1.2 Conduct of training for 100 CSOs and 25 lawyers on advocacy skills for GEL-WRI implementation
Output 1.2: Increased knowledge and skills of 10 CSOs and 25 lawyers to provide outreach services to women whose rights on WRI are not protected or are violated
Activities:
1.2.1 Design of training, based on needs assessment, for 10 CSOs and 25 lawyers on provision of outreach services to women on GEL-WRI issues
1.2.2 Conduct of training for 10 CSOs and 25 lawyers on provision of outreach services to women whose rights on WRI are not protected or are violated
Outcome 2: More effective GEL-WRI implementation by ministries and the judiciary in Croatia, Bosnia and Ukraine
Risks:
• Governments have difficulty in providing and sustaining adequate resource allocation to enable relevant ministries and the judiciary to implement programmes in support of GEL-WRI implementation.
• The geographical distribution/quality of these services at the local level is uneven, causing problems with access to services for some groups of women.
Output 2.1: Increased capacity of relevant ministries and the judiciary for GEL-WRI implementation
Activities:
2.1.1 Review of existing provisions of GEL-WRI, including responsibilities of ministries and the judiciary
2.1.2 Design of training for better understanding of GEL-WRI issues and the importance of resource allocation for GEL-WRI implementation by relevant ministries and the judiciary
2.1.3 Conduct of training for relevant ministries and 200 public prosecutors and judges
2.1.4 Provide technical support to relevant ministries in developing new services for women in line with their responsibilities to implement GEL-WRI
2.1.5 Develop, publish and update a knowledge base on GEL and CEDAW implementation relating to WRI
Output 2.2: Increased capacity of relevant ministries and the judiciary to ensure their accountability for GEL-WRI implementation
Activities:
2.2.1 Develop monitoring and reporting tools to assess GEL-WRI implementation by relevant ministries and the judiciary
2.2.2 Analyze reports
2.2.3 Disseminate reports
Annex I
Sample LFA (HBW_ENG.doc)
Logical Framework Analysis
Phase II: Strengthening Organizations of Home Based Workers in South and Southeast Asia
GOAL: Ensuring the full realization of human rights of women home-based workers (HBWs) in Asia.
OUTCOME 1: Existence of sustainable organizations of HBWs and their networks at national and sub-regional levels in South and Southeast Asia.
INDICATORS:
• HomeNets South and Southeast Asia exist.
• Increased visibility of HBWs and their networks.
• HBW networks are implementing programmes.
• HBW networks are meeting membership needs and demands.
• Growing membership within HBW networks.
• Financial resources for the sub-regional and national HBW networks are forthcoming.
• Institutional procedures, systems and mechanisms exist within the sub-regional and national networks.
• Recognition and inclusion of HomeNets in national, regional and international forums.
MEANS OF VERIFICATION (MOVs):
• Constitutions/annual reports of each sub-regional and national HBW network.
• Extent of participation of HBW networks in consultations/policy forums etc.
• Consult HBW networks' work plans.
• Feedback on HBW networks' programmes.
• Comparison of current membership numbers to those of previous years.
• Donor commitments/expressions of interest/agreements.
• References to HBW networks in media and government reports.
OUTPUT 1: Strong, representative, financially sustainable networks are legally established which are able to successfully achieve their mandates at the sub-regional and national levels in South and Southeast Asia.
INDICATORS:
• HomeNet Southeast Asia is legally registered.
• An autonomous and consolidated HomeNet South Asia exists.
• HBW national networks participate in skills enhancement workshops.
• National governments recognize the national HBW networks by soliciting their feedback, inputs and participation on issues affecting HBWs and/or the informal sector.
• HBWs and associated groups continue their membership with the sub-regional and national networks.
• Increased ownership of networks by HBWs.
• Increased representation of HBWs by HBWs themselves at national and international levels.
MOVs:
• Certification of legal registration for HomeNet Southeast Asia.
• Government reports.
• NGO reports.
• Reports and documents generated by HBW networks.
• Minutes of government meetings, taskforces, committees.
• Feedback from government officials.
ACTIVITIES:
1.1 Legally establish HomeNet Southeast Asia at its new base in Manila and HomeNet South Asia (in Delhi), with appropriate structures in place (such as a bank account and sound accounting capacities) to undertake their mandates.
1.2 Support the election of HBW network governing body representatives in all 8 countries and 2 sub-regions.
1.3 Support the professionalisation and organizational development of the HBW networks in all 8 countries through training in organizational, personnel and project management, leadership, networking, lobbying, resource mobilization, etc., in an effort to develop the skills required to manage their own organizations.
1.4 Conduct entrepreneurship and enterprise development training for organized home workers' groups in both sub-regions and all 8 countries.
1.5 Support HomeNets South Asia and Southeast Asia in establishing appropriate policies, procedures and mechanisms to guide operations of the network, such as membership procedures and mechanisms for decision-making and representation.
1.6 Forge linkages at local, national and regional levels with organizations such as trade unions, Chambers of Commerce, corporate houses, academics, women's groups and government agencies.
OUTPUT 2: HBWs in both sub-regions expand their regional and national membership bases towards institutional sustainability.
INDICATORS:
• Increased number of members in sub-regional and national HBW networks.
• New countries explored for potential membership.
• A new HomeNet is established in Laos.
MOVs:
• Consult membership database in all countries.
• Reports from exploratory missions.
ACTIVITIES:
2.1 Strengthen expansion efforts among home workers' groups in Laos.
2.2 Conduct exploratory discussions with HBWs and associated groups in Vietnam and Cambodia (funding permitting) and in parts of South Asia.
2.3 Continue mapping the HBW sector and undertake research initiatives at both sub-regional and national levels in South and Southeast Asia, so that the results contribute to greater visibility, legislative reform and empowerment of HBWs.
OUTPUT 3: Sub-regional and national HBW networks have skills in resource mobilization (RM) in order to build towards their own financial sustainability.
INDICATORS:
• An RM strategy exists and is implemented for each of the HBW networks.
• Increase in the amount of resources mobilized from sources other than UNIFEM.
• Trainings in RM take place.
• Staff/members of HBW networks participate in the trainings offered.
MOVs:
• Physical observation that an RM strategy exists.
• HBW networks' budgets (and sources of funds).
ACTIVITIES:
3.1 Support HBW sub-regional and national networks in developing linkages with NGO, government, bilateral and multilateral partners, liaison and networking.
3.2 Support the HBW sub-regional and national networks in developing a 'resource mobilization strategy' in all 8 countries.
OUTPUT 4: Knowledge sharing, networking and cross-regional and national learning contribute to enhanced capacities of sub-regional and national networks.
INDICATORS:
• Tangible links between HomeNets South Asia and Southeast Asia exist.
• Inter- and intra-regional study visits, trainings and research take place.
• Regional databases on HBWs and associated groups exist.
• Number of members that benefit from inter- and intra-regional knowledge sharing and networking events.
• Number of 'hits' on the HomeNet website.
• Increased organizational effectiveness of HBW networks.
MOVs:
• Feedback from training and study tour participants.
• HBW network reports.
• HomeNet website counter.
• Feedback from HBW network members, partners and donors on the effectiveness of HBW networks.
• Project progress and evaluation reports.
ACTIVITIES:
4.1 Support sub-regional and national networks in all 8 countries in information and cross-regional learning through regular updating of the HomeNet Southeast Asia website, HomeNet South Asia's and Southeast Asia's newsletters, study visits, and documentation and dissemination of best practices, tools, manuals and resources.
4.2 Conduct sub-regional and national workshops on social protection, lobbying, advocacy, networking, marketing, information technology (e.g., e-commerce) and other felt needs of home worker groups in both sub-regions.
4.3 Support the development and dissemination of a directory of HBWs and associated organizations in South Asia.