CIMES Handbook
County Integrated
Monitoring and
Evaluation System
Version 1, March 2016
CIMES
THE PRESIDENCY
Ministry of Devolution and Planning
Monitoring and Evaluation Department
COUNCIL OF GOVERNORS
Published by the Government of Kenya in March 2016
© 2016 Government of the Republic of Kenya,
Ministry of Devolution and Planning
Guidelines for the Development of
County Integrated
Monitoring and
Evaluation System
(CIMES)
Foreword
The legal mechanisms spelt out in the Constitution of Kenya have necessitated the development of Monitoring and Evaluation (M&E) systems for county governments. The Constitution requires adherence to principles of good governance and transparency in the conduct and management of public programmes and projects. For devolution to succeed in Kenya, county and national governments are united in the recognition that performance monitoring and evaluation are pivotal development and service delivery tools for leaders at all levels. The focus of both county and national governments is therefore increasingly on development results and how these can best be measured.
The Ministry of Devolution and Planning (MoDP) and the Council of Governors (CoG) are com-
mitted to developing as centres of excellence in performance management for public service
delivery. By coordinating our efforts, we intend to accelerate progress in the counties to achieve
a high quality of life for all Kenyans. We also intend to create a strong feedback mechanism that
will regularly provide county residents with good quality and timely monitoring and evalua-
tion (M&E) information regarding implementation progress of flagship development projects
and programmes.
These guidelines are primarily intended to assist staff in the design and implementation of
M&E plans for the policies, projects and programmes in the County Integrated Development
Plan (CIDP) being implemented in each of the 47 counties. The guidelines will also serve as a
useful reference for staff of the national government, public agencies, commissions and other
institutions involved or interested in the design, implementation, monitoring or evaluation of
CIDP programmes and projects financed by the national government or devolved funds.
These County Integrated Monitoring and Evaluation System (CIMES) guidelines have been de-
veloped after extensive consultations and dialogue with relevant stakeholders. Their imple-
mentation across all counties will also improve the co-ordination of development planning,
policy formulation and delivery of development and public services by strengthening per-
formance management mechanisms for county and national governments. The use of these
guidelines also prepares counties for electronic support for CIMES.
These guidelines were developed as part of the National Capacity Building Framework (NCBF), which will align and guide ongoing capacity building and the mobilisation of new resources for devolution.
The implementation of these guidelines requires a change in the mind-set and approach of staff. They place a high premium on coherent, long-range planning around results and on involving county citizens in the design and implementation of development programmes and projects.
They emphasize partnerships with key stakeholders for development change; capacity build-
ing for improving and enhancing ownership of M&E activities at county level; and the promo-
tion of knowledge and learning through the use of county M&E reports.
We are confident that these guidelines are a useful tool for those responsible for monitoring and evaluation in the counties, and that they will contribute to the acceleration of service delivery and an improvement in value-for-money across county governments.
Acknowledgements
This document has been developed through consultations and stakeholder workshops on County M&E Guidelines organised by personnel from the Monitoring and Evaluation Department (MED) of the Ministry of Devolution and Planning in collaboration with the Council of Governors.
The document draws upon detailed discussions with county governors, deputy governors and various county officials, and is informed by meetings with independent commissions and other institutions, including the Commission on Revenue Allocation, the Controller of Budget, the Kenya National
Audit Office, the Kenya School of Government, the Commission on the Implementation of
the Constitution, the Senate, National and County Government Ministries, the Performance
Contracting Department, the Ministry of Devolution and Planning and the Monitoring &
Evaluation Department (MED). The content draws on the constitutional and legal mandates,
regulations, acts, policies and identified needs and challenges of these stakeholders.
This document is informed by the Constitution of Kenya, the County Governments Act (2012)
and the Public Finance Management Act (2012). The content also draws on reviews of County
Integrated Development Plans (CIDPs), discussions with MED, review of MED training materials
and of the CIDP Guidelines. It has greatly benefited from District M&E Guidelines from Ghana,
Budget Service Delivery Implementation Plan Guidelines from South Africa, and the published
performance management policy and practice of Nelson Mandela Bay Municipality, South
Africa. This document is consistent with the spirit of Vision 2030 and the Constitution of Kenya.
Good practice from existing documents has been used where possible. The documents used
are presented in the references section.
The Monitoring and Evaluation Department (MED) of the Ministry of Devolution and Planning and the Council of Governors acknowledge the support of Gaiasoft International and the World Bank's Kenya Accountable Devolution Programme, with funding from the Department for International Development (DFID), the Australian Department of Foreign Affairs and Trade (DFAT), the European Union (EU), Finland (represented by its Ministry of Foreign Affairs), the Swedish International Development Cooperation Agency (SIDA), and the United States Agency for International Development (USAID).
Abbreviations
Table of Contents
Foreword..............................................................................................................................................................iv
Acknowledgements..........................................................................................................................................v
Abbreviations.....................................................................................................................................................vi
1. Introduction..............................................................................................................................................................3
1.1. Background................................................................................................................................................ 3
1.2. Purpose of the Guidelines................................................................................................................... 3
1.3. The Structure of the Devolved Government ................................................................................ 4
1.4. Current Status of County M&E Systems........................................................................................... 7
1.5. Justification of Monitoring and Evaluation of CIDP Outcomes/Results ............................. 7
1.6. Linking CIMES to Statistics and County Performance Management Systems................. 8
1.7. Objective and Purpose of the Guidelines....................................................................................... 9
6. Operationalising CIMES...................................................................................................................................... 51
6.1 M&E Core Indicators.............................................................................................................................52
6.2 Resources for Monitoring and Evaluation....................................................................................53
A1 Core County Result Indicators.......................................................................................................... 55
Core County Indicators for Monitoring the Implementation of the
County Budget and Value-for-Money...................................................................................................... 58
A2 Readiness Checklist.............................................................................................................................. 60
A3 CIDP Checklist........................................................................................................................................ 61
A4 Annual Development Plan Checklist............................................................................................. 62
A5 Engagement Checklist........................................................................................................................ 63
A6 Stakeholder Participation Assessment......................................................................................... 64
A7 Maturity Model for Reporting Status of ADP Projects............................................................. 65
A8 Template for CIDP and ADP Performance Management Results Matrix............................... 66
A9 Selection Criteria for Performance Management M&E System........................................... 70
A10 Project Sheet or Project Logical Framework Matrix (LFM)..................................................... 71
A10 Project Sheet or Project Logical Framework Matrix (LFM) continued............................... 72
Project M&E Using Project Sheets............................................................................................................. 73
A11 M&E Reporting Sheet Aggregated by (Sub-)Sector/Project Type....................................... 74
A12 Targets in Project Sheets and Results Matrix.............................................................................. 75
A13 County Governments Administrative Structure Based on Officers.................................... 76
A14 Key Reports to be Prepared at County Level.............................................................................. 77
A15 NIMES Operational Arrangements................................................................................................. 78
References.......................................................................................................................................................... 79
Basic Concepts and
Terminology
These guidelines deal with a range of concepts and terms which may need clarification for those
unfamiliar with the process of designing a programme or project monitoring and evaluation system.
The definitions and concepts offered below are mainly those used by the Organization for Economic
Cooperation and Development (OECD), the United Nations and the World Bank’s Africa Region
Evaluation Groups.
1. Monitoring: Monitoring is the process of collecting, analysing, and reporting data on a project
or programme’s inputs, activities, outputs, outcomes and impacts, as well as external factors
to track whether actual investment programme results are being achieved. These data, when
analysed, pinpoint progress or constraints as early as possible, allowing managers to adjust
project or programme activities as needed. Monitoring aims to provide managers, decision
makers and other stakeholders with regular feedback on progress in the implementation of
activities specified in the development plans.
2. Evaluation: Evaluation is a systematic and objective assessment of an ongoing or completed
project, programme or policy, its design, implementation and results. An evaluation determines
the relevance and fulfilment of objectives, efficiency, effectiveness, impact and sustainability.
Evaluation is linked to monitoring. Monitoring provides the basis for evaluation, which involves
answering two questions: “Has the project or programme activity met its objectives?” and “What
accounts for its level of performance?” Evaluation tells managers whether project/programme
activities are moving toward or away from project/programme objectives or management goals,
and why. It provides lessons learnt and recommendations for future improvements.
3. Indicators: An indicator is a measure that can be used to monitor or evaluate an intervention.
Indicators can be quantitative (derived from measurements associated with the intervention)
or qualitative (entailing verbal feedback from beneficiaries).
4. Performance vs. Impact Indicators: Project or programme monitoring and evaluation in-
volves two kinds of indicators: implementation performance indicators (project/programme
inputs and outputs) and project impact indicators (achievement of objectives in relation to so-
cio-economic development). Implementation performance indicators measure the progress
in securing project inputs and delivering project outputs against set targets, while project
impact indicators measure the consequence (the “So what”) of implementation.
5. As will be observed in a subsequent section of this document, M&E revolves around a number
of other key elements:
(a) Inputs: Inputs are all the resources that contribute to the production of service delivery
outputs. Inputs are “what we use to do the work”. They include finances, personnel, equip-
ment and buildings.
(b) Activities: These are the processes or steps one takes to reach the project’s or pro-
gramme’s objective. They are written in the sequence or order in which they will be im-
plemented. Each activity completed brings one closer to achieving the project objective.
(c) Outputs: These are the final products, goods or services produced for delivery. Outputs
may be defined as “what we produce or deliver”.
(d) Outcomes: The medium-term results for specific beneficiaries which are the consequence
of achieving specific outputs. Outcomes should relate clearly to an institution’s strate-
gic goals and objectives as set out in its plans. Outcomes are “what we wish to achieve”.
Outcomes are often further categorised into immediate/direct outcomes and intermedi-
ate outcomes.
(e) Impacts: Impacts are the results or consequences of achieving specific outcomes, such as reducing poverty or creating jobs; they describe "how we have actually influenced communities and target groups".
(f) Results are the outputs, outcomes or impacts, either intended or unintended, positive or negative, of a development intervention. The Government only encourages results that support sustainable improvement in the country's outcomes, bringing real positive changes in poor people's lives.
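The elements above form a simple results chain. Purely as an illustration (not part of the official CIMES forms), the hypothetical sketch below models one intervention as a results chain in Python; all field names and example values are assumptions chosen for readability.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ResultsChain:
    """Illustrative results chain for one intervention (hypothetical structure)."""
    intervention: str
    inputs: List[str] = field(default_factory=list)      # what we use to do the work
    activities: List[str] = field(default_factory=list)  # steps taken, in implementation order
    outputs: List[str] = field(default_factory=list)     # what we produce or deliver
    outcomes: List[str] = field(default_factory=list)    # what we wish to achieve (medium term)
    impacts: List[str] = field(default_factory=list)     # how communities are actually influenced

example = ResultsChain(
    intervention="Rural water supply programme",
    inputs=["budget allocation", "engineers", "drilling equipment"],
    activities=["survey sites", "drill boreholes", "train water committees"],
    outputs=["40 functional boreholes"],
    outcomes=["reduced average distance to safe water"],
    impacts=["lower incidence of water-borne disease"],
)
```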
1. Introduction
1.1. Background
1. Kenya has undertaken development planning since it gained independence from Britain in
December 1963. However, due to the non-existence of an integrated monitoring and evaluation
(M&E) system, execution of the development plans over the first four decades of independence was
weak. Complaints of non-implementation, or non-completion, of highly ambitious projects were
common. Information collection, analysis and reporting of results was undertaken in an ad hoc man-
ner. Decision-making and feedback at the local level was seldom based on verifiable evidence in the
absence of a comprehensive M&E system.
2. Efforts were made to establish individual project- and programme-based M&E in the country in
the 1980s and 1990s. Most development plans prepared during this period included a section on M&E.
However, most of these M&E plans were prepared in response to donor demands, leading to very spe-
cific project and programme evaluations. As a consequence of the dominance of donor requirements,
the M&E reports produced were rarely shared with the intended project/programme beneficiaries.
3. Development of an integrated M&E system in Kenya began in 2000 with the implementation of
the Interim Poverty Reduction Strategy Paper (I-PRSP) 2000-2003. It was enhanced during the imple-
mentation of the Economic Recovery Strategy for Wealth and Employment Creation (ERSWEC), 2003-2007. The Investment Programme for the ERSWEC recognised the important role of M&E in promoting accountability and enhancing good governance. This resulted in the development of the National
Integrated Monitoring and Evaluation System (NIMES) in 2004; and the creation of the Monitoring
and Evaluation Directorate (MED) in the Ministry of Planning and National Development. Most of the
NIMES activities have been concentrated at the national level, with little emphasis on tracking sub-
national project/programme interventions1.
4. With the introduction of NIMES, M&E has in the past decade become an integral part of the policy formulation and implementation process at the national level. The output
of the NIMES process is used for, amongst other purposes, informing national development planning
and policy dialogue within government and with private sector and civil society organisations and
development partners.
senior management staff within a county. It verifies whether the activities of each county’s priority
project or programme are happening according to planning timelines and targets presented in the
County Integrated Development Plan (CIDP); and whether resources are being used in a correct and
efficient manner.
6. Disseminating M&E results can raise awareness of a county’s programme and projects among
the general public and help build positive perceptions about the county’s leadership; and this may
lead to increased resource allocation towards the well performing counties. The system will supply
the county with a regular flow of information throughout the course of CIDP programme implemen-
tation, to make it possible to detect changes in status and utilisation of resources allocated to CIDP
priority projects or programmes.
7. These guidelines are primarily intended to assist staff who are involved in the design and im-
plementation of monitoring and evaluation (M&E) plans for CIDP and other projects and programmes
that are being implemented in the 47 counties. It is anticipated that the guidelines will also serve as
a useful reference for staff of the national government, public agencies, commissions and other insti-
tutions that are either involved or interested in the design, implementation, monitoring or evaluation
of CIDP and other programmes and projects being financed by the national government or devolved
funds within each county.
Figure: Structure of the Devolved Government. Under the Constitution of Kenya, the national government comprises the President and the Cabinet, while the county government comprises the Governor, the County Executive Committee and the Legislature. The national government is primarily responsible for the bulk of the policy-making functions on matters cutting across the nation, including making policies relating to agriculture, veterinary services, fishing, health, education, energy, housing, tourism, labour standards, and international trade and foreign affairs. The county government is responsible for policy-making functions on matters relating to the county, like agriculture, education, etc., and for the implementation of specific national government policies, ensuring and coordinating the participation of communities and locations in governance at the local level. (Prepared by Gaiasoft for the Monitoring and Evaluation Department, Ministry of Devolution & Planning.)
educational institutions and other institutions of research and higher learning, and primary schools,
special education and secondary schools and special education institutions, setting of education
standards, curricula, examinations and the granting of university charters.
11. Other functions of the national government include: the use of international waters and water
resources; immigration and citizenship; promotion of sports; transport and communications, includ-
ing road traffic and construction and operation of national trunk roads, railways, pipelines, marine
navigation, civil aviation, postal services, telecommunications, and radio and television broadcasting;
national public works; protection of the environment and natural resources; national referral health
facilities; national disaster management; national elections; capacity building and technical assis-
tance to the counties; and public investment.
County governments were established in March 2013 after the first general elections under CoK 2010. The Constitution requires county governments to plan and budget for the delivery of goods and services under their mandate through the following plans:
1. County Integrated Development Plan (CIDP)
2. County Sectoral Plans
3. County Performance Management Plan
4. County Spatial Plan
5. Cities and Urban Area Plans
In addition to the CIDP, every county must develop an Annual Development Plan (ADP) and an Annual Fiscal Strategy during the CIDP period.
3 See CoK 2010, Article 189 on cooperation between national and county governments
4 The Kenya Vision 2030 is the country’s development blueprint covering the period 2008-2030.
5 The Kenya Vision 2030 is to be implemented in successive five-year medium-term plans, with the first such plan covering the period 2008–2012; the current MTP is the second, covering the period 2013–2017.
financial and institutional resources to agreed policy objectives and programmes; unification of plan-
ning, budgeting processes; undertaking regular CIDP implementation progress and performance re-
views; and promoting public participation in preparation and implementation of government devel-
opment programmes and projects.
18. In addition to CIDP, every county must develop an annual development plan (ADP) and annual
fiscal strategy during the CIDP period. The CIDP and ADP should integrate the projects and services of
devolved and un-devolved functions, as well as the projects and services of the Constituency Development Fund (CDF), other devolved funds and non-governmental organizations undertaking investment programmes within the county. These guidelines mainly refer to the monitoring and evaluation of the CIDP, the CDF, other devolved funds, as well as programmes and projects funded by development partners and other key stakeholders with investment programmes within a county.
and water supply) of rural development, the national government and each county government need
to develop an integrated M&E system. This will be used to track implementation progress for invest-
ment programmes outlined in the MTP and CIDP and other projects and programmes financed by
devolved funds, development partners and CSOs. These M&E systems usually include important so-
cial and economic indicators as well as targets used to monitor the Sustainable Development Goals
(SDGs) and key indicators related to economic growth, poverty, education, health and infrastructure
activities that are being implemented at national and county levels.
25. At county level, tracking progress towards the achievement of the policies, projects and pro-
grammes outlined in each CIDP will be undertaken through the County Integrated Monitoring and
Evaluation System (CIMES). Analysis of CIMES results will demonstrate whether the resources spent
on implementing CIDP investment programmes are leading to the intended outcomes, impacts and
benefits for the county population. In this way, the CIMES will also provide essential feedback to the
county budgetary allocation and execution processes, thereby ensuring that future county budget
preparation and execution processes are tailored towards maximising their impact on achieving CIDP
targets. A CIMES will also serve as a vehicle for building partnerships within county governments, and
between national and county governments, the private sector, civil society and external develop-
ment partners. The system will also improve stakeholder communication and help in building agree-
ment on desirable poverty reduction outcomes and strategies. Like the CIDP, which was prepared
through a consultative process, the development of the CIMES should involve all key stakeholders in
the county.
Figure: Linking CIMES to Statistics and County Performance Management Systems, showing the links between CIMES, the county statistics office and other county departments. (Prepared by Gaiasoft for the Monitoring and Evaluation Department, Ministry of Devolution & Planning.)
county employees) located within every county, the system will provide a suitable vehicle through
which Governors and County Commissioners will prepare combined reports for presentation to
The Intergovernmental Forum, the Council of Governors, the National and County Government
Coordinating Summit and the Senate. The Performance Management System and M&E resultant
reports shall be made public. Devolved M&E, through implementation of the Annual Development
Plan (ADP), enables local accountability, local corrective action and local learning, resulting in the
fast-tracking of local development and results.
30. CIMES provides an integrated structure and process for counties to engage stakeholders, plan,
govern, manage and operate independently and yet in sync with one another. Counties can operationalise CIMES by adopting these M&E guidelines and by connecting electronically, for dissemination, with the Council of Governors, other counties and the national government.
activities and external factors; and (vi) demonstrate how activities for CIDP programmes and projects
will lead to desired outcomes and impacts.
32. The Guidelines also provide stakeholders and participants involved in preparation and use of
county M&E system with practical steps to operationalise the M&E section (i.e. Chapter 8) of the CIDP
with respect to implementation, through the monitoring and evaluation of indicators and targets for
various projects and programmes included in the CIDP and the devolved funds.
33. The Guidelines have been prepared through extensive collaboration and consultations with all
key stakeholders, including representatives of the national and county governments, and develop-
ment partners. The preparation has taken into account the relevant sections of the Constitution of Kenya and the subsequent laws relating to devolution, as well as international best practice.
2. Constitutional, Legal and Policy Frameworks for County M&E
34. This chapter outlines the legal foundation for the compilation of M&E activities in the devolved
system of government. The legal framework for the CIDP and M&E of CIDP implementation is based
on the Constitution of Kenya and its supporting legislation and policy documents listed below.
Figure: County M&E Frameworks. (Prepared by Gaiasoft for the Monitoring and Evaluation Department, Ministry of Devolution & Planning.)
appropriate; (c) monitoring the implementation of national and county development plans and rec-
ommending appropriate action; (d) coordinating and harmonising the development of county and
national government policies; (e) consideration of reports from other intergovernmental forums and
other bodies on matters affecting national interest; and (f ) consultation and co-operation between
the national and county governments.
40. Section 19 of the Intergovernmental Relations Act established a Council of County Governors
consisting of the Governors of the 47 counties. Functions of this council are stipulated in Section 20.
The council provides a forum for: (a) consultation among county governments; (b) sharing of infor-
mation on the performance of the counties in the execution of their functions with the objective of
learning and promoting best practice and where necessary initiating preventive or corrective action;
(c) considering matters of common interest to county governments; (d) facilitating capacity building
for Governors; (e) receiving reports and monitoring the implementation of inter-county agreements
on inter-county projects; (f ) considering reports from other intergovernmental forums on matters
affecting national and county interests or relating to the performance of counties.
41. The Public Finance Management Act, 2012 (PFMA 2012), Part IV, addresses county govern-
ment responsibilities with respect to management and control of public finance. Section 104 states
that a County Treasury shall monitor, evaluate and oversee the management of public finances and
economic affairs of the county government. The county government shall plan for the county and no
public funds shall be appropriated outside of a planning framework developed by the county execu-
tive committee and approved by the county assembly. Section 125 sets out the stages in the county
government budget preparation process, and Section 126 requires that the county government pre-
pares a development plan.
42. PFMA 2012, Section 104, defines the responsibility to monitor, evaluate and oversee the man-
agement of public finances and economic affairs of the county government, including the monitoring
of the county government’s entities to ensure compliance with this Act and effective management of
their funds, efficiency and transparency and, in particular, proper accountability for the expenditure
of those funds; and reporting regularly to the county assembly on the implementation of the annual
county budget.
43. Additional functions of the County Treasury include: (a) monitoring the county government’s
entities to ensure compliance with this Act; (b) upon request, providing the National Treasury with infor-
mation which it may require to carry out its responsibilities under the Constitution of Kenya and this Act;
and (c) reporting regularly to the county assembly on implementation of the annual county budget.
2.3 Policy Framework
44. The Kenya Vision 2030 outlines the country's long-term national objectives, in particular the achievement of middle-income status by 2030. A series of five-year Medium Term Plans (MTPs) translates this long-term objective into medium-term priorities, objectives and programmes. The government's efforts towards successful achievement of these priority programmes are tracked through key performance indicators included in the M&E sections of CIDPs and MTPs.
2.3.1 NIMES
45. The MDP is mandated to implement NIMES as part of the governance reforms of the national government. NIMES is aimed at strengthening governance by improving transparency, strengthen-
ing accountability relationships, and building a performance culture within the two levels of govern-
ment to support better policymaking, budget decision-making and management. It is designed to
ensure regular reporting on implementation progress of the country’s priority policies, projects and
programmes outlined in key policy documents such as MTPs, CIDPs, devolved funds programmes, the
National Accountability Management Framework, and Performance Contracts and the Performance
Appraisal System. It is also designed to report on the Government’s commitments to other inter-
national frameworks such as the Sustainable Development Goals (SDGs), the New Partnership for Africa's Development (NEPAD), and the African Peer Review Mechanism (APRM). The operational arrangements of NIMES are presented in Annex A15.
entities report directly to parliament, which represents the people of Kenya. To receive a holistic view
of developments in the country, MDP and the line ministries will compile the different reports to
reflect the national government’s performance and progress towards Vision 2030 and within it the
MTPs.
48. A NIMES computerised platform is recommended in the M&E framework. It will serve as a plat-
form for data collection, analysis, publication and dissemination, building on aggregate M&E data
collected automatically through interfaces with the various management information systems that
are being established at sector ministries as well as the National Treasury and MDP. This platform will
improve the access, timeliness and publication of national data, and will also serve as a useful data-
base for research and analysis in the country. In addition, the dissemination and utilisation of data and
a communications strategy for M&E data and analyses are included in the M&E framework.
49. Monitoring and evaluation requires competent professionals in the public service. The frame-
work describes competencies needed for M&E staff, and steps that are being taken by the govern-
ment to develop curricula and advanced training in M&E for use by local universities and other train-
ing institutions. In essence a core M&E curriculum at university level will be defined in collaboration
with the institutions, to serve as a basis for universities’ development of suitable M&E programmes.
In-service training courses will also be made available, tailored for different stakeholders and target
groups, covering both awareness raising and in-house training programmes for managers and for
M&E and other programme staff.
missions include: the Teachers Service Commission; the National Police Service Commission; the Kenya National Human Rights
and Equality Commission; the Parliamentary Service Commission; the Independent Electoral and Boundaries Commission; the
Commission on Revenue Allocation; the Judicial Service Commission; the Public Service Commission; the Salaries and Remu-
neration Commission; and the National Land Commission. The independent offices, such as the Controller of Budget and the Kenya National Audit Office, have also been enlisted.
3. Building a Sustainable M&E System and Strengthening Capacity of the Counties
53. This chapter describes general principles of M&E, and ten M&E steps consistent with global best
practice in M&E design and implementation. The guidelines build upon the training that was provid-
ed by MED to the counties during 2014-2015. The Guidelines should be read together with the vari-
ous forms that are presented in the appendixes of this document. The forms in the appendixes have
been developed through hands-on experience and consultations with relevant staff in the counties.
They have been found to be suitable for use at present. The forms should however be considered as
work-in-progress, and they may be modified over time to make them more appropriate for setting
standards and norms for the evolving county M&E reporting needs. The M&E principles, the ten steps
and the forms in the appendixes will be supported by a forthcoming M&E curriculum and e-learning
materials to be developed in partnership with CoG, MED and the Kenya School of Government, local
universities and colleges.
9 Kusek and Rist, Ten Steps to a Results-based Monitoring and Evaluation System, World Bank, 2004
Monitoring
(a) Ensure that monitoring is involved at all stages of the programme or project design and implementation.
(b) Involve all stakeholders in monitoring activities, and ensure that there are incentives in place for them to engage therein.
(c) Create an environment in which monitoring is perceived as beneficial both to individual performance and to organisational capacity.
(d) Use a diversity of methods, including both qualitative and quantitative indicators.

Evaluation
(a) Ensure that clear targets are identified at the start of the project/programme implementation process, and that delivery against these targets is used as the main framework for evaluation.
(b) Incorporate a clear framework (such as a Results Matrix and Gantt chart) in the design of the project or programme to provide the basis for subsequent evaluation.
(c) Make provision for the costs of evaluation in the original budget.
(d) Ensure that all stakeholders, and particularly the intended beneficiaries, are consulted in
identification; organisation analysis; strategy formulation and identification and selection of imple-
mentation options; and these are captured in the relevant chapters of the CIDP.
58. These analyses are thereafter summed up in a Results Matrix or Logical Framework Matrix (LFM). In this case, the CIDP-level results are captured in the CIDP Results Matrix, Form A8 (see Appendix A8), and each CIDP project is captured in a Project Sheet, Form A10 (see Appendix A10). The LFM gives a time-frame which identifies how much time will be needed for each activity. It identifies when an activity can begin and by when it must be completed. The LFMs of the CIDP (A8) and of projects (A10) have "Indicators" built into them, so there is an "automatic" connection between project design and the M&E system.
59. The analysis in preparing the Annual Development Plan (ADP) follows the performance/pro-
gramme based budgeting, which allocates budget expenditure by programme. For effective man-
agement and good governance of public funds, this programme-based budget should preferably be
managed through IFMIS. As required in the CIDP Results Matrix (for CIDP projects in Chapter 7) and
captured in Form A8, targets, indicators and objectives are included in the LFM or the Results Matrix,
and any actions/activities necessary to undertake M&E should be included in the project plan and
budget.
60. To ensure simplicity and ease of compliance, CIDP projects should be monitored and data col-
lected at least quarterly, using the Project LFM form presented in Appendix A10. M&E is a dynamic
process and one of the characteristics of an effective M&E system is that it allows for changes in pro-
ject implementation based on results, evidence and evaluation.
The counties are expected to demonstrate their preparedness and capability to understand and follow these ten steps in building their respective CIMES. A summary outlining each of these steps is
provided below.
Figure 1. Ten Steps to Designing, Building, and Sustaining a Results-Based Monitoring and
Evaluation System
the ADP Checklist in Appendix A4 to test the suitability and readiness of the CIDP and the APD for
implementation.
64. Is there demand from the county executive, from the citizenry? Is the legal framework in place?
Is there demand from funding partners? Is there ownership by county managers and staff who will be
involved in implementation? It is important to know where the demand for creating an M&E system is
emanating from and why. Are the demands and pressures coming from internal, multilateral or inter-
national stakeholders, or a combination of these? These requests will need to be acknowledged and
addressed if the response to demand is to be appropriate.
65. It is important to understand the value of M&E to the county executive as well as to citizens and
development partners. They can all play an important part in creating demand for transparency and
accountability, which will in turn create strong demand for M&E reports of the project or programme.
66. The readiness assessment includes a review of a county's current capacity to monitor and eval-
uate along the following dimensions: technical skills; managerial skills; existence and quality of data
systems; available technology; available fiscal resources; and institutional experience; existence of
technical and managerial skills, leadership, and management capacity to achieve the expected re-
sults; efficient and reliable information system to monitor and assess effectiveness and efficiency in
delivering outputs to achieve desired outcomes and impacts for the targeted groups; sound budget
planning, formulation and execution that focus on priority policies and programmes. Are there other
organisations such as universities, private consultants or government agencies that have the capacity
to provide technical assistance and/or training?
67. Capacity in the areas mentioned above is needed to develop, support, and sustain the M&E
system. Where capacity is inadequate or lacking, staff need to be trained in modern data collection,
monitoring methods, and analysis. Technical assistance and training for capacity and institutional de-
velopment may be required. The government and development partners are often willing to finance
and support such activities, and share lessons from best practice.
68. Assess the roles and responsibilities and existing structures to monitor and evaluate develop-
ment impact. In most county governments, different departments will be at different stages in their
ability to monitor and evaluate. It should not necessarily be assumed that all departments in a county
are moving in tandem and at the same pace. There will inevitably be some sequencing and phasing
with respect to the building of M&E systems. The readiness assessment serves as a guide through the
political system, and helps identify the ability of county government departments and agencies to
monitor and evaluate. One should first focus on nurturing those elements in the county government
that are able and willing to move faster in developing an effective M&E culture.
69. As already mentioned in section 1.2.2, some functions are shared between the national and
county governments. The readiness assessment will identify overlaps among these concurrent func-
tions, so that overall programme performance can be more effectively and efficiently measured and
achieved. The readiness assessment can assist in brokering differences between departments and
ministries doing the same or similar functions or tasks in a county.
70. County government policymakers need to be in communication with and work in partnership
with agencies responsible for information gathering and dissemination—for example in areas such as
the SDGs. The M&E system needs to be integrated into the policy arena of the SDGs so that it will be
clear to all stakeholders why it is important to collect data, how the information will be used to inform
the efforts of the county government and civil society to achieve the SDGs, and what information
needs to be collected.
particular ward of a county. From these goals, specific desired outcomes can be determined.
73. It is important to keep in mind that it is a constitutional responsibility to involve county citizens
and other stakeholders in developing outcomes. The overall process of setting the outcomes should
begin with identifying specific stakeholder representatives who should be consulted. Who are the
key parties involved around an issue (health, education, energy, etc.)? How are they categorised –
for example, CSOs, government, development partners? Whose interests and views are to be given
priority? Use the Engagement Checklist in Form A5 (presented in Appendix A5) and the Stakeholder
Participation Assessment form in Appendix A6 to identify stakeholder representatives to be consult-
ed. The result of consultations and formulation at county level is the County Integrated Development
Plan, defining a set of projects outlined in Chapter 7 of the CIDP. The outcomes and goals of the CIDP
are presented in the CIDP Results Matrix form in Appendix A8.
74. After stakeholder groups have been identified, the following factors are important in compiling
the outcomes:
(i) Identify major concerns of each stakeholder group. Use information gathering techniques
such as brainstorming, focus groups, surveys, and interviews to determine the interests
and priorities of each of the groups involved.
(ii) Translate Problems into Statements of Possible Outcome Improvements. An outcome state-
ment should be formulated positively rather than negatively. Stakeholders will respond
and rally better to positive statements, for example: “We want improved health for in-
fants and children,” rather than “We want fewer infants and children to become ill.” Positive
statements to which stakeholders can aspire carry more legitimacy. It is easier to build a
political consensus by speaking positively about the desired outcomes of stakeholders.
(iii) Outcomes should be disaggregated sufficiently to capture only one improvement area in
each outcome statement. E.g. Outcome: Increase the percentage of employed people.
To know whether this outcome has been achieved, the goal needs to be disaggregated
to answer the following questions: For which population or target group? Where? How
much? By when?
This outcome can be disaggregated by examining increased employment in terms of a target group, sector, percentage change and timeframe. E.g. Disaggregated outcome: Increase employment among youth in the rural sector by 20 percent over the next four years.
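As an illustration of this disaggregation step only, the hypothetical sketch below (the field names are assumptions, not CIMES terminology) records the example outcome together with its target group, sector, size of change and timeframe, so that each question (for whom, where, how much, by when) has an explicit answer.

```python
from dataclasses import dataclass

@dataclass
class DisaggregatedOutcome:
    """One improvement area per outcome statement (illustrative only)."""
    statement: str         # positively phrased outcome
    target_group: str      # for which population?
    sector_or_area: str    # where?
    change_percent: float  # how much?
    timeframe_years: int   # by when?

outcome = DisaggregatedOutcome(
    statement="Increase employment",
    target_group="youth",
    sector_or_area="rural sector",
    change_percent=20.0,
    timeframe_years=4,
)
print(f"{outcome.statement} among {outcome.target_group} in the {outcome.sector_or_area} "
      f"by {outcome.change_percent:.0f} percent over the next {outcome.timeframe_years} years.")
```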
75. It is important to develop a Monitoring and Evaluation Plan for each project or programme. This
is addressed in later steps, where each project must be monitored and reports compiled regularly.
Timelines for M&E are clearly defined by using the quarterly calendar.
76. Note that, according to these guidelines, there is a standard M&E plan and there are standard
responsibilities for all county projects and programmes. Coordination and joint reporting for each
county department is the responsibility of a designated M&E officer. The coordinated reporting of
the whole county and upwards towards the County Executive, Council of Governors, national gov-
ernment level and the Senate is the responsibility of the M&E Unit, providing reports for approval by
the County Monitoring and Evaluation Committee (CoMEC). It is essential that clear outcomes and
targets be defined at county and national levels. Monitoring is essential to determine whether devel-
opment and service delivery is on track.
77. In these guidelines, the outcomes and goals (impact) of every project and programme includ-
ed in the CIDP are recorded in the CIDP Results Matrix Form A8 and in more detail in the Project
Sheet (see Appendix A10). Checklist A3 (see Appendix A3) should be used to assess the quality of the CIDP preparation process and the related documents. Checklist A4 (see
Appendix A4) should be used to assess the quality of the process used in developing the ADP and the
related documents.
Model, based on form A7. The results from all CIDP projects can be simply recorded in the appropriate column of the CIDP Results Matrix A8 and, when available, online in e-CIMES. Baseline indicator values should be recorded at CIDP level in the CIDP Results Matrix Form A8, and at project level in the
Project LFM Form A10.
85. Performance indicators should be Specific, Measurable, Accountable, Realistic and Time-bound (SMART10). The more precise and coherent the indicators, the better focused and useful the measurement strategies will be. If any one of these five criteria is not met, formal performance indicators will suffer.
Specific: Precise and unambiguous
Measurable: Provide a sufficiently quantified basis to assess performance, or at least a perception of performance, on say a 5-point scale
Accountable: Responsibility assigned
Realistic: Achievable
Time-bound: Specify when the result(s) can be achieved
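A minimal sketch of how the five SMART attributes might be captured and checked for a performance indicator follows. The record layout and the checks are assumptions for illustration only and are not part of the official CIMES forms.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerformanceIndicator:
    """Illustrative SMART record for one performance indicator."""
    definition: str                     # Specific: precise and unambiguous wording
    unit_of_measure: Optional[str]      # Measurable: number, percent, or rating scale
    responsible_officer: Optional[str]  # Accountable: responsibility assigned
    target_value: Optional[float]       # Realistic: an achievable level
    target_date: Optional[str]          # Time-bound: when the result should be achieved

def smart_gaps(ind: PerformanceIndicator) -> list:
    """Return the SMART criteria that are not yet satisfied (simplified check)."""
    checks = {
        "Specific": bool(ind.definition.strip()),
        "Measurable": ind.unit_of_measure is not None,
        "Accountable": ind.responsible_officer is not None,
        "Realistic": ind.target_value is not None,
        "Time-bound": ind.target_date is not None,
    }
    return [name for name, ok in checks.items() if not ok]

indicator = PerformanceIndicator(
    definition="Primary school pupils meeting the national literacy standard",
    unit_of_measure="percent",
    responsible_officer=None,   # not yet assigned, so the indicator is not fully SMART
    target_value=75.0,
    target_date="2018-06-30",
)
print(smart_gaps(indicator))    # ['Accountable']
```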
86. Performance indicators should be as specific, direct, and unambiguous as possible. Indicators
may be qualitative or quantitative. Quantitative indicators should be reported in terms of a specific
number (number, mean, or median) or percentage. “Percentages can also be expressed in a variety
of ways, e.g., percent that fell into a particular outcome category . . . percent that fell above or below
some targeted value . . . and percent that fell into particular outcome intervals…” (Hatry 1999, p. 63).
Outcome indicators are often expressed as the number or percent (proportion or rate) of something.
Managers of CIDP projects and programmes should consider including both forms.
87. Qualitative indicators/targets imply qualitative assessments. A qualitative indicator might measure a perception, such as the level of empowerment that local government officials feel to adequately do their jobs. Qualitative indicators might also include a description of a behaviour, such
as the level of mastery of a newly learned skill. Although there is a role for qualitative data, it is more
time consuming to collect, measure, and distil, especially in the early stages. Furthermore, qualitative
indicators are harder to verify because they often involve subjective judgments about circumstances
at a given time. For this reason, qualitative indicators should be used with caution.
88. Sometimes it is difficult to measure the outcome indicator directly, so proxy indicators are
needed. Indirect, or proxy, indicators should be used only when data for direct indicators are not avail-
able, when data collection will be too costly, or if it is not feasible to collect data at regular intervals.
However, caution should be exercised in using proxy indicators, because there has to be a presump-
tion that the proxy indicator is giving at least approximate evidence on performance.
89. Constructing indicators can be complex or difficult work. Therefore it is especially important
that competent technical and policy experts participate in the process of indicator construction. All
perspectives need to be taken into account when considering indicators. The indicators should be
substantively feasible, technically doable, and policy relevant.
90. Figure 2 contains an example of baseline data for a policy area – primary education. It builds
on the performance framework mentioned in section 3.3.2. The challenge is to obtain adequate base-
line information on each of the performance indicators for each outcome. This can quickly become
a complex process. It is important to be judicious in the number of indicators chosen, because each
indicator will need data collection, analysis and reporting systems behind it. The selected outcome
is to improve children’s learning. So in this case there must be an indicator for students. Scores on
achievement tests could be a suitable indicator that meets the “SMART” test.
91. The next challenge is to obtain adequate baseline information on each of the performance
indicators for each outcome presented in Figure 3. The following key questions should be asked in
building baseline information for every indicator: What are the sources of data? What are the data
collection methods? Who will collect the data? How often will the data be collected? What is the cost
and difficulty of collecting the data? Who will analyse the data? Who will report the data? Who will use
the data? These questions need to be answered for each of the identified indicators. Figure 2 presents
a framework that can be used to complete the required information for each of the indicators.
Figure 2. Developing Baseline Data for one Policy Area – Primary Education
Columns to be completed for each indicator (rows 1–4): Indicator; Data Source; Data Collection Method; Who will collect the data?; How frequently will the data be collected?; What will be the cost and difficulties involved in collecting the data?; Who will analyse the data?; Who will report on the data?; Who will use the data?
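Purely as an illustration of completing one row of the Figure 2 framework, the sketch below stores the answers to the nine questions for a single indicator; the keys mirror the column headings, and the values are hypothetical.

```python
# Hypothetical entry for one indicator in the Figure 2 baseline framework.
baseline_row = {
    "indicator": "Percentage of Grade 3 pupils passing a standard literacy test",
    "data_source": "County education department school records",
    "data_collection_method": "Annual standardised test administered in sampled schools",
    "who_will_collect": "Sub-county education officers",
    "collection_frequency": "Annually, at the end of the school year",
    "cost_and_difficulty": "Moderate; requires test administration and marking in all sampled schools",
    "who_will_analyse": "County M&E Unit",
    "who_will_report": "Department M&E Officer",
    "who_will_use": "CEC Member for Education and the CoMEC",
}

for question, answer in baseline_row.items():
    print(f"{question}: {answer}")
```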
tance of taking baselines seriously. There must be a clear understanding of the baseline starting point.
Another consideration is the funding and resource levels expected to be available through-
out the target period. A third factor is political concerns. For example, what has the county govern-
ment promised to deliver in its election manifesto?
95. Note that setting realistic targets involves the recognition that most desired outcomes are not
quickly achieved. Thus there is a need to establish targets as short-term objectives on the path to
achieving an outcome. The tendency is to set interim targets over shorter periods of time when inputs
can be better known or estimated. “Between the baseline and the targeted outcome, there may be
several milestones (interim targets) that correspond to expected performance at periodic intervals”11
(UNDP 2002, p. 66). In the case of CIMES, quarterly indicator targets should be entered into the Form
A8 for the CIDP as a whole, and in Form A10 for each individual project or programme.
96. The completed matrix of outcomes, indicators, baselines, and targets becomes the perfor-
mance framework. It defines outcomes and plans for the design of a results-based M&E system that
will, in turn, begin to provide information on whether interim targets are being achieved on the way
to the longer-term outcome. Figure 3 illustrates the completed performance framework for a coun-
ty education development policy area. The formula for arriving at the target performance involves
setting baseline indicator levels and desired levels of improvement over a specified period of time.
Figure 3. Example of a Framework for Developing Targets for Primary Education Policy Area
The desired improvement levels needed to realise the set targets should be arrived at through a par-
ticipatory and collaborative process with relevant stakeholders and any development partners.
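A worked sketch of the target-setting arithmetic described above: starting from a baseline level and a desired improvement over a specified period, interim (for example quarterly) targets can be set as milestones. Straight-line interpolation is an assumption made here for illustration; counties may well phase improvements differently.

```python
def interim_targets(baseline: float, desired_improvement: float, periods: int) -> list:
    """Spread a desired improvement over a number of interim periods (straight-line assumption)."""
    step = desired_improvement / periods
    return [round(baseline + step * p, 2) for p in range(1, periods + 1)]

# E.g. literacy pass rate: baseline of 60 percent, desired improvement of 8 percentage
# points over one year, tracked quarterly.
print(interim_targets(baseline=60.0, desired_improvement=8.0, periods=4))
# [62.0, 64.0, 66.0, 68.0]
```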
3.3.6 Step 6: Regularly Collect Data to Assess Whether the Targets are Met
97. Monitoring tests whether the targets set are achieved in reality. Since targets are set quarterly,
indicator data should be collected quarterly or as designated in the CIDP Results Matrix Form A8 or the Project LFM Form A10. Data to monitor CIDP, CDF or devolved funds may be collected quarterly,
monthly, biannually or annually as appropriate. There is a pre-existing responsibility and process in
the counties which is to report the PC results of each CEC member and their respective ministry quar-
terly by the 14th of the month following each quarter. The data collection process for PC results is
already required, and the results are already assessed against targets.
98. The monitoring system strategy being designed should include a clear data collection and anal-
ysis plan, detailing the following: units of analysis (for example, school district, community, hospital,
village, region); sampling procedures; data collection instruments to be used; frequency of data collection; expected methods of data analysis and interpretation; those responsible for collecting the data; data collection partners, if any; those responsible for analysing, interpreting and reporting data;
for whom the information is needed; dissemination procedures; and follow-up on findings.
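The elements of such a plan can be listed in a simple structure. The sketch below is only an illustration of the checklist in the paragraph above; the field names and values are assumptions, not prescribed CIMES fields.

```python
# Illustrative data collection and analysis plan for one monitored programme.
data_collection_plan = {
    "unit_of_analysis": "health facility",
    "sampling_procedure": "all county-run facilities (census rather than sample)",
    "data_collection_instruments": ["facility reporting form", "client exit interview"],
    "collection_frequency": "quarterly",
    "analysis_methods": ["trend analysis against quarterly targets"],
    "responsible_for_collection": "facility in-charges",
    "data_collection_partners": None,
    "responsible_for_analysis_and_reporting": "Department M&E Officer",
    "information_users": ["CEC Member for Health", "County M&E Unit", "CoMEC"],
    "dissemination_procedures": "quarterly sector review meeting",
    "follow_up_on_findings": "corrective actions recorded in the next quarterly workplan",
}
```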
99. People need to know and understand what information is being collected by their own depart-
ment and by other departments within the county. For instance, there might be one county depart-
ment that is collecting data that would be suitable for another. In addition, if each department starts
its own information system, there may not be sufficient capacity to sustain all of the systems, and
there may be incompatibilities.
100. Timeliness consists of three elements: frequency (how often data are collected); currency (how
recently data have been collected); and accessibility (data availability to support management de-
cisions). If the data are not available to county decision-makers when they need it, the information
becomes mere historical data. County management teams require accurate and timely information.
Real-time, continuous data that decision-makers can use to lead and manage in their work environ-
ment is essential. It makes little sense to manage county governments using data that may be three
or more years old.
101. The data are collected by the project managers responsible for each project of the CIDP, and re-
ported in the CIDP Results Matrix Form A8. Results are approved by the Director12 of the Department
in which the project is located; the approved results are then collated in each department by the M&E
Officer responsible for that Department, and the M&E Unit thereafter compiles an M&E report that
is passed to the CoMEC for approval and onward submission to the PMS Unit and the relevant M&E
committees.
102. The intent of the project manager's reporting and evaluation, followed by the director's approval, is to verify results and record lessons learned, which then feed into the subsequent decision-making process. It is the responsibility of the project manager to record lessons learned; of the M&E Officer to collate learnings per county department; and of the County M&E Unit to consolidate and
disseminate learnings for the county. An evaluation can also be undertaken to confirm the viability of
the design before a project or programme is implemented.
12 In a case where a project manager reports to the deputy director of a county department, the latter will be responsible for ap-
proving the Results Matrix.
Report (CAPER) are produced, based on any sub-county disaggregated reporting and according to the county's annual calendar, as indicated in Appendix A14. This timing is necessary and should be strictly followed to facilitate timely input to the national APR as per the calendar specified in the National M&E Framework.

villages. Preparation of detailed progress reports for the sub-counties and below is not included in these guidelines.
110. Form A11 can be used to aggregate the results of multiple Project LFM Forms to produce an aggregated result for the county (CoMER), a sector (SMER), a sub-county (SCoMER), a ward (WaMER) or a village (ViMER). In practice, these disaggregated reports should only be considered when there is robust capacity and timely, dependable M&E and economic reporting at county level. Appendix A14 will be added to and updated from time to time to provide simple M&E Report (MER) Forms, and will similarly provide simple Economic Progress Report Forms, with versions for sub-county levels as and when required.
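A minimal sketch of how this kind of aggregation might work in a computerised system is shown below; the records, field names and grouping keys are assumptions for illustration only, not the actual Form A11 schema.

    # Illustrative sketch: rolling project-level results up to ward (WaMER),
    # sub-county (SCoMER), sector (SMER) or county (CoMER) level. Assumed schema.
    from collections import defaultdict

    project_results = [
        {"sector": "Health", "sub_county": "A", "ward": "A1", "value": 120},
        {"sector": "Health", "sub_county": "A", "ward": "A2", "value": 80},
        {"sector": "Education", "sub_county": "B", "ward": "B1", "value": 300},
    ]

    def aggregate(results, level):
        """Sum reported values by the chosen level: 'sector', 'sub_county' or 'ward'."""
        totals = defaultdict(int)
        for record in results:
            totals[record[level]] += record["value"]
        return dict(totals)

    print(aggregate(project_results, "sector"))      # SMER-style totals
    print(aggregate(project_results, "sub_county"))  # SCoMER-style totals
    print(sum(r["value"] for r in project_results))  # CoMER-style county total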
ty assemblies, Council of Governors and the National and county government summit; to engage citizens, development partners and other stakeholders; and to inform constitutional commissions and independent offices. In this way the legal responsibilities of the county government to report and contribute to national M&E products, including the APRs and the national aggregation of M&E reports, will be fulfilled.
112. At county sector level, a quarterly review of results for each sector department should be chaired by the CEC Member responsible for the sector in the county, with the Chief Officer as Secretary, to determine the level of ADP and workplan progress for the department, and to define preventive and corrective actions as required. The content of this report is prepared by the M&E Officer for the Department; however, its content and presentation are the responsibility of the Director responsible for the Department.
113. The quarterly sectoral review meetings mentioned above are followed by a quarterly review of
aggregate results, broken down by county departments, to determine the level of progress made in
the entire county and to define preventive and corrective actions as required. This quarterly review
meeting is chaired by the Governor, with the County Secretary as secretary. The preparation of re-
ports at county level is the responsibility of the M&E Unit, and the formal approval is the responsibility
of the CoMEC. The CoMEC will review the findings of the county APR, and recommend actions to be
taken to address issues highlighted in the report.
116. Sustained use of M&E is better ensured through the clear roles, responsibilities and institutional arrangements set out in chapter 4 of these Guidelines. Further, processes, habits, roles and responsibilities all combine to strengthen the ongoing readiness of the county for M&E.
117. A draft County Performance Management System Handbook (PMS Handbook) is being pre-
pared to further assist in sustaining M&E as a part of the County Performance Management Cycle. The
Handbook will provide further detail of roles, responsibilities, job descriptions and meeting agendas
that may be used to operationalise and embed M&E within the county.
118. To ensure the sustainability of county M&E, be aware of the following critical components of
demand:
• Political Leadership — from the Governor’s Office and from the MCAs
• Adequate Budget — budget is set aside for both the projects to be monitored and for the
M&E process itself
• Clear Roles and Responsibilities — Establish formal and clear organisational lines of au-
thority for collecting, analysing and reporting performance information
• Use of Trustworthy and Credible Information — information produced by the M&E system should be transparent and subject to independent verification
• Accountability and Transparency — the media, the private sector and others all have ac-
cess to the M&E reports, and problems are acknowledged and addressed.
• Regular Review Meetings — results- and action-focussed meetings are regularly con-
ducted to ensure that accountable officers are delivering the results expected
• Capacity building — sound technical skills in data collection and analysis are built; and
also managerial skills in strategic goal setting and organisational development
• Incentives — success is acknowledged and rewarded; lessons learnt are applied to influ-
ence design of future projects
• Relevance — M&E reports serve the needs of county senior management staff, including the Governor, CEC members and the County Commissioner.
[Figure: Institutional set-up for county M&E, showing the County Monitoring and Evaluation Committee (CoMEC) and the Sector Monitoring and Evaluation Committees (SMEC): oversight of the delivery, quality, timeliness and fitness for purpose of M&E reports; sector-level monitoring; and the prescribing of methodologies and provision of expert advice in support of the functions of CoMEC in the county. Prepared by Gaiasoft for the Monitoring and Evaluation Department, Ministry of Devolution & Planning.]
4 Institutional Set-up for County M&E
119. This chapter provides guidelines for M&E reporting arrangements in a county government. County public service performance management involves strategic planning, work planning, target setting, tracking performance through the M&E system, and reporting. To understand better the M&E reporting arrangements at county level, we give a brief description of the organisational structure, and of work planning and the setting of performance targets for individual staff members of the county government.
4.2 Work Planning and Setting of Performance Targets
121. At the beginning of every financial year, every project manager who is responsible for implementation of a CIDP project or programme develops an individual work plan and signs a performance appraisal report with his/her supervisor (i.e. the Director of the county Department in which the project/programme is located), based on the agreed project/programme performance indicators and targets outlined in the CIDP Results Matrix Form A8. The individual work plans are derived from the departmental work plans and the officer's job description. The work plan briefly describes the M&E performance targets or expected results from specific tasks and activities for which the project manager is responsible during that fiscal year. The M&E performance indicators and targets for each project manager are then collated by the M&E Officer responsible for each departmental work plan, which includes, among others, the departmental priority projects and programmes outlined in the CIDP.
122. Directors of Departments will thereafter discuss the work plan and performance targets with
individual project managers, and ensure that the objectives and performance targets of the depart-
ment are understood. The expected results may include tracking progress on agreed activities of CIDP
projects/programmes during the period of assessment. For each activity to be assessed properly,
there must be clear and measurable indicators of success. A chart showing the County Governments
Administrative Structure, based on administrative staff, is presented in Appendix A13.
Table 3. Membership and Responsibilities of Major Committees on M&E Preparation and Reporting

(Row continued from the previous page)
government nominated by the County Commissioner in writing.

Membership:
• Heads of technical departments of the national government at county level
• County chief officers
• County Assembly Clerk
• Court Registrar
• Representatives from devolved funds
• Technical Representatives managing all other Non-Devolved Funds in the County

Convenor:
• Chief Officer or county director responsible for planning and M&E functions

Responsibilities:
• Drive service delivery through Performance Management and M&E.
• Receive, review and approve county and sub-county CIDP, Annual Development Plans, workplans, M&E workplans and M&E reports.
• Convening County Citizen Participation Fora.
• Mobilisation of resources to undertake M&E at county and sub-county level.
• Approve and endorse final county indicators.
• Submission of M&E reports to CEC, Council of Governors, constitutional offices and other relevant institutions, including MED.
• Dissemination of M&E reports and other findings to stakeholders, including to County Fora.
M&E Unit
Chair: Director of County Economic Planning Department.
Membership:
• M&E Officers under Director of Economic Planning.
Convenor: County M&E Officer (CoM&EO).
Responsibilities:
• Provide technical support and coordination of CIMES, including its institutionalisation within the county;
• Prepare periodic CIMES performance reports for presentation to CoMEC;
• Supporting the development of capacity for M&E through training, coaching and mentoring.
Frequency: Quarterly
Service Delivery Secretariat (Optional)
Membership:
• Efficiency Officers reporting on behalf of each department to the Governor's Office. SDS members may be called upon to attend CoMEC meetings as information and evidence providers.
Responsibilities:
• Reports directly to the Governor's Office on service delivery and accountability issues to drive CIDP implementation and results.
• Provides real-time information for use by the CoMEC.
• Governor's office and chief officers do not need to wait for CoMEC-vetted and approved reports to know the status of service delivery.
• However, the CoMEC is responsible for final vetting of reports for release to recipients.
Frequency: Monthly
Stakeholder Responsibilities

County Governor
• Provides vision and leadership and drives delivery of the CIDP projects and programmes through each ADP.
• Holds county CEC Members to account through their Performance Contracts.
• Chairs Performance Management and M&E sessions at county level (executive committee meetings).
• Holds CEC Members and County Secretary to account for use of the PMS to provide real-time reporting on service delivery and results.

County Secretary
• Responsible for coordination of activities in county government.
• Personally accountable for ensuring that all Chief Officers' ministries operate as required.
• Provides timely and accurate reporting according to the County PMS Policy.

Development Planning Director
• Coordinates integrated development planning within the county.
• Ensures integrated planning within the county.
• Ensures linkages between CIDP, MTP and Vision 2030.
• Ensures meaningful engagement of citizens in the CIMES and CIDP preparation and implementation processes.
• Ensures the collection, collation, storage and updating of data and information needed for the planning and M&E processes.
of sector departments at county level

County M&E Officers
A. Set up the monitoring and evaluation system:
a. Develop the overall framework of the integrated monitoring and evaluation activities;
b. Clarify the responsibilities and prepare the work plan and the detailed budget for
the monitoring and evaluation activities;
c. Supervise the work of the Monitoring and Evaluation office staff; provide guidance
and technical support;
d. Guide and coordinate the review of the Results Matrix including:
i. ensuring that realistic intermediate and end-of-programme/project targets are
defined;
ii. conducting a baseline study on monitoring and evaluation;
iii. identifying sources of data, collection methods and resources needed and related cost;
e. Contribute to the development of the county M&E Implementation Plan, ensuring
alignment with CIDP, agreement on project/programme indicators and inclusion
of monitoring and evaluation activities in the work plan;
f. Establish contacts with national and other county monitoring and evaluation
stakeholders;
g. Review and provide feedback to programmes on the quality of methodologies es-
tablished to collect monitoring data, and document the protocols that are in place
for the collection and aggregation of this data;
h. Establish an effective system for assessing the validity of monitoring and evalua-
tion data through a review of CIDP implementation activities, completed monitor-
ing forms/databases, and a review of aggregate-level statistics reported;
B. Implementation of monitoring and evaluation activities
a. Oversee the monitoring and evaluation activities included in the CIDP, with par-
ticular focus on results and impacts as well as in lesson learning;
b. Promote a results-based approach to monitoring and evaluation, emphasising
results and impacts;
c. Coordinate the preparation of all monitoring and evaluation reports; guide staff
and executing partners in preparing their progress reports in accordance with
approved reporting formats and ensure their timely submission;
d. Prepare consolidated progress reports for the CoMEC, including identification of
problems, causes of potential bottlenecks in implementation, and provision of
specific recommendations;
e. Check that monitoring data are discussed in the appropriate committees, (includ-
ing citizens participation fora), and in a timely fashion in terms of implications for
future action;
f. Undertake regular field visits to support implementation of monitoring and
evaluation, check the quality of data produced, and identify where adaptations
might be needed; monitor the follow up of evaluation recommendations with
Programme Managers;
g. Foster participatory planning and monitoring;
h. Organise and provide refresher training in monitoring and evaluation for the staff of CIDP projects/programmes and other implementing agencies, county-based NGOs and key county stakeholders, with a view to developing local monitoring and evaluation capacity;
C. Lessons learnt:
a. Consolidate a culture of lesson learning involving all project staff, and allocate specific responsibilities;
b. Facilitate exchange of experiences by supporting and coordinating participation in networks of CM&EOs working in all the county governments sharing common characteristics;
c. Identify and participate in additional networks, such as NIMES networks, that may also yield lessons that can benefit implementation of CIMES.
Central level representative (MED)
• External Facilitator and neutral validator

Representatives from other counties
• Other stakeholders
County M&E Unit (with two sub-units: one for county and one for national functions)
• To be headed by a County M&E Officer, assisted by several sector M&E officers/focal points, each responsible for compilation of M&E data for a number of projects/programmes of specified departments and national government, and by several IT Officers assisting the county departments with M&E computerisation activities. The M&E Officer and ICT Officer ensure that the PMS system is supported by projects in their county departments. The M&E Officer works with the M&E Technical Committee.
• The overall responsibility for ensuring use of the M&E system in the county lies with the Director of the County Economic Planning Department, who is also responsible for preparation of the overall CIDP document, and who works closely with the Chief Officer in the Governor's office to ensure timely production of M&E reports.
Technical Oversight Committee
• Sets the strategic direction for CIMES.
• Approves M&E Unit's work plan and advises M&E unit on actions to be taken on various M&E issues.
• Approves indicator reports for use by CoMEC.
• Endorses M&E unit's reports to be presented to CoMEC and to the Governor and CEC.
• Chaired by Director of County Planning Department.
• Reports to County Secretary.
127. Accountability and responsiveness are improved when information flows transparently through
a service delivery organisation or project. Preparation of M&E and related performance information
should happen as close to the point of service delivery as possible, with project managers held ac-
countable for ensuring that their projects are suitably monitored and evaluated. This means that the
project sheets/forms presented in the Appendices to these Guidelines must be regularly completed
and updated, and the results submitted to the relevant staff in the M&E Unit. The M&E Evidence
Base must be made accessible and visible within the M&E county information system. This could be
achieved much more readily through the development of an online system that meets the selection
criteria in Appendix A9.
129. The Governor’s roles relating to the M&E process include: (i) Championing, tracking and im-
plementing the county’s vision through the CIDP; (ii) Ensuring that M&E structures are established in
the county; (iii) Championing M&E and Performance Management as tools for delivery of develop-
ment and services in the county; (iv) Promoting the role of the M&E Unit in advancing Results Based
Management and public service delivery that ensures the CIDP objectives and outcomes meet the
needs of citizens; and (v) Sharing County APR reports on implementation of the CIDP with the County
Assembly, MED, the Council of Governors, county citizens and other stakeholders.
ment and services in the county; (iii) work with the Director of the Economic Planning Department to ensure timely production and distribution of the County APR report on CIDP implementation to the CEC, the Intergovernmental Forum, the County Assembly, the MDP, the Intergovernmental Summit and the Senate.
[Figure 4. Role of the CoMEC in County Monitoring & Performance: the Non-Devolved M&E Unit and the Devolved M&E Unit each act as Devolved Secretariat to CoMEC, supporting compliance with the M&E Policy and the PMS Policy respectively; the figure also shows the Governors, stakeholder reporting and consultations, information sharing, mandate, and templates, processes and checklists.]
mentation and evaluation – both directly and through Efficiency Officers embedded in devolved functions
• Is the one stop-shop for instant access to all county development reports
• Uses technology-supported Performance/M&E/Reporting systems for efficient, account-
able and transparent working
• Prepares reports for all Constitutional Commissions and independent offices, e.g. CoB
and Commission of Revenue Allocation (CRA), and provides evidence of value for money
in support of the Audit office (AOG)
• Ensures programmes are implemented as per the budget, the CIDP and the Annual Work
Plans
• Reports to the Governor, the County Executive Committee and CoMEC
• Makes annual progress reports available to Members of the County Assembly
• Provides flow of approved results information for media and citizens
136. The work of the SDS and CoMEC may be supported by inclusion of compliance and reporting
requirements in the job descriptions and performance contracts of Chief Officers, project and service
managers.
137. It is highly recommended that SDS work be computerised. If no computerised system is in place, the SDS has the huge task of collating and aggregating the data in hundreds or thousands of project sheets from all projects for each sector within the county, and of delivering sector-level performance reports from these. Manual processing of project data is therefore more than likely to delay the SDS's key function of submitting reports in real time to the Governor's Office, CoMEC and other stakeholders.
138. The less analytical capacity the County has, the more important it is for the county to have au-
tomated Performance Management and M&E Systems, rather than rely on spreadsheet documents
which must be manually managed in a cumbersome process. Appendix A9 provides criteria for the
selection of an online system for performance management of the CIDP implementation.
5 Reporting, Dissemination and Citizen Engagement
139. This chapter helps county management teams, programme and project staff and staff of the
county M&E units to effectively apply information from monitoring and evaluation in their daily work.
This is done in the context of accountability, performance improvement, decision making and learn-
ing. It emphasises the need to design standard forms that can be used by all county governments
to collect data and other information used in compiling M&E progress reports. It further presents an
effective means of publishing and disseminating M&E results.
140. Directors of county departments are accountable for establishing M&E work plans for their in-
dividual departments, and also for the M&E results structure, which links all programmes/projects of
the department to the expected CIDP outcomes. This is the basis for performance monitoring and
reporting, and its development is monitored closely by the county M&E Unit to ensure adherence to
the CIMES guidelines and the overall national M&E Policy. The performance monitoring of individual
project/programmes is an ongoing responsibility of individual PDOs and their supervisors (senior
project delivery managers), although development of the underlying programme/project results ma-
trix and the identification of appropriate performance targets and indicators will be undertaken with
the assistance of the County M&E Officer.
programme/project is having or has had the desired impact; and (e) whether new information has
emerged that requires a strengthening and/or modification to the project management plan.
142. Standardised M&E reporting forms enable M&E unit staff to aggregate data from many projects/programmes. They also facilitate aggregation of M&E reports from the 47 county governments.
A simple, standardised reporting format also reduces the need for capacity building required to do
the job, and enables comparison of results within and between counties. For this reason, all counties
are required to report based on standard reporting templates that are similar and simplified so as to
eliminate unnecessary reporting burdens.
143. Priority should be given to preparing high quality reports at county level annually, followed by brief quarterly reports at sector and sub-county levels. MED and CoG have developed simplified reporting formats, and will soon provide these to the county governments. A summary of the reports to be produced by every county government is outlined in Appendix A14.
(c) Sub-counties, through the SCoMEC, submit their reports to the County M&E unit within seven days after the end of the quarter to which the report refers.
(d) County M&E units can thereafter compile the county M&E report for onward submission
to CoMEC.
148. If CIMES is computerised, all of these reports will automatically populate a CoMER (County M&E Report) template within e-CIMES. The County M&E Officer will edit the free sections of the CoMER e-form to complete the quarterly and annual progress report (APR) for the county. Reporting for the Sub-county will be completed by the 15th day of the month following the reference month, and the county APR will be completed by 31st July. This fits in with the timeline of the National M&E Framework, which requires that counties submit their M&E reports to MED by 30th August for MED to prepare its national APR in good time.
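A small, hedged illustration of the calendar logic described in this paragraph follows; the helper functions are assumptions for illustration and are not a feature of e-CIMES.

    # Illustrative sketch of the deadlines above: sub-county reporting by the 15th of
    # the month following the reference month, the county APR by 31 July, and county
    # submissions to MED by 30 August. Assumed helpers, not an official tool.
    from datetime import date

    def subcounty_deadline(reference_month: date) -> date:
        """15th day of the month following the reference month."""
        year, month = reference_month.year, reference_month.month
        if month == 12:
            year, month = year + 1, 1
        else:
            month += 1
        return date(year, month, 15)

    def county_apr_deadline(year: int) -> date:
        return date(year, 7, 31)   # county APR completed by 31 July

    def med_submission_deadline(year: int) -> date:
        return date(year, 8, 30)   # county M&E reports to MED by 30 August

    print(subcounty_deadline(date(2016, 3, 1)))                  # 2016-04-15
    print(county_apr_deadline(2016), med_submission_deadline(2016))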
149. For reasons given in the above paragraph, these M&E Guidelines recommend electronic cap-
ture of results into e-CIMES, which could also be linked to the county PMS. At the county level, CIMES
is a PMS with knowledge sharing links to CoG and to MED. It is at the discretion of the county to have
WaMER forms filled in at Sub-county level for their wards, and to have ViMER forms filled in at ward
level for their villages. A village M&E report may be as simple as monitoring the key development
priorities and issues of the village. If these forms are filled at Ward level for villages, the MCA for the
Ward may then choose to engage citizens in contributing to and approving the forms at Ward level to
help to drive forward the priority projects/programmes of each village in the Ward.
150. Where production of progress reports is devolved to the village level, village Secretaries may
engage a technically skilled youth to fill the appropriate form on a mobile device. Over time and
based on experience, CoG and MED will extend and improve the CIMES online forms used for report-
ing by counties, sub-counties, wards and villages. Initially, these forms must be simple enough to
ensure success, but easily extensible as the capacity of counties and depth of capacity within counties
grows. Each M&E form will prescribe how the M&E will be conducted, what data will be required, and
the depth of reporting.
155. It is recommended that all progress reports, including interim reports, are made available to the
Commissions and Independent offices created by the Constitution of Kenya to build trust and receive
feedback from them early in the budget cycle. In every county, it is the county M&E unit’s responsibil-
ity, working closely with county SDS where this exists, to produce reports. As emphasised in chapter
3 of these Guidelines, the M&E reports should be prepared in consultation with the key producers and
users of the reported information. The data come from aggregated data in the form of Appendix A12 (or similar formats), produced by M&E Officers and their Directors of departments in the devolved functions, or by persons of equivalent rank managing devolved funds and other related agencies.
156. Recommended standard progress reporting systems have been developed by the Council of
Governors and in consultation with MED. These formats will be improved over time. They should be
automated using the Performance Management and M&E System developed by counties. Continuous
improvement of these report formats should be the responsibility of the M&E Units, working closely
with MED and the Council of Governors.
157. The performance of the county against service delivery, programme and project goals and
targets is measured through the ADP Results Matrix. All projects and services contribute to results.
These results should be gathered through completing and updating Project Sheets. The M&E Officers
responsible for each department should then aggregate the collected information by sector, under
the guidance of the M&E Unit. Consolidation of the results to sector level targets should be done
using an auditable and reliable system. (Spreadsheet aggregation is time consuming, unreliable and
expensive to audit.)
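A minimal sketch of what auditable consolidation could look like is given below; the structure is an assumption intended only to show the principle of recording who reported which figure, and when, alongside the aggregated total.

    # Illustrative sketch of auditable consolidation of project results to a sector
    # figure (assumed structure, not a prescribed CIMES component). Each update is
    # logged so the aggregated figure can be traced back to its sources.
    from datetime import datetime, timezone

    audit_log = []

    def record_result(project, indicator, value, officer):
        entry = {
            "project": project,
            "indicator": indicator,
            "value": value,
            "updated_by": officer,
            "updated_at": datetime.now(timezone.utc).isoformat(),
        }
        audit_log.append(entry)
        return entry

    def sector_total(indicator):
        """Sum the logged values for one indicator across projects."""
        return sum(e["value"] for e in audit_log if e["indicator"] == indicator)

    record_result("Health centre A", "assisted deliveries", 40, "PDO-01")
    record_result("Health centre B", "assisted deliveries", 55, "PDO-02")
    print(sector_total("assisted deliveries"))  # 95, with audit_log as the evidence trail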
158. Based on the review of aggregated progress reports at sector level, weekly or monthly corrective
actions should be defined and delegated at sector level. These sector reports are prepared by M&E Officers
for each Director of Department and submitted to the county M&E Unit for validation, aggregation and
production of reports, which are forwarded to the PMS Secretariat, the CEC and CoMEC. Monthly or quar-
terly performance reviews are recommended at the county level, prepared by the M&E Unit in close collab-
oration with SDS where this exists, and submitted to CoMEC and the Governor’s Office.
CIDP, it is recommended that the Governor's Office and county M&E Unit work together to develop performance reports for the County Governor, the CEC, the County Development Board, the County Assembly and the Senate. Performance reports should also include county projects that strengthen the performance of, for example, Independent Commissions, or the building of law courts in the counties. Quick and simple configuration of easy-to-use reports by the M&E Unit staff is an important requirement of effective Performance Management and M&E systems (see Appendix A9).
160. Wherever progress reporting from county to national departments is required, the data cap-
ture should be carried out consistently in all counties, with the aggregate results made available
for comparison at national and county levels. This is another important reason for the Performance
Management and M&E system to sit on a commonly shared service platform for all counties. In the
absence of a shared system, the Council of Governors and MED should prescribe appropriate formats.
163. Data capture and analysis should be done online, and be a shared service so that the data of
all counties can be benchmarked, compared and accessed subject to security rights by consumers of
that data. This does not mean that all counties or national government entities can access the data of
a county, but rather that all data is stored in a compatible form, providing benefits of comparison and
economies of scale. Each national data user, including Independent Commissions and Independent
offices, should be able to access the subset of information relating to their role and function. As stated
in the Selection Criteria of Appendix A9, this should be a requirement of any ICT support platform for
County M&E.
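A hedged sketch of the idea that each data user sees only the subset relevant to its role is shown below; the role names, domains and records are assumptions, not a specification of any ICT platform.

    # Illustrative sketch only: filtering a shared county dataset by the consumer's
    # security role. Role names and records are assumed for illustration.

    records = [
        {"county": "County X", "domain": "budget", "indicator": "Audit qualifications (% of expenditure)"},
        {"county": "County X", "domain": "health", "indicator": "Assisted deliveries"},
        {"county": "County Y", "domain": "budget", "indicator": "Capital budget spent on capital projects (%)"},
    ]

    role_domains = {
        "audit_office": {"budget"},
        "health_commission": {"health"},
    }

    def accessible(records, role):
        """Return only the records a given role is entitled to see."""
        allowed = role_domains.get(role, set())
        return [r for r in records if r["domain"] in allowed]

    print(accessible(records, "audit_office"))  # budget records from all counties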
5.2 Dissemination
164. There are several important reasons for using and disseminating county M&E results. The
Constitution of Kenya considers that most M&E Reports must be available to the public, and as such
should be shared with county citizens and other stakeholders. Other reasons for disseminating M&E
results include: (i) to improve programme/project interventions; (ii) to strengthen projects/pro-
grammes institutionally; (iii) to advocate for additional resources; (iv) to create citizen awareness and
ownership, and promote “people-friendly” policies; (v) to ensure that county development activities
are captured in NIMES; and (vi) to contribute to the county and national understanding of what works.
This chapter explains these reasons.
should be evidence-based and representative and not selective or misleading. For this reason, disseminated information should be linked to the complete evidence from which the claimed success or failure is drawn.
adoption. It is recommended that each county assembly creates a calendar that will allow county
assembly representatives to review CoMEC reports biannually.
(c) Press releases can generate media coverage of findings. As more people gain access to
newspapers, radio, television and the Internet, media coverage of M&E findings is gaining
in importance. Many programmes find that the most effective way to reach policymakers
is to encourage media coverage of their evaluation results.
Fact sheets convey findings in a short, concise format. They are especially effective for advocacy, con-
veying information to policymakers and those who do not have the time to read longer reports.
175. Other channels for disseminating M&E Reports and information include:
• Smartphone and tablet computer access
• Performance Management updates
• Performance Dashboards
• Open Data Portals
disregarded.
6 Operationalising CIMES
185. Operationalising CIMES is based on the structure of the county (Appendix A13) and delivers reports as set out in Appendix A14. The committees (Table 1) and the major players (Table 4) have a responsibility to deliver reports according to the reporting calendar, as also defined by the reporting obligations of the county set out in Appendix A15. Producers, approvers and recipients of reports are also detailed in Appendix A14.
186. It is important that the county leadership – both the Governor’s Office and the County Secretary
– are active drivers and users of the CIMES. Refer to the Readiness Checklist of Appendix A2 for criteria
that must be met for initiating county M&E for results.
187. To ensure the required emphasis on M&E receives full attention from all county staff involved
in its preparation and reporting, M&E targets and indicators should be linked directly to the perfor-
mance management of the county, including Performance Contracts of CEC members and the work
plans of chief officers and ministries.
188. As recognised in the principles of Chapter 3, M&E must be incorporated as a part of the larg-
er CIDP planning, performance management and implementation cycle. A separate County PMS
Handbook addresses the detailed responsibilities in the county PMS implementation programme,
which will ensure that the M&E of CIDP and ADP projects is linked to PMS in every county. Quarterly
performance reviews by ministry and for the entire county should be chaired by the CEC Member and
the Governor respectively. The schedule of meetings and related agendas, roles and responsibilities
are detailed in the County PMS Handbook.
189. County PMS processes incorporate evidence-based M&E, following the ten steps presented in
Chapter 3. Staffing of the Service Delivery Secretariat is drawn from each county department. Further,
the M&E of individual projects within the CIDP Results Matrix is performed by project managers and
approved by the respective Director of the Department in which each project is located. Results are
required quarterly, and meetings to discuss these results should be chaired by the Governor, who
should also ensure that all ministerial programmes and priority projects are monitored and managed
for results. Project managers and their Directors are required to ensure that the county M&E pro-
gramme is given adequate attention within the county performance management process.
190. To get started with county M&E, use the 10 steps of Chapter 3 in the given sequence. To opera-
tionalise the regular process of county M&E and performance management, use the detailed recom-
mendations and processes of the County PMS Handbook.
To assist all counties to monitor and evaluate the implementation of their CIDPs, MED has been de-
veloping these Guidelines, and will work with the Council of Governors and the Kenya School of
use indicators to inform their work, and KNBS have developed a template of indicators for county governments. Where possible, it is important to ensure that the core indicators in CIMES are aligned to the revised MTP-2 indicators. Core indicators that are regularly collected and made available for official use are presented in Appendix A1.
197. The CIDP Guidelines and the County Governments Act No. 17 of 2012 outline priority indicators that must be collected as follows:
“Each CIDP should provide clear input, output and outcome performance indicators, includ-
ing the percentage of households with access to basic services contemplated under Article
43 of the Bill of Rights of the Constitution; the percentage of a county’s capital budget ac-
tually spent on capital projects identified for a particular financial year in terms of the coun-
ty’s ADP; the number of jobs created through any local economic development initiatives,
including capital projects; and financial viability of the integrated development plan in ac-
cordance with nationally applicable ratios.”
198. MED will work closely with CoG to ensure inter-county coordination during the development of CIMES for each county government. In addition, MED will provide assistance for the capacity building needed at county level. It will also maintain research and M&E services that cater for the needs of both county and national governments.
necessary funds in accordance with their procedures, which could take considerable time and effort.
204. Human resources are critical for effective monitoring and evaluation, even after securing ade-
quate financial resources. For high-quality monitoring and evaluation, there should be:
• Dedicated staff time — Specific staff members should be assigned to the M&E func-
tion. The practices of deployment of personnel for monitoring may vary among county
governments. County governments could establish monitoring and evaluation units with
specific terms of reference, skilled staff, work plans and other resources.
• Skilled personnel—Staff entrusted with monitoring should have the required techni-
cal expertise in the area. Where necessary, skill levels should be augmented to meet the
needs, taking into consideration the ongoing county investment portfolio.
205. Except for counties in strong fiscal positions and with low existing capacity, no additional staff
is mandated or proposed for additional M&E responsibilities outlined in the CIMES Guidelines. Most of
these responsibilities should be reassigned among existing county staff, especially planning officers
or economists working in the department responsible for county economic planning and develop-
ment. However, in order to ensure that the county staff members reassigned to M&E activities fully
meet the county government’s M&E needs and, ultimately, increase the quality of M&E programming
at county level, the skills of the selected staff will be augmented through in-house M&E training and
other courses to be offered by MED and other suitable M&E training institutions.
APPENDICES

A1 Core County Result Indicators
9 Pupil:Teacher ratio Ratio Ministry of Education Annually
10 Textbook:Pupil ratio Ratio Ministry of Education Annually
11 Primary exam result average (Math) No. National Examination Board Annually
12 Primary exam result average (English) No. National Examination Board Annually
3. Energy 13 Electricity (% households 2009) % KNBS Periodically
4. Gender 14 Women in County Assemblies % Ministry of Devolution and Planning Annually
15 Proportion of women recruited in the public sector % Ministry of Devolution and Planning Annually
5. General information 16 Population size No. KNBS Periodically
18 Annual population growth rate (1999-2009) % KNBS Periodically
19 Surface area (km²) No. KNBS Periodically
20 Density (people per km²) No. KNBS Periodically
21 Poverty rate, based on KIHBS (%) % KNBS Periodically
22 Share of urban population (%) KNBS Periodically
23 Labour force participation rate % KNBS Periodically
access to piped water Environment, Water & Natural Resources
52 Rural households with access to safe drinking water % Ministry of Environment, Water & Natural Resources Periodically
53 Urban households with individual or shared access to toilet facilities % Ministry of Environment, Water & Natural Resources Periodically
54 Rural households with individual or shared access to toilet facilities % Ministry of Environment, Water & Natural Resources Periodically
Core County Indicators for Monitoring the Implementation of the
County Budget and Value-for-Money
15. Public participation in the annual budget process Meetings held, feedback in CIDP Annually CoG
16. Share of mandatory citizen engagement/consultation meetings held Percentage Annually Online
17. Percentage of county citizens that are aware of public planning, budget, and results consultations Percentage Annually Survey TBD
18. Share of awarded procurement contracts with decision criteria publicised Percentage Annually Survey TBD
19. County budgets published online Percentage Annually CoG
C. Budget cycle
20. Total value of audit qualifications (in '000 KES) Measured in KES, out of total expenditures Annually AOG
21. Audit qualifications in % of total expenditures Percentages of total expenditures Annually AOG
22. Settled audit qualifications in % of value of audit qualifications Out of total qualified Annually AOG
23. Compliance with cashbook standards (are counties using IFMIS for budget management, not just accounting) AOG to clarify measurement Annually AOG
D. Value for money
24. Average price paid for a bag of cement (50kgs) KES, AOG sample Annually AOG
25. Average price paid for a biro pen (normal) KES, AOG sample Annually AOG
26. Average price paid for a bottle of water (500ml) KES, AOG sample Annually AOG
27. Average price paid for photocopy paper (A4, 80gsm, one ream) KES, AOG sample Annually AOG
28. Average price paid for a desktop computer KES, AOG sample Annually AOG
A2 Readiness Checklist
The checklist below indicates what actions need to be completed by the county in each quarter of the financial year (mark each item ✔/✘). The bold items relate directly to these guidelines.
1 The Expenditure Review is completed in Q1.
2 CIDP Annual Update is made in Q2 through participatory dialogue and prioritization.
3 Annual Budget Revised Outlook Paper is produced from the CIDP update in Q3.
4 From this & the CIDP Annual Update, the Annual County Fiscal Strategy Paper is
produced, in Q3.
5 Annual County Fiscal Strategy Paper includes revenue modelling and budget by
sector, in Q3
6 In Q3 the Returns to the CoB of the Budget Outlook Paper must be made.
7 Budget by sector used by Chief Officers to define service and project plans in
Q4.
8 Annual Development Plan is produced combining sectorial plans in Q4.
9 Performance Contracts of County Secretary & Chief Officers from ADP in Q4.
10 From the ADP, the Annual Budget Estimate is produced in Q4.
11 The Assembly (MCAs) must be sensitized and approve the Budget based on
Annual Budget Estimate in Q4.
A3 CIDP Checklist
Use this checklist to test CIDP process and document quality. A quality CIDP is more likely to be
successfully implemented. Refer to CIDP guidelines Ch. 5-7, Sec 108 of CGA2012.
QUALITY CRITERION YES NO
1 Does CIDP reflect all plans/projects to be implemented in coming year by any organ of state? o o
2 Are the CIDP Implementation Matrix (Chap 5) and resource mobilisation (Chap 6) complete? o o
3 Were Priority Programmes and Projects developed as per Chap 7 of CIDP guidelines? o o
4 Has the CIDP been updated in preparation for the next Annual Development Plan? o o
5 Has there been full consultation to identify & prioritise needs based on value for money? o o
6 Have alternative scenarios for development been identified with diverse and expert input? o o
7 Were women and girls consulted and their priorities identified and included in CIDP? o o
8 Were youth consulted and their priorities identified and included? o o
9 Were private sector consulted and their priorities identified and included? o o
10 Were the development priority programmes and projects for the county (Chapter 7) identified:
(a) Through an inclusive participatory process? o o
(b) On the basis of a thorough County Development Analysis (Chapter 2)? o o
(c) Informed by the County Spatial Framework (Chapter 3)? o o
(d) Including linkage with other Plans (Chapter 4)? o o
11 Has a comprehensive implementation matrix been developed (Chapter 5)? o o
12 Is the implementation matrix supported by expected resource mobilisation projections (Chapter 6)? o o
13 Are the development priority programmes and projects described in sufficient detail, with respect to:
(a) Complete Logical Framework with Hierarchy of Objectives? (Use Project Sheet A10) o o
(b) Corresponding indicators at all four levels? o o
(c) Quantitative targets including deadlines and phased timing? o o
(d) Budget and personnel resources required? o o
14 Are Chief of Staff, Chief Officers and Efficiency Officers inspired & committed to the Plan? o o
A4 Annual Development Plan Checklist
The Annual Development Plan is the work plan for the current year of the CIDP. The CIDP is imple-
mented one Annual Development Plan at a time.
QUALITY CRITERION YES NO
1 Does the ADP turn the Governor's Vision into an actionable plan in the year? o o
2 Does the Annual Development Plan reflect the objectives of the CIDP for the year? o o
3 Has there been full consultation to identify & prioritise needs based on impact for money? o o
4 Were women and girls consulted and their priorities systematically identified and included? o o
5 Were youth consulted and their priorities systematically identified and included? o o
6 Were private sector consulted and their priorities systematically identified and included? o o
7 Has a comprehensive implementation matrix been developed (Chapter 5)? o o
8 Is this supported by expected resource mobilisation projections (Chapter 6)? o o
9 Are the development priority programmes and projects described in sufficient detail, with:
(a) Complete Logical Framework with Hierarchy of Objectives? o o
(b) Corresponding indicators at all four levels?
A5 Engagement Checklist
Has the county developed a stakeholder “map” identifying stakeholder groups to consult throughout
the CIDP process?
Stakeholder Group for Engagement / Quarterly Monitoring / % population in category / Representative Body / % of population represented by representative bodies
Business
Youth
Women
Civil Society
Religious and Faith Leaders
Minorities
People with disabilities
Business
Youth
Women
Civil Society
Rel. Leaders
Minorities
People with disabilities
Does the county have a working process for engagement and participation of stakeholder groups?
Stakeholder Group Please describe:
Business
Youth
Women
Civil Society
Religious Leaders
Minorities
People with disabilities
A6 Stakeholder Participation Assessment
Use this rating sheet to assess the degree of participation and engagement of key stakeholder groups
in development projects and service delivery of the county. Stakeholder groups should be those
identified in A5. The CIDP, Annual Development Plan and Sector Plans should all be informed by
Stakeholder Participation.
Achievement Description Stakeholder 1 Stakeholder 2 Stakeholder 3
Contact
Representative
Agenda
Round table
Included
Resourced
Output
Outcome
Impact
This maturity model is indicative of CIMES checklists.
A7 Maturity Model for Reporting Status of ADP Projects
A8 Template for CIDP and ADP Performance Management Results Matrix
Template for capture of CIDP/ADP M&E and Performance Management project results. Projects are listed in Chapter 7 of the CIDP in Tables i-iv.
This form indicates the information to be collected, but cannot be conveniently used in an A4 document format. In practice, to monitor, manage and evaluate CIDP projects an electronic system, e-CIMES, is required; to print this information, an A3 landscape printout is required. This template will be updated from time to time and will be available from MoDP (MED) and the Council of Governors.
Columns A-H:
A: Ministry/Sector
B: Priority (A, B, C)
C: Flagship (✔/✘ or Name)
D: Project is in ADP? (✔/✘)
E: In CEC Member's PC? (✔/✘)
F: Project or Service Name; Location/Division/Constituency
G: Status (see A7 for ADP Reporting Maturity Model)
H: SMART Objectives for each project
Columns I-O:
I: Measures for each Objective
J: Evidence to be provided for evaluation of each indicator
K: Annual and Quarterly Target for each objective
L: Description of Activities
M: SMART Actions (see the second page of Appendix A10 for format)
N: SRO (Senior Responsible Owner)
O: PDO (Project Delivery Officer)
COUNTY: ___________________ DATE: _______________
EFFICIENCY OFFICER: ___________________ MINISTRY: _____________________
1. Ensure the commitment of the Governor to performance management and M&E as a tool
for driving development results and a tool for holding CEC Members to account and for
CEC Members to drive results.
2. Ensure the commitment of the County Secretary to performance management and M&E
as a tool for coordinating the work of ministries and a mandated tool for Chief Officers
and Directors.
3. Complete form A8 to define three high priority and high impact ADP projects per ministry.
Columns A-H:
A: Ministry/Sector
B: Priority (A, B, C)
C: Flagship (✔/✘ or Name)
D: In ADP? (✔/✘)
E: In Mins PC? (✔/✘)
F: Project or Service Name; Location/Division/Constituency
G: Status (see A7 for ADP Reporting Maturity Model)
H: SMART Objectives for each project
Columns I-O:
I: Measures for each Objective
J: Evidence to be provided for evaluation of each indicator
K: Annual and Quarterly Target for each objective
L: Description of Activities
M: SMART Actions (see the second page of Appendix A10 for format)
N: SRO (Senior Responsible Owner)
O: PDO (Project Delivery Officer)
A9 Selection Criteria for Performance Management M&E System
Actor)
9 Configurable Results Matrix (CIDP, Annual Development Plan) o
10 Definable indicators, budgets, targets, variance, traffic lights o
11 Definable assessments using maturity models such as in Appendix A7 o
12 Definable security roles and access (based on county responsibilities) o
13 Delegation of administration rights for local administration o
14 Master administration rights for cross-cutting administration o
15 Indicator definitions can be updated as national or shared indicator definitions are changed and improved. o
16 The system must support multiple, concurrent read-write replicas to allow for distributed delivery. o
Audit, Approval and Evidence
17 Approval of indicators and evidence by Chief Officer reflects ultimate responsibility to the Governor. o
18 Individual accountability for indicator update, release and tracking of who needs to update what. o
19 Capture of indicator results with audit log of who made what update and when. o
20 Capture of evidence such as photos, scanned documents, etc. to demonstrate outputs and outcomes. o
21 Capture of historic results fixed after closing of reporting time window. o
A10 Project Sheet or Project Logical Framework Matrix (LFM)
Fill in this Project Sheet for each project or service for which budget has been allocated. These data
sheets are to be added as Annexes to the Annual Development Plan.
County Geo Coord. S
Sector Geo Coord. E
Project Name Responsible
Financial Year Start - End
Quarter Start - End
M&E Data Sheet in results matrix format
(To be read from bottom up: Inputs → Outputs → Outcomes → Impacts)
Columns A-D:
A: Hierarchy of Objectives
B: Indicators
C: Targets (Annual, Quarterly)
D: Means of Verification
Impact (optional) (COUNTY LEVEL INDICATORS)
- Direct Benefit (as assessed by citizens)
- Indirect Benefit (statistical sector indicators)
Outcomes
- Utilisation of services /
infrastructure
- Appreciation of services/
infrastructure.
(as assessed by citizens)
Outputs
- Physical structure
completed
- Services in place (or
advise/training provided,
or campaign done)
Inputs
- Financial Approved budget (in KSHs)
- Personnel Percentage (%) of
approved budget
received from start of FY
Funds (KSHs) spent in
the reporting period:
a) Total (KSHs)
b) Broken down on
major funding sources:
(KSHs):
A10 Project Sheet or Project Logical Framework Matrix (LFM) continued
Columns: SMART action required; Person Responsible for action (named user of system); Date (mm/dd/yy); Status (Initial, Accepted, Not Agreed, Not Now, Off Track, On Track, Revoked, Closed).
Project M&E Using Project Sheets
Use the template above to record a project results matrix for a project or Project Service Delivery Unit
(PSDU). Start with the 3-5 priority projects or services per ministry. Record the county (and sub-coun-
ty), ministry, project name, financial year as well as location or geo coordinates of the project where
applicable. Define the person responsible (PDO) and the planned start and end date. Targets should
be set for annual and quarterly achievements. For each target a means of verification should be de-
fined. The project sheet results should be updated at least quarterly by the PDO with the assistance of
the Efficiency Officer of their ministry. Key actions to be taken by the PDO or others should be record-
ed in the SMART Actions table of A10 and the status of these actions should be updated at least quar-
terly. The report, together with evidence and explanation, should be approved by the SPDM before
passing to the M&E Unit for collation with other results for quarterly M&E reports and performance
management reporting to CoMEC and the Governor’s office as appropriate.
This requires identifying input level indicators (money, resources, activities), output level indicators (describing the expected deliverables), outcome level indicators (utilisation and appreciation of deliverables), and – where practical – impact level indicators, i.e. benefits that are directly attributable to the utilisation of the deliverables. The logical levels, illustrated in the sketch after this list, are:
1. Inputs: Budget Implementation and Activities. Indicators refer to the degree to which the al-
located budget for the respective sector has a) been disbursed, and b) been utilised for the
planned activities.
• Example: % of allocated amounts for the construction of new health centres disbursed and spent to implement the planned health centre
2. Outputs: Achievement, project completion. This level of indicator describes to what extent
the intended goods have been delivered by the respective sector agencies, or the degree to
which a project has been completed, or the services are ready for service delivery.
• Example: New health centre in the county constructed, equipped, staffed, and operational
3. Outcomes: Utilisation / User satisfaction. This level assesses to what extent the sector services
have actually been used, with users adopting, appreciating or expressing satisfaction with the
services provided.
• Example: Number of assisted deliveries in the health centre
4. Impacts: Benefit. This level assesses the benefits received by the target population from use of
the outputs. Direct benefits are as experienced and assessed by beneficiaries. Indirect bene-
fits are typically measured as statistical changes in highly aggregated development indicators
which can still be attributed to the improvements in the services provided, and the utilisation
of these services by the people in the county. For this level it may only be practical to get data
aggregated at county level, not by individual PSDU or project.
• Example: Change in maternal mortality rate in the county
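To bring the four levels together in one place, the health-centre example used above can be arranged as a small results chain; this is an illustrative sketch only, and the field names and target values are assumptions.

    # Illustrative sketch of the four logical levels for the health-centre example
    # (read from the bottom up, as in the A10 project sheet). Assumed structure and targets.

    results_chain = {
        "inputs":   {"indicator": "% of allocated construction budget disbursed and spent",
                     "target": "100% by end of the financial year"},
        "outputs":  {"indicator": "New health centre constructed, equipped, staffed and operational",
                     "target": "1 centre operational by Q4"},
        "outcomes": {"indicator": "Number of assisted deliveries in the health centre",
                     "target": "60 per quarter"},
        "impacts":  {"indicator": "Change in maternal mortality rate in the county",
                     "target": "measured annually at county level"},
    }

    for level in ["inputs", "outputs", "outcomes", "impacts"]:  # bottom-up order
        entry = results_chain[level]
        print(f"{level.title()}: {entry['indicator']} (target: {entry['target']})")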
A11 M&E Reporting Sheet Aggregated by (Sub-)Sector/Project Type
Fill in this data sheet for each type of project or service (sub-sector), aggregated from the individual
reporting sheets. During implementation, weekly or monthly status updates should be recorded.
The impact level indicators should be collected from KNBS annually.
Outcomes
- Utilisation of services / infrastructure: Number of patients treated per month
- Appreciation of services / infrastructure (as assessed by citizens): % of patients who rate the services received as "good" or "very good"

Outputs
- Physical structure completed: No. of rural health centres established and operational
- Services in place (or advise/training provided, or campaign done): No. of rural health centres fully equipped and staffed with at least 1 doctor, 3 nurses

Inputs
- Financial: Approved Budget (in Ksh)
- Personnel: % of approved budget received
Funds (KSHs) spent in the reporting period:
a) Total (KSHs):
b) Broken down on major funding sources (KSHs):
Value of audit qualifications (in '000 Ksh)

Additional Comments
Priority Actions Assigned to: Due by: Status
Columns: SMART action required; Person Responsible for action (named user of system); Date (mm/dd/yyyy); Status (Initial, Accepted, Not Agreed, Not Now, Off Track, On Track, Revoked, Closed).
A12 Targets in Project Sheets and Results Matrix
Setting targets
Define realistic objectives and targets, in terms of quantity and time, for the indicators, projects and services listed. Start with the high-priority, high-impact projects.
Stakeholder participation
Targets must be set based on the developmental needs of communities. Remember that there may be quick-win policy changes, tasks or projects with a low cost and a high impact for stakeholders. Use the consultation process to identify quick-win projects and ensure they are built into the sector plans and the objectives of Chief Officers and their Efficiency Officers. Use Appendix A5 to ensure that participation covers all stakeholder groups adequately, and Appendix A6 to ensure that participation is an ongoing process built into county governance.
Political leaders must provide clear direction as to the importance of the target and how it will address the public need, while county employees must advise on what a realistic and achievable target is, given the available resources, capacity and challenges.
Managers, after consulting operational staff, must advise on seasonal changes and other externalities to be
considered in the process of target setting. By finalising the Annual Development Plan, the county makes a
commitment to achieve these targets within agreed time-frames and to notify all stakeholders of the targets
and time-frames. Since this commitment is cascaded down to the individual level through performance agree-
ments, directorate performance plans and individual performance plans, it is critical that all staff be involved
in the target setting process.
A13 County Governments Administrative Structure Based on Officers
The structure runs from the highest level downwards:
• The Governor
• Deputy Governor
• Heads of National Government Functions and Officers from the National Police Service
• Sub-County Administrators
• Ward Administrators
• Village Administrators
• Village Council
© 2012 Gabriel Lubale.
A14 Key Reports to be Prepared at County Level
Each entry below shows the report, its frequency, its audience and the day of the month by which it is due; the preparing committee is shown where stated:
• … and Evaluation Report; Departments (Internal use); due by the 7th
• Sub-County Annual Public Expenditure Review (SCAPER); annual; TOC, Citizen, County Departments (Internal use); due by the 7th
• SMEC: Sector Monitoring and Evaluation Report (SMER); quarterly; TOC, Sub-County/Ward/Village Departments (Internal use), Citizen; due by the 7th
• Sector Monitoring and Evaluation Report (SMER); annual; TOC, Sub-County/Ward/Village Departments (Internal use), Citizen; due by the 7th
• Sector Public Expenditure Reports (SPER); annual; TOC, Sub-County/Ward/Village Departments (Internal use), Citizen; due by the 7th
• WaMEC: Ward Monitoring and Evaluation Report (WaMER); quarterly; Sub-County/Departments (Internal use), Citizen; due by the 1st
• WaMEC: Ward Monitoring and Evaluation Report (WaMER); annual; Sub-County/Departments (Internal use), Citizen; due by the 1st
• ViMEC: Village Monitoring and Evaluation Report (ViMER); monthly; Ward/Departments (Internal use), Citizen; due by the 23rd of the previous month
A15 NIMES Operational Arrangements
1. The operational arrangement for implementation, coordination and reporting of NIMES re-
sults at national level consists of two guiding committees and five Technical Advisory Groups
(TAGs). The committees and the five TAGs are as given below.
CIMES Guidelines and the National Capacity Building Framework
These County Integrated Monitoring and Evaluation System (CIMES) Guidelines were developed as part of the Government of Kenya's (GoK) National Capacity Building Framework (NCBF). The NCBF provides a mechanism for facilitating and coordinating capacity building initiatives and a basis for monitoring and evaluating capacity development for devolution.
Through the NCBF and GoK strategies, national and county governments, development partners and other stakeholders will align and guide capacity building efforts, leveraging ongoing capacity building initiatives and mobilizing new resources around the devolution agenda.
It is through this framework that county staff will upgrade the skills and competencies they need to perform their responsibilities, enhance service delivery, build structures and systems that promote sustainable social and economic development, and strengthen the management of financial and human resources, county institutions, and community and stakeholder participation.