Switchboard M&E Glossary


Monitoring and Evaluation Glossary [1]

Analysis and action planning meetings: Regular meetings that should be held throughout the project cycle to review monitoring data, assess project progress and implementation processes, identify trends, determine the implications of findings, and develop action plans to respond or shift course as needed.
Data: Observations, measurements, facts, or pieces of information.
Data analysis: The process of interpreting and transforming raw data into information that is usable and/or that advances knowledge or understanding.
Data collection: The systematic gathering of data necessary for designing, implementing, and monitoring projects.
Data element: The smallest specified unit of data that conveys meaningful information.
Data management: The process of compiling, storing, and protecting data. Data management processes should ensure the protection, accessibility, reliability, integrity, and timeliness of data.
Data quality audit/assessment: An assessment of the validity, reliability, precision, integrity, and timeliness of data that is collected and reported. It enables managers to identify and respond to data quality issues and determines the extent to which existing data can be trusted and used to influence project implementation decisions.
Data-driven: An adjective describing activities, processes, or decisions that are shaped or influenced by the analysis of available data as opposed to personal experience or opinion.
Data-driven programs: Programs where data shapes program design and informs decision-making throughout the life of the program.
Document review: A review of existing documents or records, such as project records and reports, policies or standard operating procedures, written correspondence, photos, videos, etc.
Evaluation: The process of collecting and analyzing information, typically once or twice during a project (after a completed project or project phase), to assess a project or program’s outcomes and the factors that influenced results.
Evaluation questions: The high-level questions an evaluation project is designed to answer (also called key evaluation questions or KEQs). KEQs are not the individual questions asked during data collection, such as those used in a survey questionnaire, interview, or focus group discussion. Instead, KEQs define what the project team seeks to learn and serve to guide the entire evaluation process. [2]
Evidence: The available body of facts or information that furnishes proof or ascertains truth or validity. To determine with a high level of certainty that a measured or observed effect can be attributed to a specific cause or intervention (impact), quantitative causal evidence, generated by high-quality research studies, is typically required.

[1] This glossary has been adapted in part from the International Rescue Committee’s 2015 Monitoring for Action Glossary and Outcomes and Evidence Framework and Measurement Glossary, and from the Monitoring and Evaluation Technical Assistance (META) Project’s Monitoring and Evaluation Glossary and other project materials. Updated March 2019.
[2] Adapted from BetterEvaluation, Specify the Key Evaluation Questions (KEQs), 2016.
Evidence-based programs: Programs where existing evidence has been used to inform the theory of change and implementation.
Evidence-generating programs: Programs accompanied by research to build a body of evidence about an intervention’s outcomes, cost-effectiveness, cost-efficiency, and/or implementation processes.
Focus group discussion: A data collection method that involves a structured interview with a small group of 6-12 people. A moderator asks respondents both standardized and follow-up questions to collect information about their experiences, feelings, and preferences.
Goal: A statement of the overall improved situation the project will contribute to, but not achieve on its own.
Impact: The measured effect of an intervention on the outcome for the client population; changes in an outcome that are attributable to the intervention. [3]
Indicator: A quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement or to reflect the changes connected to an intervention; [4] a variable that represents a valid measure of change.
Indicator matrix (also referred to as M&E framework or measurement framework): A table that builds upon a project’s logframe to specify targets and milestones and provide key detail related to the calculation, purpose, and use of indicators and the disaggregation, source, and frequency of data collection.
Individual in-depth interviews: A data collection method consisting of one-on-one conversations with specific individuals who have knowledge about a topic of interest. These interviews typically use an open-ended (semi-structured) format, allowing the interviewer to ask follow-up questions and probes to pursue topics in depth.
Informed consent: The process by which an individual makes a voluntary decision to participate in a research project or evaluation based on a genuine understanding of their ability to opt out; what data will be collected and how; how their data will be used, shared, and protected; and potential risks.
Input: A resource provided for program implementation. Examples include money, staff, time, facilities, and equipment. [5]
Internal project evaluation/learning review: A process of reflection (via meeting, workshop, or other form of consultation) at key points in the project (end of year, end of funding period, or end of project) to reflect on successes and challenges and capture project learning to inform the program and/or future programs and refine programmatic best practice.
Knowledge management: Capturing, developing, sharing, and effectively using organizational knowledge.
Logical framework (also referred to as logframe): A table or matrix that summarizes the key elements of a project strategy: the project objective, intended outcomes, planned outputs, and major activities. It outlines indicators that will be used to measure progress, the source of data, and assumptions necessary for project success. A logframe is a type of logic model.
Logic model: A graphic representation that defines all building blocks required to bring about a given long-term goal. The change process depicted in a logic model (whether implicit or explicit) should be possible to articulate using “if… then” statements.

[3] Adapted from 3ie Impact Evaluation Glossary.
[4] Ibid.
[5] Adapted from USAID Glossary of Evaluation Terms.

Both theories of change and logframes can be considered types of logic models, which may contribute to the fact that the term logic model is often used interchangeably with both terms. The term theory of change refers specifically to a type of logic model that explicitly illustrates (often with arrows) the causal pathways between activities, outputs, outcomes, and objectives. In a logframe, which presents program objectives, outcomes, outputs, activities, indicators, data sources, and assumptions in a table or matrix, causal pathways are implicit. Illustrating a theory of change before developing a logframe can help to make sure that the causal pathways implicit in a logframe are based on sound logic.
M&E plan: A narrative document that serves as a planning tool and reference for project staff for how to go about monitoring and evaluation throughout the life of their project. It outlines what will be monitored and evaluated and why, how data will be collected and protected, how data will be analyzed and shared, the staff responsible, and other resources needed.
Monitoring: The process of regularly and systematically collecting and analyzing information about a project and, when appropriate, using it to make adjustments to the project. Project monitoring data may be used to adjust project implementation, enable internal and external reporting, inform project design and advocacy, and promote accountability to clients.
M&E workplan: A spreadsheet that lists all M&E-related activities, including all data collection activities, data verification activities, analysis and action planning meetings, staff training on data collection tools and data management, reporting activities, and end-of-project evaluations or learning reviews. It describes the staff responsible for each activity, when the activity will be conducted and with what frequency, and what tools will be used. The M&E workplan is a management tool used on an ongoing basis throughout the project lifecycle to ensure that M&E activities are conducted and/or that changes to planned M&E activities are documented.
Needs assessment: An assessment conducted to identify the priority needs of the target population in order to design and deliver timely and effective services. Needs assessments take many forms, but typically involve the systematic collection of the following information: an estimate of how many people are affected; safety risks; priority locations; priority needs according to the target population; gaps in service availability or access; cultural concerns or contextual factors that may influence needs, vulnerability, service delivery, or access; and best methods of service delivery. Where appropriate, data should be disaggregated by gender, age, and other characteristics that may influence needs or access to services.
Objective: A statement of the condition or state one expects to achieve; the part of the overall purpose or goal that the project will achieve.
Observation: A technique for watching an event, process, or place to gather information about it. It may be structured or unstructured. The observer uses a form, checklist, or other tool to guide what he or she is looking for, documents what he or she sees, and analyzes the notes.
Outcomes: The planned or achieved results of an intervention’s outputs; changes that contribute to the project’s overall objective.
Outputs: The products, goods, services, and immediate results produced directly by the project and that are required for achievement of the project’s outcomes.

Process/implementation monitoring: The systematic collection and analysis of data to determine whether a project is being implemented as planned and is reaching the intended population with the intended services, and the use of that data to steer or shape project implementation processes.
Progress monitoring: The systematic collection and analysis of data to assess progress in meeting project goals.
Qualitative information: Information that explains what is being studied with words (such as observations, descriptions, and perceptions). It can often be used to answer broad questions like “why?”, “how?”, and “under what circumstances?” It can often explore opinions, feelings, and priorities and help explain behaviors or beliefs, offering depth of understanding on a given topic. [6]
Quantitative information: Information that measures what is being studied with numbers (such as counts, ratios, percentages, and proportions). It can often answer narrowly defined questions and can help to explain the nature, size, frequency, and distribution of a problem. [7]
Research: A systematic investigation conducted to contribute to or fill a gap in evidence or generalizable knowledge. Research is intended to test a theory or hypothesis.
Sample: A subset of a whole population selected to study and inform conclusions about the population as a whole.
SMART: The SMART criteria are well accepted in the field of M&E as standards against which to measure the quality of an indicator. The letters of the acronym have been defined in different ways. Common terms used when explaining the SMART criteria include: Specific; Measurable; Attainable, Appropriate, or Attributable; Relevant, Realistic, or Reliable; and Timebound.
Stakeholder: A person or group with an interest in the project or program who may affect, or be positively or negatively affected by, the project or program’s implementation and outcomes.
Survey: The systematic collection of information from a defined population. A survey may be self-administered (respondents complete a paper-and-pencil or electronic questionnaire) or enumerated (administered through in-person or phone interviews conducted by someone trained to record responses). [8]
Theory of change: A type of logic model that defines all building blocks required to bring about a given long-term objective and provides a graphic representation of the change process, explicitly illustrating the causal pathways between activities, outputs, outcomes, and objectives.
Validity: The extent to which the data collection strategies and instruments measure what they purport to measure.

The IRC received $1,194,063 through competitive funding through the U.S. Department of Health and Human Services, Administration
for Children and Families, Grant # 90RB0052. The project will be financed with 100% of Federal funds and 0% by non-governmental
sources. The contents of this document are solely the responsibility of the authors and do not necessarily represent the official views
of the U.S. Department of Health and Human Services, Administration for Children and Families.

[6] Adapted from International Federation of Red Cross and Red Crescent Societies Project/Programme Monitoring and Evaluation (M&E) Guide, 2011.
[7] Ibid.
[8] Ibid.
