
betterevaluation.org

Collaborative Outcomes Reporting


The steward for this approach is Jess Dart, from Clear Horizon,
the creator of Collaborative Outcomes Reporting (COR). This
description of the approach has been authored by Jess and
Megan Roberts.

Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation based around a performance story that presents evidence of how a program has contributed to outcomes and impacts, and that is then reviewed by both technical experts and program stakeholders, possibly including community members.

Developed by Jess Dart, COR combines contribution analysis and Multiple Lines and Levels of Evidence (MLLE), mapping existing data and additional data against the program logic to produce a performance story. A performance story report is essentially a short report about how a program contributed to outcomes. Although these reports may vary in content and format, most are short, mention program context and aims, relate to a plausible results chain, and are backed by empirical evidence (Dart and Mayne, 2005). The aim is to tell the ‘story’ of a program’s performance using multiple lines of evidence.

COR adds processes of review by an expert panel and stakeholders, sometimes including community members, to check the credibility of the evidence about what impacts have occurred and the extent to which these can be credibly attributed to the intervention. It is these review processes, the outcomes panel (a type of expert panel review) and the summit workshop (a collaborative approach to developing outcomes), that differentiate COR from other approaches to outcome and impact evaluation.

Steps
COR uses a mixed-methods approach that involves the participation of key stakeholders, generally in six process steps. Participation can occur at all stages of this process.

[Process diagram: 1. Scoping (inception and planning workshop) → 2. Data trawling → 3. Social inquiry → 4. Data analysis and integration → 5. Outcomes panel → 6. Summit workshop]
1. Scoping

An inception/planning workshop is held. In this workshop, the program logic is clarified, existing data are identified, and evaluation questions are developed.

2. Data trawl

This step can include both primary and secondary data sources. Generally, a data trawl of existing evidence is undertaken. Program staff may be enlisted to help with the collation of data.

3. Social inquiry

Social inquiry can include any form of data gathering, qualitative or quantitative. If qualitative methods are used, interviews can be conducted by volunteers who are given a short training session in interviewing and an interview guide. This is a very effective way to involve staff in the data where there is sufficient enthusiasm around the process. Otherwise, consultants or the evaluation managers conduct all or a proportion of the interviews. In many COR examples, the Most Significant Change (MSC) technique is used at some point in the social inquiry process as a way of capturing stories of change, both expected and unexpected.

4. Data analysis and integration

Quantitative and qualitative data can be analysed together according to the outcomes in the
program logic. A “results chart” is often used to integrate different sets and types of data.

5. Outcomes panel

People with relevant scientific, technical, or sectoral knowledge are brought together and presented with the range of evidence compiled in step 4. They are then asked to assess the contribution of the intervention towards its goals, given the available knowledge, and to explore rival hypotheses that could explain the data. The outcomes panel can be substituted with a citizens’ jury.

6. Summit workshop

At a large workshop, key findings and recommendations are synthesised, and examples of changes are identified and added (using material from MSC if available, and MSC processes to select the most significant stories). The summit should involve broad participation of key stakeholders such as program staff and community members.

Collaborative Outcomes Report structure

The report aims to explore and report the extent to which a program has contributed to
outcomes. Under COR, reports are short and generally structured in terms of the following
sections:

• A narrative section explaining the program context and rationale.

• A ‘results chart’ (See FAQs) summarising the achievements of a program against a program logic model.

• A narrative section describing the implications of the results, e.g. the achievements (expected and unexpected), the issues and the recommendations.

• A section which provides a number of ‘vignettes’ that provide instances of significant change, usually first-person narratives.

• An index providing more detail on the sources of evidence.

COR can be applied across multiple sectors or scales of evaluation. The approach can be particularly useful when the evaluation does not have well-defined outcomes at inception, or when outcomes are emergent, complicated or complex. It has been used in a wide range of sectors, including overseas development, community health, and Indigenous education, but the majority of work has occurred in the natural resource management sector, with the Australian Government funding 20 pilot studies in 2007-09.

Mapping the approach to the Rainbow Framework

MANAGE an evaluation or evaluation system

Develop evaluation capacity

Supervised practice in teams: COR can help build evaluation capacity because it has a specific
mandate for involving project teams and staff in the social inquiry or data gathering phase.

DEFINE what is to be evaluated

Develop program theory / logic model

Outcomes hierarchy: a key element in the COR approach is the development of a program
logic. This is used to map data and results and to tell the ‘performance story’ of the evaluation.
Generally an outcomes hierarchy has been used in the examples below.
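
As a purely illustrative sketch (the program and outcome statements below are invented for this description, not drawn from the COR examples), an outcomes hierarchy for a hypothetical riparian restoration program might read from foundational activities at the bottom up to the ultimate outcome at the top:

Ultimate outcome: water quality in the catchment improves
Intermediate outcome: landholders adopt riparian fencing and revegetation
Immediate outcome: landholders gain the skills, confidence and motivation to manage riparian land
Outputs: training workshops delivered; incentive grants distributed
Foundational activities: partnerships formed with landcare groups; landholders engaged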

DESCRIBE activities, outcomes, impacts and context

Collect/retrieve data

Project records: the data trawl step draws on project records and any other existing
monitoring data to inform social inquiry.

Combine qualitative and quantitative data

Integrated design: COR is well suited to combining multiple types of data, such as qualitative and quantitative. This is partially due to COR’s foundations in MLLE, but also because COR uses expert review and the summit process to weave those data types together coherently and meaningfully, mapped against a program logic.

UNDERSTAND CAUSES of outcomes and impacts

Check the results support causal attribution

A variety of options can be used to look at the patterns of results to see if they match what
would be expected if the intervention were producing the observed outcomes.

Investigate possible alternative explanations

By engaging diverse experts and other stakeholders in the process there is increased
opportunity for alternative explanations of the outcomes to be identified and investigated.

SYNTHESISE data from one or more evaluations

Synthesise data from a single evaluation

Expert panel: the outcomes panel engages content experts to individually undertake a
synthesis of evidence.

Consensus conference: the summit workshop provides an opportunity for diverse stakeholders
to come together to synthesise the evidence into an overall judgement.

Advice for CHOOSING this option


When reports are needed that provide both brief and clear messages and an easy audit trail
to the evidence that substantiates these claims

Organisations often place a high value on the reports because they strike a good balance
between depth of information and brevity and are easy for staff and stakeholders to
understand. They help build a credible case that a contribution has been made.

When there is a desire to include program staff and other stakeholders in the process and develop their evaluation capacity: the participatory process by which reports are developed offers many opportunities for staff and stakeholder capacity building.

They are a great way to kick off a new monitoring and evaluation system, because they
involve synthesising and reflecting on all existing data and data gaps (a great platform to think
about what data is really needed!). A COR process can be valuable for garnering buy-in and
ownership for a program or evaluation process.

When the evaluation is intended to answer questions about the extent to which an investment contributes to outcomes

COR mainly focuses on answering this type of evaluation question, and therefore should not be seen as the only reporting tool. Other methods and approaches are needed to answer key evaluation questions about the appropriateness of the investment or intervention, or about its cost-effectiveness. COR is not designed to address these questions. They may be added into this methodology in the future, but are not covered by this approach as it currently stands.

When stakeholders’ values will be used as the evaluative criteria

COR is based on the premise that the values of program staff and key stakeholders are of the highest importance in an evaluation. The evaluators attempt to “bracket off” their own opinions and instead present a series of data summaries to panel and summit participants for them to analyse and interpret. Values are surfaced and debated throughout the process. Participants debate the value and significance of data sources and come to agreement on the key findings of the evaluation.

When a program has emergent or complex outcomes that are not fully defined at the outset

For this reason, the program logic is refreshed at the start of the evaluation process. In addition, qualitative inquiry is used to capture unexpected outcomes, and deliberative processes are used to make sense of the findings.

Advice for USING this option (tips and traps)


Including stakeholders needs delicate management and careful facilitation

Bringing stakeholders in at the wrong time can raise problems for the evaluation, particularly in the summit phase. It is important to ensure that the voices of all stakeholders who will be at the summit workshop, or who will have a hand in reviewing the report, are included in the data collection phase. This can be done either through social inquiry or through data trawling and document review. If they aren’t included, the summit workshop can become focused on data collection rather than on its real purpose: to ground-truth the findings and formulate recommendations.

Focus the key evaluation questions carefully

Since no evaluation can answer all of the questions the program’s stakeholders may ask, it is critical to prioritise questions in the scoping phase. All participants should be clear about what is being evaluated and what the evaluation will focus on.

Managing the outcomes panel needs consideration of conflicts of interest

Assembling the experts in the outcomes panel also requires careful consideration, to avoid conflicts of interest. While experts may inevitably be connected to the program in some way, given the nature of the evaluand, it is important for neutrality that they neither designed the program nor were part of it. Furthermore, in highly political evaluands, agreement on outcomes from a panel of experts may be fraught. In this case, an expert evaluation may be a more appropriate approach than COR.

Ensure that the performance story and data collection do not only focus on positive aspects
of performance

CORs have been criticised for being too appreciative, or for being incapable of telling a bad story. While this is certainly a risk, the technique does attempt to address it in a number of ways. Firstly, all informants are asked to describe the strengths and the weaknesses of the program, and these weaknesses or issues are documented in the report. Secondly, the outcomes panel is encouraged to report on negative as well as positive trends in the outcomes. So the “negatives” are not avoided in this process. However, the choice of topic for an outcomes report is often purposeful rather than random.

COR can be done on a shoe-string budget: It’s possible to reduce the cost by making the
following changes to the process

Combine the inception meeting with the planning meeting; conduct fewer interviews; limit the time for the data trawl; combine the outcomes panel with the summit; use only secondary data; have the commissioning organisation take responsibility for producing the final report. However, the one thing that is absolutely pivotal is the summit workshop. The summit is the space for tying the evaluation together and weaving the perspectives together. If managed effectively, the summit ground-truths the findings and the group develops recommendations. This is a core element of what sets COR apart from other approaches to outcome and impact evaluation.

If the program already has a strong monitoring, evaluation, reporting and improvement
framework and process

You may not need all the elements of COR. If there is a clear program logic in place and a comprehensive monitoring system, you may only need to add some components of the COR process, such as the outcomes panel and summit workshop. A COR process in its entirety is well suited to situations where monitoring data has been collected ad hoc or no initial program logic has been developed and tested.

FAQ
Where did COR come from?

COR was inspired by John Mayne’s work on contribution analysis. Mayne articulated the need to report against program logic and called this a performance story report (Dart and Mayne, 2005). COR took this concept and extended it into a stepped-out process and a specific reporting product, making parts of the process intentionally participatory and drawing on MLLE.

What is a results chart?

A results chart can tell the story of change from a program through lines of evidence
generated from the data trawl and social inquiry process. It is a table that plots activities,
outputs and outcomes next to evidence collected.
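
As a purely hypothetical illustration (the program, results and evidence below are invented, not taken from an actual COR report), a fragment of a results chart might look like this:

Level in program logic | Expected result | Evidence collected | Source
Activity | Riparian training workshops delivered | Attendance records for 12 workshops | Data trawl (project records)
Output | 40 km of stream bank fenced | Grant acquittal reports | Data trawl (secondary data)
Outcome | Landholders change grazing practices | Interviewees describe excluding stock from waterways | Social inquiry (MSC interviews)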

Resources
Report on outcomes and get everyone involved: The Participatory Performance Story Reporting Technique

This paper outlines the background and philosophy of COR and PSR.

Developing a Performance Story Report

A nuts and bolts guide to developing a COR/PSR, produced by the Australian Government and Jess Dart. Includes practical tips, step-by-step process guides and definitions of key concepts.

‘Performance Story’

Description of ‘Performance Story’ by Jess Dart and John Mayne in the ‘Encyclopaedia of Evaluation’.

Find more resources on Collaborative Outcomes Reporting online:

betterevaluation.org/plan/approach/cort

A special thanks to our contributors


Author and steward: Jess Dart, Managing Director, Clear Horizon Consulting Pty Ltd., Melbourne, Australia.
Author: Megan Roberts, Research Assistant, Clear Horizon Consulting Pty Ltd., Melbourne, Australia.
Reviewer: Patricia Rogers, Professor of Public Sector Evaluation, RMIT University, Melbourne, Australia.

BetterEvaluation is an international collaboration to improve evaluation by sharing information about methods, approaches and options. BetterEvaluation was founded by RMIT University, Pact, Institutional Learning and Change (ILAC), and Overseas Development Institute (ODI).

Financial support has been provided by the Rockefeller Foundation, the International Fund for Agricultural Development (IFAD), the Department of Foreign Affairs and Trade (DFAT), the Ministry of Foreign Affairs of the Netherlands, and the International Development Research Centre (IDRC).

You may use this document under the terms of the Creative Commons Attribution-Non
Commercial Unported licence available at http://creativecommons.org/licenses/by-nc/3.0/.

Citation: Dart, J., & Roberts, M. (2014) Collaborative Outcomes Reporting. BetterEvaluation.
Retrieved from http://betterevaluation.org/plan/approach/cort
