
Defining and Measuring Success

In Canadian Public Sector


Electronic Service Delivery

Issue Paper Prepared for


Discussion and Debate at Lac Carling V

April 24, 2001

Professor Jerry Mechling


Charles Vincent
Defining and Measuring Success in Canadian Public Sector ESD

In e-government, performance measures should define progress and success, giving the
public reasonable assurance that the system is accountable to its customers. The
measures should be developed with stakeholders’ input and should be clearly
documented and well promoted. Without finite measurements and stakeholder input and
consensus of what those measurements will be, government risks having the project
assessed by arbitrary measurement standards.
NECCC report e-Government Strategic Planning: A White Paper1

As Lac Carling V approaches, the public profile of e-government and electronic service
delivery (ESD) has never been higher. Long the domain of technology-oriented policy
wonks, ESD today finds public officials from around the world racing to interact with
citizens and businesses through electronic channels.

How are we in the Canadian public sector doing in this race? Are we doing as well as or
better than others – as Accenture suggests2 – or are we behind? Are we close to our
goals? What are those goals – and what should they be? Based on what we have learned
so far about the relationship between our actions and their results, what should we do
next?

These are important questions. To answer them, we need measures from points
throughout the complex value chain of inputs, processes, outputs, and outcomes that
governments – interacting with other institutions – use to produce and deliver electronic
services. We need measures of the social values affected by the ESD value chain –
values such as service effectiveness and efficiency, personal privacy, systems security,
community cohesiveness and equity, our capacity for democratic governance, and the
legitimacy of our governmental systems.

In short, we need to make good decisions about what to measure, how to measure, and
how to interpret those measures.

This paper addresses the measurement challenge. Our purpose here is not to assess
whether Canada is succeeding with ESD, to gauge how different jurisdictions in Canada
compare with one another, or to judge how Canada compares to other jurisdictions.
Instead, this paper is written as the starting point for a discussion about how we can and
should define and measure progress on the ESD agenda. In setting the stage, this paper
begins by suggesting a fairly broad set of criteria for defining ESD success – arguing that
success means much more than simply achieving a set of targets or milestones. After a
survey of how different jurisdictions around the world are currently measuring progress
in ESD, the paper highlights “measurement gaps” between what is being measured today
and what ideally should be measured in assessing ESD success. The paper concludes by
exploring some practical ways governments might fill these gaps, elaborating and
evaluating options and presenting some recommendations for discussion and debate at
Lac Carling V.

Growing Interest in Electronic Service Delivery

Given Canada’s status as one of the most connected nations in the world (six in ten
Canadians report being online3 and almost one in two report Internet access in the
home4), the demand for ESD in Canada is not surprising. Research on e-government in
Canada from Ekos Research Associates5 suggests that more than half of Canada’s
Internet users have visited at least one government web site in the past three months.
Moreover, six in ten Canadian Internet users believe the Internet is an effective way for
governments to communicate with citizens about programs and services. Public opinion
is especially favorable toward the portal or “one-window” concept, where almost seven
in ten Canadian Internet users believe citizens should be able to apply for programs and
services from different levels of government through a single web site. Research from
Ipsos-Reid reveals a similar story of support and demand for interacting with government
online. According to a poll conducted in the spring of 2001,6 almost four in ten Canadian
Internet users believe that in the next five years the majority of their dealings with
government will be conducted over the Internet – and this number rises to almost sixty
percent among young Canadians.

This narrative does not suggest that the road to e-government and ESD is paved with
gold. Public opinion research from Erin Research found that on a scale of 0 to 100,
respondents rated the ease of access to services delivered over the Internet at 62 – well
below “walk-in” service and equivalent to service by mail.7 Research also shows that
there are still significant differences in Internet usage based on age, education, income,
and to a lesser extent, gender.8 As the number of Canadians with high-speed access to
the Internet continues to grow,9 however, the demand for ESD is not likely to wane.

In the face of growing demand, Canadian governments have been busy working to offer
services through a variety of electronic channels, including interactive voice response
telephony (IVR), the Web, and electronic kiosks. At the federal level, the Government of
Canada is actively working toward a goal of having all government information and
services online by 2004, and has established the Government Online (GOL) initiative to
help drive toward this goal. Building on a strong ESD foundation that saw 5.7 million
tax returns filed using EFILE or TELEFILE in 2000 and 350,000 jobs posted on the
government’s online Job Bank10, early GOL activities include a user-focused redesign of
the main Government of Canada portal, building of a common IM/IT infrastructure, and
funding of a series of “Pathfinder” projects to accelerate learning.

At the provincial and territorial level, a new focus on ESD is also becoming evident.
Every province and territory in Canada now has an office or agency responsible for
delivering services through new electronic channels.11 Appendix A highlights some of
the most interesting ESD activities of the past year: Newfoundland’s motor vehicle
payment transaction engine, integrated with both the government’s and the Royal Bank’s
backend systems; Prince Edward Island’s Telehospice project, which connects the homes
of terminally ill patients to a nursing station at a local hospital and monitors vital data so
patients can spend more time at home during their last days; Manitoba’s day care
calculator, which will soon link parents to a database of available day care spaces; and
British Columbia’s OneStop business registration, which is integrating municipal and
provincial efforts.

Far from sitting idly by, some of Canada’s municipalities are also working to advance
their ESD agendas. On January 1, 2001 the city of Ottawa, with the help of a private
consortium led by Deloitte Consulting and NIC, launched a brand new web portal,
complete with eight electronic services including the ability to pay parking tickets online
and register for municipal recreational activities. In Edmonton, the government
redesigned its web site to make it more user friendly and intuitive. It is also in the
process of putting all government forms online – including some that can be submitted
directly through the web site. In British Columbia, municipalities across the province are
working through a not-for-profit partnership to develop a shared platform or set of
components for ESD. In conjunction with the federal government’s Smart Communities
initiative, governments in Yellowknife, Charlottetown, and ten other communities are
advancing into the electronic age.

Less visible than these web-based applications, Canadian governments at all levels have
been actively working to establish the policies, legal frameworks, and technology
infrastructures that will make electronic service delivery possible. For example,
information policies that cover privacy, access to information, and security, legislation to
enable digital signatures and protect personal information, and technology infrastructures
to support secure transactions are all on the agenda.

With all this apparent activity, it is appropriate for us to ask the questions: Are we
succeeding in advancing an ESD agenda, and if we are, how do we know? Are we
delivering the right services at the right pace? What measures are important in charting
progress? These and a myriad of related questions are floating around without any
definitive answers.

Conceptual Framework – Control in Complex Systems

How government pursues the goal of ESD is, at its heart, a public policy decision. In this
context, it is useful to recall Stokey and Zeckhauser’s12 classic five-step analytic
approach to public decision making: establish the context, lay out the alternatives, predict
the consequences, value the outcomes, and make a choice. Sound public decision-
making requires the policy maker to understand the consequences of different alternatives
in a given situation and make a choice based on a set of social preferences.

Inherent in this analytic approach is a definition of success. Success is achieved when the
policy choice taken produces an outcome that maximizes the welfare of society, where
the welfare of society is defined broadly as the sum of each citizen’s individual welfare.
While conceptually straightforward, the maximization of social welfare in a complex
system is much more difficult to define or achieve in practice. First, how do we define
social welfare and what factors affect social welfare? For example, we know that in
pursuing ESD we want to improve the quality of service, ensure the privacy and security
of data, increase levels of customer satisfaction, and raise the public perception of
government. But, we also want to provide equitable access to services, integrate service
across delivery channels, improve the accuracy of data collection, and stimulate e-
commerce. Do these objectives represent all of the factors that affect social welfare?
Clearly no; even a cursory review of statements made by governments when launching
ESD initiatives reveals a myriad of perceived benefits – from accessibility to economic
development, the list is very long. Furthermore, the complex and often competing nature
of our values necessitates trade-offs among them. Aggregating measures of these
individual values into an overall measure of social value that would help in prioritizing
actions is not easy when individual preferences differ and are in conflict. Yet, if we are
to assess the success of ESD initiatives we must have some idea of what success means.
While a number of tools have evolved over the years to help with the challenge of
identifying social values, the practical exercise is necessarily an incomplete
approximation.
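The social-welfare objective sketched above can be written compactly in the standard welfare-economics form; this notation is a textbook convention, not something defined in the paper:

```latex
% Social welfare as the sum of individual welfares over n citizens,
% and the policy choice a* that maximizes it over the alternatives A:
W(a) = \sum_{i=1}^{n} u_i(a), \qquad a^{*} = \arg\max_{a \in A} W(a)
```

where u_i(a) denotes citizen i’s welfare under alternative a. The practical difficulties noted above arise precisely because the u_i are hard to observe, differ across citizens, and resist aggregation into a single W.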

Assuming we can identify the goals of ESD, the measurement problem that lies at the
heart of this paper is how we can assess our actions and progress toward these goals.
Figure 1 offers a very simple systems perspective of government actions in pursuit of
ESD. After analyzing possible action steps, government decides on the best course of
action and then acts. Taking stock of what was accomplished, government then measures
to see if those actions took us closer to our goals, starting the cycle again by analyzing
possible actions steps in light of measured results from the previous cycle.

Figure 1: Simplified Systems Perspective

[Cycle diagram: Analyze Possible Actions → Decide on Best Course of Action → Act →
Measure Progress Toward Goals → back to Analyze Possible Actions]

This model is obviously a simplified version of reality. In particular, it is important to
recognize that measurement of complex systems or value chains involves a great deal of
uncertainty – uncertainty about the cause and effect relationship between action and
result, uncertainty about the measurement instruments, and uncertainty about the scales
used for comparisons and tracking. Measurement of ESD is especially challenging since
the boundaries of the system or value chain are not neatly defined. Delivering services to
citizens through electronic channels involves complex and interdependent relationships
between different agencies, different jurisdictions, and different sectors (both public and
private). The fundamental measurement challenge remains, however, to use
measurement as a tool in managing systems – in controlling systems as best we can – so
that each action moves us closer to our goals.
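The cycle in Figure 1 can be sketched as a simple feedback loop. The toy model below is our illustration only: the numeric “gap to goal” and the candidate action sizes are invented stand-ins for real, multidimensional, uncertain ESD measurement.

```python
# Toy sketch of the analyze -> decide -> act -> measure cycle in Figure 1.
# The numeric "gap to goal" and the candidate step sizes are invented for
# illustration; real ESD measurement is multidimensional and uncertain.

def measurement_cycle(gap: float, candidate_steps: list[float], cycles: int) -> float:
    """Repeatedly choose the action that measurement suggests best closes the gap."""
    for _ in range(cycles):
        if gap <= 0:  # measure: goal reached, stop acting
            break
        # Analyze possible actions and decide on the best course:
        # pick the step that would leave the smallest remaining gap.
        best = min(candidate_steps, key=lambda s: abs(gap - s))
        gap -= best  # act; the new gap is the measured result fed into the next cycle
    return gap

# Starting 10 units from the goal, with three candidate action sizes:
print(measurement_cycle(10.0, [1.0, 2.0, 5.0], 3))  # 0.0
```

The point of the sketch is the loop structure, not the arithmetic: each pass uses the previous cycle’s measured result to choose the next action.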

While the complexity of the system makes measuring the impact of ESD challenging,
government cannot shy away from the measurement exercise. As the saying goes, “You
only get what you measure.” If government ignores factors that are difficult to measure,
it will not get an accurate assessment of the progress made.

What is Success in ESD? Understanding the Goals for ESD

How do we define success in ESD? What is the goal we are seeking to attain? Looking
at the ESD landscape in Canada and elsewhere it is clear that the overriding focus is on
making services available through new electronic channels in a timely manner so as to
encourage citizens to migrate from traditional delivery channels. What was once a mix
of pilot projects and experiments has taken a more organized form as governments from
Australia to Europe to North America are setting public targets in an attempt to get from
here (limited or first generation ESD) to there (full or at least robust second generation
ESD).

The earliest example of public targets came from the federal government in Australia. As
early as 1997, Prime Minister John Howard announced that the government “is
committed to… delivering all appropriate Commonwealth services electronically on the
Internet by 2001.”13

Since then, governments from the United Kingdom, to the Netherlands, to the United
States have set their own ESD targets (See Box 1 for a sample of ESD targets).

Governments in Canada have also begun setting public targets. In 1999, the Government
of Canada announced that, “By 2004, our goal is to be known around the world as the
government most connected to its citizens, with Canadians able to access all government
information and services online at the time and place of their choosing."14 Shortly
thereafter, the Ontario government announced that “By 2003, the Ontario government
will be a world leader in delivering services online” 15 and the Government of
Saskatchewan adopted milestones to make all forms available online by 2002 and 90% of
all transactions available through electronic channels by 2004. Similarly, the Manitoba
government notes on its web site that “By the year 2003, Manitoba’s Web portal may be
the preferred mode of interaction by our citizens.”16
Box 1: Sample of ESD Targets Set by Governments

Australia – To deliver all appropriate Commonwealth services electronically on the Internet by 2001
Canada – To access all government information and services online by 2004
European Union# – Member states to provide generalized electronic access to main basic public services by 2003
Finland* – To ensure a significant proportion of forms and requests can be dealt with electronically by 2001
France* – To provide public access to government services and documents by the end of 2000
Germany^ – To make all services available electronically by 2005
Ireland^ – To have all but the most complex of integrated services by the end of 2001
Japan* – To make all applications, registrations, and other administrative procedures between the people and the government available online using the Internet or other means by fiscal year 2003
Netherlands* – To have 25% of public services delivered electronically by 2002
Province of Ontario – To be a world leader in delivering services online by 2003
Province of Saskatchewan – To have all forms online by 2002 and 90% of transactions by 2004
Singapore* – Where feasible, to have all counter services available electronically by 2001
UK* – To have 100% of government services carried out electronically by 2005
USA* – To provide public access to government services and documents by 2003, and to provide the public with an option to submit forms electronically by 2003
Province of Victoria, Australia – To have all suitable government services online by the end of 2001

Sources: * Information Age Government: Benchmarking Electronic Service Delivery, UK, July 2000.
^ Europe’s Readiness for e-government, Kable, 2000.
# eEurope 2002: An Information Society for All, Council of the European Union, June 2000.
Note: All others from respective web sites.

Using targets or milestones as drivers for change in the public sector’s pursuit of ESD has
been widely recognized as a best practice and should be seriously considered by all
governments. Simply reaching these targets, however, does not adequately define
success. If we made every service available electronically but in doing so maintained the
existing agency silos, then many would argue that we failed. If we made every service
available electronically but only the young, the rich, and the educated used the services,
then many would argue that we failed. If we made every service available electronically
but decided not to work with any other jurisdictions or partners, then many would say
that we failed. ESD success requires government to do much more than simply reach a
target. Social welfare will not be maximized simply because government makes all
services available through electronic channels by a certain date.

Looking at public statements made by governments in launching ESD initiatives, it is
clear that the mere existence of electronic services is not the singular goal and does not,
in and of itself, define success. In fact, the list of objectives one can derive from these
public statements is extremely long. Breaking them into four categories, the objectives
include:

Better Service: Citizens have come to expect efficient services delivered when and where
they want them. A study by Momentum Research Group found that citizens perceive
convenience, speed, and time saving as the primary benefits of ESD.17 The majority of
Canadian Internet users (69%) view the Internet as simplifying access to government
programs and services, and seven in ten believe Canadians should be able to apply for
programs and services from different government departments through a single site.18
Most of all, however, citizens expect choice – choice in how and when they interact with
government, and governments are trying to respond.19 As Erin Research notes, “citizens
expect government services to be as good, if not better, than what they can get from the
private sector.”20

More Efficient Government: While rarely measured, the biggest driver for many
governments pursuing ESD is cost savings. With governments being asked to deliver

Defining and Measuring Success in Canadian Public Sector ESD Page 10


Professor Jerry Mechling and Charles Vincent April 24, 2001
more and better service while simultaneously cutting or freezing budgets, the bridge
connecting these two seemingly opposing forces is ESD. Self-service, increased
accuracy, fewer hand-offs, and shared databases are all supposed to help lower
transaction costs, increase productivity, and produce better government – all within a
secure records-management environment. At the same time, many governments believe
ESD will enable a more transparent and accountable government able to make
information available to citizens and more accurately measure performance.

Reinventing Government: Improving the performance of government is laudable, but for
many public sector officials the real value is in reinventing how government relates to
society – redefining the citizen’s relationship to government decision-making,
redesigning workflow across organizational boundaries, understanding government
service delivery from an enterprise perspective, building partnerships with other
governments to deliver seamless services in a way that makes organizational structure
irrelevant to citizens, and working with non-governmental organizations to explore new
and better ways to deliver public services. ESD offers the potential to reach each of these
objectives.

Benefits to Society: Aside from the direct benefits of delivering services through new
electronic channels, many governments also recognize that ESD can contribute toward
other government objectives. Examples cited by governments include improved
economic development, improved perception of and relevance of government, and more
citizens participating in the knowledge society.

Figure 2: Categorized List of ESD Goals/Objectives

Better Service: 24/7 availability; improved quality of interaction; convenient access to
services; efficient access to services; equitable access for all citizens; user friendliness;
consistency of service; choice of channels; meet public expectations for service.

More Efficient Government: Lower transaction costs; improved productivity; greater
accuracy; improved communication between staff; greater interoperability; greater
accountability; transparency; reduced duplication; more effective records management.

Reinventing Government: Working with governmental and non-governmental partners to
offer seamless service delivery; rethinking workflow to match citizen needs; redefining
organizational structure – including citizen relationship to governmental decision-making.

Benefits to Society: Stimulate e-commerce; economic growth; augment government
relevance; more Canadians using the web; build national community; support least
advantaged in society.

Two things are important to note about the objectives listed in Figure 2. First, while there
is overlap in the objectives cited by different governments, the goals of any policy maker
must be consistent with the social values of the governed. Therefore, this paper does not
attempt to define ESD success in universal terms, but rather it suggests some criteria for
success based on the experiences of leading jurisdictions, leaving specific choices to
those who must tailor their decisions to the conditions of particular jurisdictions and
constituencies. Second, and most important, any definition of success must clearly be
broader than simply achieving ESD targets and milestones; while targets and milestones
serve as important drivers, they do not wholly define success. Success is achieved when
the right services are available through the right channels in a way that maximizes social
welfare.

Measuring Success: How is ESD Being Measured Today?

With a broad understanding of what success might look like, how can we find out if
government is on the right track? This question lies at the root of why measurement is so
important. In order to reach its goals, an organization must understand where it is today
and what effect its current actions have on the pursuit of these goals. Measurement helps
clarify the direction an organization is headed with its current set of actions – is
government moving toward its goals? – and the direction an organization should move in
to reach its goals. Measurement also allows us to hold people accountable for progress.

Knowing that measurement is important, the challenge is to design a measurement
framework that will help government assess its progress and set a course. As a first
step, it is instructive to explore what measurement frameworks exist today for ESD in the
public sector. Broadly speaking we see six kinds of measurement and assessment
frameworks.

1) Stages of E-government

Probably the most popular measure of ESD success is to chart government’s position or
status against a defined timeline, a set of milestones, or a theoretical model charting the
path to ESD maturity. These targets represent a natural starting point to motivate action
and are a way to galvanize support.

For example, the Government of the United Kingdom produces regular Electronic
Service Delivery Reports that track the number of services to citizens or businesses that
will be ready for electronic delivery by the government deadline of 2005.21 Similarly, the
Government Online reporting framework in Australia requires all agencies to report on
their progress toward meeting the government’s target of having all services online by the
end of 2001.22 In both cases, the quantitative number is derived from reports submitted
by each ministry or agency.

Also pursuing a fixed date, the state government in Victoria, Australia, has set eight
intermediate targets to track progress en route to achieving its goal of having 100 percent
of services available online by 2001 (see Box 2). The accompanying reporting system is
a “traffic light” system in which agencies indicate whether they are “green” (on target),
“yellow” (at risk of falling behind), or “red” (behind schedule) for each target date and each
service. Included with each rating is a brief summary outlining recent activities and
explaining any problems. Other information captured in these reports includes the volume of
transactions online and budget information. Reports are made to cabinet quarterly.23
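The reporting scheme described above can be modelled as a small data structure. In the sketch below, the service names, ratings, and notes are invented for the example; only the green/yellow/red scheme comes from the text.

```python
# Illustrative model of a "traffic light" reporting system like Victoria's:
# each (agency, target) pair is rated green / yellow / red, with a brief note.
# Service names, ratings, and notes here are invented for this example.

from enum import Enum

class Status(Enum):
    GREEN = "on target"
    YELLOW = "falling behind"
    RED = "behind schedule"

report = {
    ("Motor Registry", "all transactions online"): (Status.GREEN, "Payment engine live."),
    ("Land Titles", "all public forms accessible"): (Status.RED, "Vendor delays."),
}

# Cabinet-style quarterly summary: count services at each status.
summary = {s: sum(1 for status, _ in report.values() if status is s) for s in Status}
print(summary[Status.GREEN], summary[Status.YELLOW], summary[Status.RED])  # 1 0 1
```

Keeping the rating and the explanatory note together, as the Victorian reports do, preserves the qualitative context that a bare count of red and green lights would lose.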
Box 2: Government of Victoria, Australia, Intermediate ESD Targets
Mid-1998
1) Complete a full audit of customer transactions and public information, and develop an
Electronic Service Delivery (ESD) strategy

31 December 1998
2) All Government tenders available on the Internet
3) All public forms electronically accessible
4) High-volume printed public information available on the Internet

31 December 1999
5) All Government publications on the Internet

30 April 2000
6) High-volume public transactions online

31 December 2001
7) All Government purchasing online
8) All transactions online

Source: Multimedia Victoria, Government Online Cabinet Progress Reporting System:
Guidelines, June 1999.

More recently, the Government of the United Kingdom has also started working with
local governments to help set targets and milestones and to measure progress in ESD. In
helping kick-start ESD at the local level, the UK government has proposed a series of
internal milestones. These milestones include designating an authority, giving staff
access to email, gaining shared access to data, and developing marketing plans. While
encouraging local government to develop more specific targets for ESD in each locality,
the UK government has thus far left those targets to local authorities.24 The one area
where the government is measuring progress is with an addition to the Best Value
performance framework – a performance framework that requires local authorities to
report on how they are continuously improving their services. Best Value performance
indicator 157 is focused on “the number and types of interactions that are enabled for
electronic delivery as a percentage of the types of interactions that are legally permissible
for electronic delivery.”25
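The arithmetic behind indicator 157 is straightforward to sketch. In the example below, the interaction types are invented for illustration, and the function name is ours, not part of the Best Value framework:

```python
# Illustrative calculation of Best Value performance indicator 157:
# interaction types enabled for electronic delivery as a percentage of
# the interaction types legally permissible for electronic delivery.

def bvpi_157(enabled: set[str], legally_permissible: set[str]) -> float:
    """Percentage of legally permissible interaction types enabled electronically."""
    if not legally_permissible:
        return 0.0
    # Only count enabled interactions that are also legally permissible.
    return 100.0 * len(enabled & legally_permissible) / len(legally_permissible)

permissible = {"pay parking ticket", "renew licence", "register business", "request record"}
enabled = {"pay parking ticket", "register business"}
print(bvpi_157(enabled, permissible))  # 50.0
```

Note that the denominator is what is legally permissible, not the authority’s full service catalogue, so services barred from electronic delivery do not drag the score down.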

Putting these tracking frameworks in a broader context, most of the major consulting
firms have drafted “maps” that track a government’s progress through stages of ESD
maturity. Examples include GartnerGroup’s Four Phases of e-government,26 Deloitte
Consulting’s Six Stages of e-government,27 KPMG’s five stage Evolution of a Portal,28
IBM’s Seven E-Government Leadership Milestones,29 Accenture’s five stage
eGovernment Continuum,30 and ONCE’s Six Steps of E-Government Evolution.31 Rather
than simply tracking the number of services online, these frameworks focus on the
transformational power of e-government and ESD, mapping electronic services on a
spectrum that runs from simple informational services to fully integrated transactional
services that have transformed government.

The Australian National Audit Office (ANAO) is using a similar model in assessing
whether government agencies will be ready to deliver all appropriate services by the end
of 2001. As Box 3 illustrates, the ANAO’s Stages of Internet Service Delivery give
overseers and the public a better sense for how advanced agencies are in delivering
services online.
Box 3: Australian National Audit Office Stages of Internet Service Delivery
The ANAO identified four stages of Internet service delivery. Of the services due to be in
place by 2001, the ANAO determined that:

• 52 per cent would be at Stage 1, at which an agency had a Website that published
information about itself and its services;

• 25 per cent would be at Stage 2, at which an agency allows Internet users to access the
agency database(s), and to browse, explore, and interact with that data;

• 21 per cent would be at Stage 3, at which an agency allows users access as in Stages 1
and 2 and also permits them to enter secure information, and engage in transactions with
the agency; and

• 2 per cent would be at Stage 4, at which, in addition to the level of access permitted at
Stage 3, the agency, with the user’s prior approval, shares with other government
agencies relevant information provided by that user with a view to providing a whole-of-
government integrated service.

Source: Australian National Audit Office, Electronic Service Delivery, Including
Internet Use, by Commonwealth Government Agencies, 1999.

These e-government maps that highlight the transformational nature of ESD also suggest
that the challenge for government is not simply to get services online, but to do it in a
way that fundamentally alters the relationship between government and society. If we
simply deliver services electronically without keeping the transformational opportunities
in mind we will end up with a disparate collection of online services that are unable to
integrate services across boundaries in the way citizens are coming to expect. It is critical
that in using measurement to advance the ESD agenda we do so in a way that emphasizes
and promotes activities and outcomes that support the kind of inter-program and inter-
jurisdictional collaboration required to achieve these transformational goals.

As Accenture’s study Rhetoric vs. Reality points out, governments are a long way from
achieving this vision. “Single sites, or portals, which allow citizens and businesses to
seamlessly interact with several government departments at one visit, have not yet
emerged as the dominant model.”32 While the breadth of service offerings is growing,
there is still not much depth.

Clearly these maps are simplified representations of the road from passive web presence
to a future transformed by fully integrated ESD. In fact, many would argue that this road
is anything but linear and that the move to later stages of maturity requires a distinctive
break with the past. While far from a precise measure of success, these tracking
frameworks are useful indicators of where governments stand in relation to their high-
level goals and with respect to private sector thought leadership. Without undertaking a
thorough examination of where different Canadian jurisdictions might lie on a map of
ESD maturity, it seems clear that different governments are at different stages of
development across the country. While governments are beginning to adopt a broader
perspective with respect to ESD, efforts to date have been largely centered on individual
applications and not on the kind of cross-boundary collaborative work that will be
required to seamlessly integrate service delivery and transform government.

2) Internal and External Benchmarking

Benchmarking – The process of improving performance by continuously
identifying, understanding, and adapting outstanding practices and processes
found inside and outside the organization.
APQC, Benchmarking: Leveraging Best-Practice Strategies, 1999.

The United Kingdom has been a leader in benchmarking its electronic service delivery
efforts. In March 2000, it released an interim report that outlined its strategy for
measuring progress.33 In this document the UK explicitly rejects a rigid scorecard
approach, noting that different political, cultural, and legislative environments make such
comparisons useless. Instead, it adopted a relatively qualitative approach.

Using a four-part template (below), the UK government periodically compares its
progress against that of the G7 and other leading nations. The template focuses on citizen
demand for electronic services (especially methods for determining demand), government
supply of electronic services (front-end applications), government capacity to deliver
electronic services (back-end capacity), and government processes to drive and manage
change. It is interesting to note that, in designing the framework, the UK explicitly
avoids any cost/benefit analysis of the investment in ESD, as it is deemed too early to
identify savings.

Demand (consultation with citizens and businesses)
- propensity to use ESD
- availability
- desired services
- adoption rates

Supply (electronic government services)
- service availability
- linking services enterprise-wide and across agency borders

Change (commitment and drivers of change)
- route maps
- monitoring progress
- organization
- targets

Capability (enabling government infrastructure)
- authentication
- data standards
- security
- privacy / records management

Using this template for the first time in July 2000, the UK focused on establishing what
others were doing, and comparing that with activities in the UK.34 The template’s four
categories structure the comparison, which provides an overview of ESD developments
in a variety of countries. Based on this qualitative survey, the report then identifies
lessons to support the UK’s ESD efforts.

Manitoba is in the early planning stages of a similar benchmarking effort. Like the UK,
Manitoba intends to focus its efforts on peer jurisdictions as well as “best of breed”
jurisdictions. Early drafts of Manitoba’s benchmarking scorecard consider global or
enterprise measures such as existing infrastructure and security measures, as well as
service-specific measures such as accessibility and functionality.

Member states of the European Union have also decided to use a benchmarking effort to
assess their ESD progress. Agreeing to a list of 20 basic public services (see Box 4), the
EU is measuring progress toward delivery of these services using a four-stage maturity
model: posting of information, one-way interaction, two-way interaction, and full online
transactions including delivery and payment.

Box 4: European Union’s Common List of Basic Public Services
Public Services for Citizens
1. Income taxes: declaration, notification of assessment
2. Job search services by labour offices
3. Social security contributions (3 out of the following 4):
§ unemployment benefits
§ family allowances
§ medical costs (reimbursement or direct settlement)
§ student grants
4. Personal documents (passport and driver's licence)
5. Car registration (new, used, and imported cars)
6. Application for building permit
7. Declaration to the police (e.g. in case of theft)
8. Public libraries (availability of catalogues, search tools)
9. Certificates (birth, marriage): request and delivery
10. Enrolment in higher education/university
11. Announcement of moving (change of address)
12. Health related services (e.g. interactive advice on the availability of services in
different hospitals; appointments for hospitals)

Public Services for Businesses:

1. Social contribution for employees
2. Corporation tax: declaration, notification
3. VAT: declaration, notification
4. Registration of a new company
5. Submission of data to statistical offices
6. Customs declarations
7. Environment-related permits (incl. reporting)
8. Public procurement

Source: Commission of the European Communities, eEurope 2002 Impact and Priorities:
Communication from the Commission to the Council and the European
Parliament, March 2001.

3) Report against Strategic Objectives

Another common framework for assessing progress and measuring ESD success is to
report against a set of strategic objectives. Jurisdictions adopting such strategies identify
a set of high-level objectives and then track proxy measures for each objective.

Australia’s Government OnLine team collects and analyzes data from periodic Online
Surveys submitted by agencies, assessing progress toward the eight strategic priorities set
out in Australia’s Government Online Strategy (see Appendix B). In addition to the
objective of getting all appropriate services online by 2001, these high-level priorities
include meeting established standards in the areas of privacy, authentication, metadata,
security, and accessibility. Proxy measures for these objectives include the percentage of
agencies compliant with W3C priority 1 standards for accessibility, the number of
services targeted at rural or regional Australia, and the percentage of government
suppliers paid electronically. It is worth noting that in undertaking this tracking exercise,
Australia chose to allow agencies to self-report their progress.

The Government of Ontario is in the planning stages of a similar performance
measurement strategy at the corporate level. Starting with the six objectives set out in
the report The Future is Here: A Progress Report on e-Government in Ontario, the
government is in the process of identifying appropriate performance measures (proxies)
for assessing progress toward these high-level objectives. Unlike Australia’s self-
reporting methodology, in which agencies report on their own progress, however,
Ontario’s effort will include both internal and external measures of progress (see
Appendix C).

4) Report to Legislative Body – Public Accountability

The most traditional framework for measuring and tracking success is to report progress
through an annual (or periodic) legislative process designed to maintain public
accountability. Such processes are often incorporated into the budget process, tying
strategic objectives to financial allocations.

In the United States, for example, agencies are required to submit an annual performance
plan as part of their budget submission.35 These plans contain performance goals and
indicators for the agency and are reviewed both by the executive Office of Management
and Budget (OMB) and by Congress. Performance indicators include both outcome and
output goals as well as external and internal goals. While not focused solely on ESD, in
2001 the OMB explicitly directed agency leaders to set goals for increasing online
procurement and electronic government. When used in this way, this performance
measurement structure serves to highlight whether ESD is helping achieve an agency’s
strategic objectives. Including ESD in the budget submission creates a clear tie
between resource allocation and results.

In addition to the annual performance plans, in October 2000 U.S. federal agencies also
submitted plans to the OMB outlining how they will meet specific ESD requirements set
out in the Government Paperwork Elimination Act (GPEA).36 In its guidance for the
implementation of GPEA, the OMB requires agencies to report progress against their
plan on an annual basis.37

5) Evaluation

Evaluation is the systematic assessment of the operation and/or outcomes of a
program or policy, compared to a set of explicit or implicit standards, as a means
of contributing to the improvement of the program or policy.
Carol H. Weiss, Evaluation, p. 4.

A classic program evaluation effort often begins by drawing a logic model or program
theory model that visually connects the activities or inputs of a program with its outputs
and outcomes. The logic model explicitly details how actions lead to the realization of
objectives. Using this visual model, the organization can isolate actions, outputs, and
outcomes, choosing which points in the model would be most valuable to measure in
assessing the progress and/or success of the program. While this methodology is very
useful for assessing program success, the challenge is to figure out what points in the
chain are most important to measure. This challenge is especially difficult when
outcomes depend on collaborative efforts.

With the help of KPMG, the Government of Canada is in the process of establishing an
evaluation framework. Based on a logic model of the Government Online (GOL)
program, the federal government has isolated the activities, outputs, and outcomes
(including client-level, department-level, and societal-level outcomes) that define success
for GOL. The model itself has proven to be extremely useful in understanding how
activities and outcomes are related. In choosing what aspects of the model to measure,
the government has explicitly decided not to measure activities or internal business
processes, maintaining that they can be assessed at the activity level by looking at whether
each activity was completed on time, within budget, and within scope. A decision was
also made to ignore societal-level outcomes in the measurement exercise since it would
be difficult to isolate GOL’s contribution to such outcomes. Instead, the measurement
focus is on client-level and department-level program outcomes and the program outputs
required to achieve these outcomes.

In reporting these measures, a high-level scorecard will be used that focuses on whether
outputs or outcomes are being achieved in the context of six client-oriented objectives:
responsiveness, convenience, ease-of-use, interactivity, security, and privacy.

[Table: scorecard matrix mapping each level of measurement (Outputs, Department-level Outcomes, Client-level Outcomes) against the six results categories (Responsiveness, Convenience, Ease-of-Use, Interactivity, Security, Privacy), with checkmarks indicating which combinations will be measured.]

6) Application-Specific Measures of Success

Each of the frameworks above focuses on measuring the progress and success of ESD from
an enterprise or corporate perspective, addressing the question “Is the government
succeeding with ESD?” A more common approach to measuring success is to focus on
the results achieved by individual applications.

Interviews with individuals involved in implementing ESD applications suggest that at
the application level, measures of success are focused in two areas: customer satisfaction
and usage (usually measured as take-up or adoption rates). While there are examples of
governments looking at productivity gains and cost savings, these are few and far
between.

Customer Satisfaction: Easily the most common measurement category for ESD
applications, customer satisfaction is attracting considerable government energy aimed at
finding out whether citizens are satisfied with the new service delivery channels. For
example, Service New Brunswick interviews by telephone one out of every ten citizens
who use its services. Similarly, Service Nova Scotia, Ontario Business Connects, and
British Columbia’s OneStop Business Registration ask each user to complete a web-based
survey after service delivery. In fact, customer satisfaction has become so important to
governments as a measure that a group of Canadian public sector employees developed
the Common Measurements Tool (CMT) as a vehicle to bring consistency to client
satisfaction measurement across time and between organizations.38 Recently an
electronic version of the CMT was developed and is being used in Ontario, Manitoba,
and a number of other jurisdictions. While public sector managers report relatively high
levels of satisfaction with specific ESD applications, they also admit that the
measurement tools they use are limited in scope. As we noted above, public opinion
research from Erin Research found that on a scale of 0 to 100, respondents rated the ease
of access to services delivered over the Internet at 62 – well below “walk-in” service and
equivalent to service by mail.39 If this is true then governments may be further from
customer service goals than their measures indicate.

Usage (Take-up or Adoption Rates): The most common way of measuring usage is to
calculate the number of transactions happening through electronic channels as a
percentage of all transactions – the take-up rate. These numbers can then be tracked over
time to see how quickly applications are adopted. In British Columbia they are even
breaking their adoption rate numbers down by community to better understand where
they need to focus their marketing attention.
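The take-up calculation itself is simple; a minimal Python sketch, with purely illustrative transaction counts (no figures here come from any actual jurisdiction), might look like:

```python
# Take-up (adoption) rate: electronic transactions as a percentage of all
# transactions, tracked over time. All counts below are hypothetical.
quarterly_counts = {
    "2000-Q3": {"electronic": 1200, "total": 48000},
    "2000-Q4": {"electronic": 2600, "total": 50000},
    "2001-Q1": {"electronic": 4100, "total": 51000},
}

def take_up_rate(electronic, total):
    """Return the percentage of transactions handled electronically."""
    return 100.0 * electronic / total

for quarter in sorted(quarterly_counts):
    counts = quarterly_counts[quarter]
    rate = take_up_rate(counts["electronic"], counts["total"])
    print(f"{quarter}: {rate:.1f}% of transactions online")
```

The same calculation can be run per community, as British Columbia does, simply by keeping a separate set of counts for each community.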

Productivity: While much less common, some jurisdictions are attempting to measure
productivity improvements. For example, in Ontario it is estimated that by using the
Ontario Business Connects platform to register a business, government can reduce
processing time from six weeks to twenty minutes, due in large part to reducing error
rates from nearly 50 percent to almost nothing.

Cost Savings: Cost savings are probably the most elusive (and some would argue
illusory) aspect of ESD to try to measure. Getting a clear understanding of the costs
associated with delivering services through different channels is very difficult. In Texas,
however, the state government recently experimented with activity-based costing (ABC)
with some moderate success. ABC is a methodology that traces the full range of direct
and indirect resources associated with delivering a service. While many of the savings
identified in the Texas pilot were savings in productivity – and were not easily
translatable into hard dollars – by accounting for all resources, including items such as
fringe benefits, overhead, and depreciation of equipment, ABC enabled the state to
identify hard dollar savings within the five pilot agencies. For example, the study
suggested that replacing the Department of Transportation’s call center with a web-based
system could save taxpayers more than $240,000 per year.
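As an illustration only – the function and every dollar figure below are hypothetical, not drawn from the Texas pilot – an ABC-style comparison of two delivery channels might be sketched as:

```python
# Activity-based costing (ABC) sketch: trace direct and indirect resources
# (fringe benefits, overhead, depreciation) to a per-transaction cost for
# each channel. All figures are invented for illustration.
def cost_per_transaction(direct_labour, fringe_rate, overhead,
                         depreciation, other_indirect, transactions):
    """Full cost per transaction, including indirect resource pools."""
    full_cost = (direct_labour * (1 + fringe_rate)
                 + overhead + depreciation + other_indirect)
    return full_cost / transactions

call_centre = cost_per_transaction(
    direct_labour=400_000, fringe_rate=0.25, overhead=120_000,
    depreciation=30_000, other_indirect=50_000, transactions=100_000)
web = cost_per_transaction(
    direct_labour=60_000, fringe_rate=0.25, overhead=40_000,
    depreciation=45_000, other_indirect=15_000, transactions=100_000)

annual_saving = (call_centre - web) * 100_000
print(f"call centre: ${call_centre:.2f}/txn, web: ${web:.2f}/txn")
print(f"projected annual saving: ${annual_saving:,.0f}")
```

The point of the exercise is the denominator and the indirect pools: only by charging fringe benefits, overhead, and depreciation to each channel does the comparison surface hard-dollar differences.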

“Total Value” Return on Investment: In Iowa, ESD projects are assessed in terms of the
benefits to both government and citizens. In addition to “hard” costs and benefits (e.g.
hardware and staff time), Iowa’s ROI analysis estimates costs and benefits associated
with factors that are more difficult to quantify (and often ignored), including risks to
citizen health, impacts on security and safety, and the time and energy required by
citizens in fulfilling their roles in the process. By calculating ROI in this way, Iowa gets
a more complete picture of the value of each project.
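A sketch of this kind of "total value" calculation follows; the function name and all amounts are invented for illustration, and Iowa's actual model is richer than this:

```python
# "Total value" ROI sketch: count both government-side and citizen-side
# costs and benefits, assigning rough dollar values to soft factors that
# conventional ROI ignores. All figures are hypothetical.
def total_value_roi(hard_benefits, soft_benefits, hard_costs, soft_costs):
    """Return ROI as net benefit divided by total cost."""
    total_benefit = hard_benefits + soft_benefits
    total_cost = hard_costs + soft_costs
    return (total_benefit - total_cost) / total_cost

roi = total_value_roi(
    hard_benefits=300_000,   # e.g. staff time saved
    soft_benefits=150_000,   # e.g. citizen travel and queueing time avoided
    hard_costs=200_000,      # e.g. hardware and development
    soft_costs=50_000)       # e.g. citizen learning curve, risk exposure
print(f"total-value ROI: {roi:.0%}")
```

Leaving out the two "soft" terms in this example would overstate the return, which is exactly the distortion the Iowa approach is designed to avoid.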

Unfortunately, efforts to measure anything beyond customer satisfaction and usage
remain rare.

The Measurement Gap: What are the Most Important Shortcomings in ESD
Measurement Today?

From our review of how jurisdictions around the world are defining and measuring
success in ESD, it is clear that the diverse activities involved in delivering services
through electronic channels constitute an extremely large and complex system of
interrelationships. Many different types of inputs are involved, including labor,
information infrastructure, and stakeholder inputs to a variety of government planning
processes. These inputs interact through complex processes – some run by the
government and some by contractors – to create the immediate outputs of the electronic
services themselves. These services, in turn, interact with a variety of forces in society to
generate outcomes such as service take-up and penetration rates as well as the end results
that most directly affect social values – productivity, fairness in how different individuals
and groups are treated, privacy, and security.
Figure 3: Simplified ESD Model
[Diagram: Inputs → Processes and Activities → Outputs → Outcomes and Values, with Costs (C) incurred at the upstream end and Benefits (B) realized at the downstream end.]

To manage complex systems we need to make estimates of the relative value to society of
various outcomes that can be created from the activities or resource-consuming options
we might pursue. The important judgment is to find the appropriate relationship between
benefits and costs, dedicating resources to produce desired outputs and outcomes.

A classic measurement problem in complex systems derives from the uncertain cause-and-
effect relationship between actions (Costs) and outcomes (Benefits). If we measure how
we are doing in achieving downstream values (outcome measures), we reduce uncertainty
about where we stand on achieving desired benefits, but we may not be able to estimate
the specific activities (C) that produced these results. Therefore, we may use such measures
to understand where we are collectively, but because too many intervening factors have
come into play, we cannot use them to hold individuals accountable for helping us get
there. When we go upstream (input/process measures) we get better measures of the
immediate results of particular activities and we can hold people accountable for
producing those results. Since the intermediate results are not where the value truly lies
(B), however, we still have a major problem of highly uncertain productivity estimates
(B/C).

In general, most measurement frameworks seek to manage these uncertainties through a
combination of upstream and downstream measures. This makes the measurement
frameworks themselves more complex, however, raising the cost of measurement and
diluting the value that comes from focusing attention on a few readily understandable
measures. The key is balance.

In that regard, how well balanced are the efforts to measure Canadian ESD efforts?

Let us start by summarizing our characterization of Canadian ESD measurement so far.
Most aggregate goals have been set in terms of service availability, penetration targets, or
phases in an expected progression of service characteristics. Measurement at the
disaggregated or application level has also focused heavily on customer service and
take-up rates. These are two dimensions of success, but certainly not the only dimensions
about which most citizens and governments care.

What we have not seen yet is much measurement of ESD inputs. How much is being
spent on developing and delivering ESD? We have also not seen much measurement of
processes, process integration, or process change. For example, are electronic services
requiring a greater reliance on outsourcing or greater-than-normal investments in
innovation? In terms of outcomes, we have not seen much measurement of the impacts of
ESD on productivity, internal rate of return (IRR), equity, privacy, or security.
Perhaps more important, we have not seen extensive efforts to measure the perceptions of
various stakeholders in the system (front-line workers, supervisors, overseers, clients, and
the general public). These measures can be extremely important. People are often the
most cost-effective measuring instruments for many points in the value chain and – more
important – social value is ultimately a human construct that depends on how people
react to the world, not just the state of the world itself. Measures that might be relevant
then are not only those of individual services or services in the aggregate, but measures of
trust in the competency of government and/or support for further investments in
electronic service delivery.

Are we moving toward public targets and milestones at the expense of other success
criteria? You only get what you measure, so are we only getting what we measure?

While little systemic measurement is available for many inputs, processes, outcomes, and
human perceptions, there is disparate research that yields evidence of significant
shortcomings in ESD performance overall. For example:

§ Research indicates that the digital divide is still wide.40
§ While there is some measurement of privacy and security, we see all-too-frequent
scandals or problems in those areas.
§ The UK Conservative Party released a policy statement suggesting the UK’s
emphasis on quantitative targets has hampered its efforts to re-engineer services.41
§ A study from the UK Local Government Association (LGA) argues that there has
been no significant movement toward reinvention, only automation.42 A report
from the Australian Auditor-General also notes the need to reassess business
processes.43
§ Polls indicate that many citizens feel disconnected from their government.44

In summary, there appears to be an uncertain relationship between the high-level goals
expressed by government in launching e-government initiatives and the activities
undertaken in pursuit of these goals. Isolated, program-specific actions, while valuable,
will not enable government in the aggregate to achieve the type of transformation that
e-government promises to promote. We need a measurement framework that focuses
attention on factors that will enable cooperation and collaboration across program,
jurisdictional, and sectoral boundaries. Granted, much of what we are measuring clearly
needs to be measured, and as momentum builds behind different movements, it is natural
and appropriate to look for simple measures to get things started and galvanize support.
We need targets such as "Y% online by X date." At the same time, the measurement of
complex systems requires a certain complexity, and Canadian ESD might benefit from
trying to fill out a balanced scorecard or matrix of measures a bit more thoroughly. That
is what we will explore below.

Filling in the Gaps: Some Options for the Future

What are the primary options for Canadian governments to consider for measuring and
assessing progress on electronic service delivery? In specifying these options, it is useful
to vary both what we measure and how we measure. There are a number of outcomes
and values we clearly care about as well as the causal chain that leads from input to
outcome. We can use different measurement instruments and different scales of
interpretation as we seek to understand the relevant cause-effect relationships. In varying
how we measure and interpret, however, we must remember that our resources are finite.

Thus, we can vary:
§ the outcomes or values to be measured (e.g. effectiveness, efficiency, privacy,
security, equity, democratic legitimacy, etc.),
§ the value chain steps to be measured (inputs, processes, outputs, outcomes),
§ the instruments (e.g., accounting records vs. participant surveys) and scales used
for comparison (versus goals, versus other service producers, versus a “maturity
model,” etc.), and
§ the level of effort invested (low, medium, high).

Figure 4: Measurement Typology
[Diagram: four dimensions – the outcomes or values to be measured, the value chain steps to be measured, the instruments and scales, and the level of effort – feed into the measurement framework, posing the question: what is the optimal “balanced scorecard”?]

Varying these factors in different ways to arrive at a measurement framework, the above
typology could produce hundreds of permutations. For the purpose of discussion and
debate, let us for now consider the following ten opportunities. These options are not
mutually exclusive and, in fact, often overlap. Furthermore, they are not set out as a
progression of options, but rather as a series of measurement opportunities to structure
the debate and discussion at Lac Carling V.

Figure 5: Ten Options for Measuring ESD Activities

1. Maintain the Status Quo

Vary the Outcomes or Values to be Measured
2. More Outcome/Value Measures

Vary the Value Chain Steps to be Measured
3. More Input/Process Measures
4. More Cross-Boundary Measures
5. More Innovation/Transformation Measures

Vary the Instruments and Scales
6. More Stakeholder Surveys
7. More Measurement Against Goals and Time Frames
8. More Competitive Benchmarking

Vary the Level of Effort
9. More Analysis of Feedback
10. Significant Increase in Resources and Level of Effort

1. Maintain the status quo. When it comes to measuring and assessing our electronic
service delivery activities, what is the “status quo”? How should we describe our present
level of effort? What are we doing to measure and assess results in terms of outcomes or
values increased (or decreased)? What are we doing to measure and assess our production
processes in terms of inputs, processes, and outputs? What measurement instruments and
what scales are we using to assess our progress and make decisions about the relative
priority of our future actions?

By answering these questions, we can describe the status quo or baseline against which
other measurement and assessment options can be analyzed. As we think about changes
in what we are doing, we can make comparisons against our measurement status quo. Is
what we are doing now good enough for the future or are there a significant measurement
opportunities out there to be addressed?

2. More outcome/value measures – Measurement of the outcomes and values affected by
ESD. While the stated goals of ESD range from improved service access and efficiency
to economic development and government saliency, most of today’s measurement
attention is focused on simple estimates of the availability of online services.
Recognizing that availability is necessary and is a good early measure of progress, what
might we do to probe more deeply into progress on the more fundamental results and
goals of ESD? For example:
• access. To what extent is ESD improving or at least changing the patterns of access to
government services? Here we should think not only of the fraction of government
services available online, but also of the fraction of interactions taking place online. Furthermore, what
fraction of those online interactions is taking place during hours when government
offices are closed or in languages not typically available in the local government
office? Can we estimate the savings in time or money that citizens are realizing by
using online services instead of taking time off from work or family life to handle
their government business through other channels? To what extent do citizens and
businesses perceive online services to be more valuable because of that accessibility?
In a “best of all possible worlds,” what fraction of governmental transactions might
we hope would be conducted end-to-end (not just available) online by 2005?
• efficiency. To what extent is ESD improving or changing the cost per unit of
government services? As a proxy measure, what portion of government services is
being delivered on a “self-service” (rather than a “staff-assisted”) basis, and what
might be a reasonable goal for self-service delivery by 2005? As a more thorough set
of efficiency measures, what would we learn from a series of activity-based-costing
studies of ESD in a variety of government (and non-government) settings? On a per-
capita basis, what would be the annual dollar value to Canadians if ESD were
“completely” implemented? From a stakeholder perspective, how do citizens (and
overseers) perceive and value the purported efficiencies of ESD?

Beyond access and efficiency, there are other important values at stake that could (and
should) be measured and assessed. We should consider more thoughtful and precise
measures of privacy, focusing both on citizen perceptions and also on measures such as
legislation proposed, lawsuits filed, judgments awarded, etc. Similar efforts could be
made to measure security. Would it be valuable to try to measure the "multiplier effect"
of more efficient government on the overall performance of the Canadian economy and
its capacity to create high-value jobs? What about measuring the impacts of ESD on
equity? Broader yet, what evidence can we assemble that links ESD to citizen perceptions
of governmental competence, legitimacy, and trust?

If we are motivated to invest in ESD for many reasons other than making services
available electronically – and we clearly are – then we need to gain a balanced scorecard
of the results being generated by our ESD activities. That will require us to collect more
results-oriented measures than we presently collect.

3. More input/process measures – Measurement of a diversity of inputs and activities
across the value chain. Much of our attention is now focused on the percentage of
services accessible electronically as an immediate output of ESD activities and as an
intermediate indicator of the value of those activities. Availability is clearly an important
link in the ESD value chain, but it needs to be augmented by, and balanced against, other
measures of the inputs, processes, and outputs involved in producing and distributing
government services electronically.

It may be particularly important to get more precise about the estimated benefit/cost
(B/C) ratios for electronic services, where the benefits include all the measurable savings
to clients, to taxpayers, and to production units. We need to measure outcomes and
values, but we also need to understand the relationship between ESD activities and those
outcomes. To do this kind of assessment we will need better estimates than we presently
have about the dollars invested in electronic services, going beyond the typical
contracting and out-of-pocket costs to include the full costs of resources allocated for
training, operations, maintenance, etc. Measuring costs may not have been a priority
when we were first getting started with ESD since the dollar figures involved were small.
But the costs are growing now, and if the overall economy continues to weaken,
governments will be pressured to cut back on all new programs that cannot prove their
worth.
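The full-cost B/C calculation described above can be sketched in a few lines. All dollar figures and cost categories below are invented for illustration; the point is that the denominator must go beyond contracting costs to include training, operations, and maintenance:

```python
# Illustrative full-cost benefit/cost (B/C) ratio for an electronic
# service. Every figure and category name here is hypothetical.

def benefit_cost_ratio(benefits, costs):
    """Sum benefits across stakeholder groups and costs across all
    resource categories, then return the B/C ratio."""
    return sum(benefits.values()) / sum(costs.values())

# Benefits measured for clients, taxpayers, and production units.
benefits = {"client_savings": 400_000,
            "taxpayer_savings": 250_000,
            "production_savings": 150_000}

# Full costs: not just contracting/out-of-pocket, but also training,
# operations, and maintenance.
costs = {"contracting": 300_000,
         "training": 80_000,
         "operations": 90_000,
         "maintenance": 30_000}

print(benefit_cost_ratio(benefits, costs))  # 1.6
```

Note how the ratio would look misleadingly favourable (800,000 / 300,000 ≈ 2.7) if only the contracting line were counted as cost.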

4. More “cross-boundary” measures – Measurement of activities to integrate services
across program, agency, jurisdictional, and even public-private boundaries. This is the
emerging set of activities that seems to be creating the most important challenges for
managing ESD – and will inevitably prove critical to future ESD development. Early
experiments with putting services online one-at-a-time are beginning to give way to
broader efforts that coordinate service delivery across organizational and jurisdictional
boundaries. Future success will depend on our ability to manage strategic alliances.

How should we be measuring these developments? What information and feedback will
help assess our progress on the “integration agenda”? At one level, we should track the
resources we are investing in multi-agency and public-private collaborations.
What are the trends? Are we moving towards more outsourcing, leaving government with
the work of “steering, not rowing”? How many partners do we have? Even more
important than the number of partners involved, however, is the quality of the
involvement. As strategic alliances, these collaborative efforts will only work if all
partners are committed to achieving the same goals. Are the goals of each party
compatible? What happens when benefits that are important to one or more partners fail
to materialize, especially when these benefits are explicitly set out in contracts or
negotiated agreements? What are we learning about resolving turf wars, which are more
difficult to resolve through multi-agency negotiation than through authoritative decisions
by “the boss”? Are all the partners engaged and offering meaningful contributions? Are
the corporate cultures blending? As we move further into a world of cross-organizational
collaboration, what is happening to project success rates? If success rates are falling, is
that cause for concern or simply an indication of par for a more difficult course?

5. More innovation/transformation measures – Measurement and assessment of how we
are adapting to demands for new principles of control and “best practice.” Many leaders
now believe that we are entering into a new phase of ESD, a phase where we recognize
ESD projects cannot be entirely controlled within individual government agencies.
Instead, multi-organizational projects will require negotiation across the boundaries that
separate programs and/or agencies and/or jurisdictions and/or public and private sector
institutions. These multi-organizational projects may depend heavily on the presence of
infrastructure that supports coordination through accepted standards rather than top-down
imposition of authority. They may also require new forms of “best practice” that will not
be obvious to those who have been successful with the internally-oriented projects that
have dominated previously. In the end, however, they will rely most on the ability to
innovate in order to solve problems and achieve goals in a collaborative environment.

Given the importance of innovation, we should be tracking the resources invested in
innovation and the downstream benefits that result from these investments. We need a
better understanding of how to maximize R&D and other innovations-oriented
investments in ESD.

Furthermore, when systems are being radically reconfigured and reinvented the
measurement and control process is extremely important. Parties within the system need
to understand the goals they are pursuing and whether their current (and often unfamiliar)
activities are moving them toward these goals. When the overall character of the system
is changing, however, the feedback and control metrics may also need to change. For
example, in the Lac Carling V Synthesis Paper, it is suggested we focus on developing
relationship management, information management, and transaction management as core
competencies. Should we be measuring our progress in these areas?

6. More stakeholder surveys – to understand and explore perceptions of multiple aspects
of the electronic service system. We do a little bit with surveys now to track responses to
ESD, but not nearly as much as is done by private corporations trying to track their
markets. We do not even do as much survey work for ESD as we do for many political
and governmental issues of arguably less societal importance, including deciding what
phrase to use or what stance to take on an issue that may turn an election campaign.

The broad movement towards electronic service delivery will undoubtedly be more
important to long-term social welfare than many of the issues that are currently the
subject of intensive polling and surveys. What would it mean to the ESD movement to
take survey work seriously and to systematically probe the views of producers (front-line
workers, supervisors, senior management), consumers (citizens and the general
public/taxpayers), and overseers (elected officials and the press)?

Surveys can be an extremely cost-effective measurement tool because they can conserve
costs through sampling techniques. They can also be structured to leverage their costs
over time if a core structure of questions allows the database to be deepened and
extended through subsequent surveys. Stakeholder surveys can provide critical
information on the perceived value of results. In addition, they can provide equally
critical information about implementation as they help to measure the perceptions and
feasibility of particular projects. Drawing on the UK’s “People’s Panel” example, would
it be valuable to create a "citizen panel" of substantial size to serve as focus group, test
market, and outcome evaluator of our ESD work? Would it be possible to talk a
foundation into providing a scientifically designed set of survey questions that multiple
organizations could utilize as ESD is rolled out over the coming years?
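The cost-conserving power of sampling noted above can be made concrete. Under the standard sample-size formula for estimating a proportion (an assumption we add here; the paper does not specify a method), the required sample is essentially independent of population size for large populations:

```python
# A minimal sketch of why sampling makes surveys cost-effective.
# Uses the standard formula n = z^2 * p * (1 - p) / e^2, with p = 0.5
# as the most conservative assumption about the true proportion.
import math

def sample_size(margin_of_error, z=1.96, p=0.5):
    """Minimum sample size for estimating a proportion at the given
    margin of error (z = 1.96 corresponds to a 95% confidence level)."""
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# Roughly 1,100 respondents suffice for +/-3% at 95% confidence,
# whether the population surveyed is one city or all of Canada.
print(sample_size(0.03))  # 1068
print(sample_size(0.05))  # 385
```

This is why a well-designed core battery of questions, reused across jurisdictions and over time, can be deepened at modest incremental cost.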

Stakeholder surveys, of course, are not the only instruments for collecting feedback on
our ESD systems. We can use other instruments such as accounting systems and
performance measurement systems. In looking at the various data collection instruments
at our disposal, however, one of the most cost-effective instruments seems to be
stakeholder surveys. This instrument should be used to collect views from front-line
workers, managers, and overseers as well as from individual citizens.

7. More measurement against goals and time frames. A critical element of all measuring
systems is not just the instrument that takes the measures, but the scale against which the
measures are reported and judgments made. A “15” on a test means something different
if it is considered an “F” rather than an “A.”

The most obvious and readily available scale is comparing current performance against
performance of the same organization at an earlier period of time. For ESD this
comparison can be difficult because many programs do not have much of a history
against which to report. Nevertheless, planning for such reports should be a priority.

These comparisons become more important when the “expected” pattern of performance
is not “what you did last time,” but some “normal growth” above what you did last time.
There are many “maturity models” for judging performance when growth is expected,
and many of the consulting firms have developed such models for ESD.

In all of these comparisons there is a search for meaning, which is usually a search for
ways to improve performance in the future. The most explicit and straightforward of
these comparisons occurs when specific targets have been set for performance. While
many governments have set such targets (100% E by 2003!), the goal-setting process has
often failed to penetrate very deeply. While it is possible to have too many goals as well
as too few, the ESD movement in general might improve its ability to assess progress by
working to analyze and define a broader and more balanced set of goals.

8. More competitive benchmarking. How are others doing this work and – in particular –
how are the “best performers” in the world doing it? Governments do a certain amount
of benchmarking, but might learn a great deal from more aggressive benchmarking on
ESD issues and opportunities. Is the EU’s benchmarking model, which tracks the
availability of services, a useful one to copy? Can we expand our
benchmarking effort to look at process or value measures as well? Developing broad
scorecards on electronic service delivery and publishing the results will certainly raise the
competitive stakes. While the need for collaboration suggests that too much competition
could be detrimental, we should not overlook the power of competition to push us
forward.

9. More analysis – More effort to use the feedback we get from measures and assessments
in the politics and management of government services. Feedback is not valuable in and
of itself, but rather to the extent that it informs analysis and choices of future activities.
To be useful in this way, it must be inserted into powerful decision-making processes.

In this context, how well is the feedback available to the ESD movement being used
today and how could it be used better? The primary efforts to use performance measures
in governmental decision making tend to make them a requirement of the budgeting or
strategic planning process. We are beginning to see such requirements in governmental
ESD work – for example, the Government Performance and Results Act and its (less than
stellar) implementation within the U.S. What could be done to insert feedback measures
more forcefully into government decision making and/or make such measures more
visible and influential with the public?

10. Significant Increase in Resources and Level of Effort – Perhaps a doubling or tripling
of present levels of investment in electronic services measurement. This option would
involve a combination of all the options above. But what kind of expanded effort would
be reasonable in terms of clarity and accountability gained for resources expended?
Would this be a huge and wasteful effort, or a reasonable cost given
the uncertainties that might be cleared up and the size of the overall ESD investments
being made? In many complex systems, the resources consumed in the control loops are
an extremely small fraction of overall system resources. When your house is being heated
inefficiently, it is often more efficient to invest in better and more strategically placed
thermostats than in trying to improve the efficiency of the burner. Is this the case for
ESD as well – would our time, money, and energy be better spent with more
measurement? What would we do if we took the measurement and feedback process
truly seriously?

11. Some other option. After all, there is always SOME other option.

The above is a starter list to encourage thinking about measurement and assessment of
government ESD in Canada. We need to balance resources invested in direct activities
against those invested in measurement for feedback and control. We need to measure the
right processes and outcomes while using the right instruments and evaluation scales. So
far we have not had to worry too much about measurement problems. Everyone has been
readily convinced that ESD offers a good set of opportunities. Now, however, as the
movement gets bigger and more costly, and as economic growth is growing more
problematic, we are sailing in more difficult waters. We clearly need better tools for
navigation and control.

Evaluating ESD Measurement Options

As we think about these and other options for improving ESD measurement, what are the
primary values at stake and the criteria to be used in making decisions? How can we
evaluate the above ten (or eleven) options and make practical decisions about our
investment in measurement?

To begin, consider the classic criteria of effectiveness, efficiency, and implementation
difficulty.

Effectiveness – or the potential value to be created through improved systems control.
To what degree will we improve ESD performance by investing in various measurement
and assessment options? In general we might expect the value or effectiveness of
feedback to improve when:
• The measures address important upcoming issues, such as proposals to expand
ESD efforts, proposals to take significant costs out of government service production
processes, and/or proposals to integrate services across organizational and political
boundaries.
• The measures address issues where the results may be uncertain. For example,
while many people believe that self-service should be expanded “where appropriate,”
there is substantial uncertainty over the degree to which self-service approaches
might apply for government services over the next decade or so. Uncertainty is also
related to the timeliness of feedback (earlier is better) and to its accuracy (more
accurate is better, but only to the point that it is “worth it” in terms of trade-offs with
other criteria such as timeliness).
• The measures are readily understood and accepted by key stakeholder groups,
such as key business and client groups, front-line public managers, legislative
overseers, and the general public.

Efficiency – or the degree to which control will be improved relative to the costs required
for improvement. Efficiency can be improved by establishing standards so that
measurements taken independently can be aggregated into more comprehensive
measures. Standards also allow measures to be compared across time and jurisdictional
boundaries. Efficiency, from the perspective of the government, may also be improved
by giving incentives to other institutions to do the measurement (as when various “good
government” groups do the surveys or other measurement work). We are normally
interested not only in the immediate cost of getting a measurement program underway,
but also in the long-term costs of keeping it going.

Difficulty – or the degree to which implementation will be risky. Difficulty tends to
increase, for example, to the extent that projects are more costly, take more time, are
more complex and confusing, and/or face powerful opposition.

Using the above criteria, we have tried to assess each of the first ten measurement
options on a quantitative scale of 1 (low) to 5 (high). Such a scale is obviously not
“objective” in the sense that reasonable people will disagree with our numbers. That
said, we think that the results may be useful to provoke important dialog and learning. If
nothing else, the framework provides a systematic way of organizing the judgment calls
that need to be made in designing investments in ESD measurement, and it serves as the
basis of our conclusions and recommendations. Our results are based on our own
research and perspectives and can be summarized as follows.

Figure 6.
Preliminary Evaluation of the Value, Efficiency, and Difficulty
of Implementing Eleven Options for Measuring and Assessing
Canadian Electronic Service Delivery Activities

Measurement Options V E D Comments
1. Status Quo 2 4 1 Few measures other than “availability”
2. More outcome/value measures 4 3 2 Many stated values overlooked today
3. More input/process measures 3 2 4 Complex and controversial, esp. with collaboration
4. More x-boundary measures 4 3 4 Difficult but where the decisions are being made
5. More transformation measures 4 3 4 Much as above as innovation is now x-boundary
6. More stakeholder surveys 5 4 2 To assess tactics as well as strategic goals
7. More goals and targets 3 4 4 Not very costly but can be controversial
8. More competitive benchmarks 4 2 4 Externally oriented to provide light and heat
9. More analysis of feedback 3 3 4 Inserting measures into politics is difficult
10. Increased level of effort 5 1 4 “Next step” measures first, THEN a “leap”?
11. Other??? ? ? ?

Note: V = value or effectiveness on a 1 to 5 scale (1 = low)
E = efficiency on a 1 to 5 scale (1 = low)
D = implementation difficulty on a 1 to 5 scale (1 = low)
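As one illustration of how readers might work with Figure 6, a simple composite score such as V + E − D can rank the options. The formula is our own assumption, not part of the paper's framework, and readers may weight the criteria differently:

```python
# One illustrative way to combine the three Figure 6 scores into a
# single priority ranking. The composite (V + E - D) is our own
# assumption; it rewards value and efficiency and penalizes difficulty.

options = {  # option: (value, efficiency, difficulty), each scored 1-5
    "1. Status quo":                 (2, 4, 1),
    "2. Outcome/value measures":     (4, 3, 2),
    "3. Input/process measures":     (3, 2, 4),
    "4. Cross-boundary measures":    (4, 3, 4),
    "5. Transformation measures":    (4, 3, 4),
    "6. Stakeholder surveys":        (5, 4, 2),
    "7. Goals and targets":          (3, 4, 4),
    "8. Competitive benchmarks":     (4, 2, 4),
    "9. Analysis of feedback":       (3, 3, 4),
    "10. Increased level of effort": (5, 1, 4),
}

def priority(scores):
    value, efficiency, difficulty = scores
    return value + efficiency - difficulty

ranked = sorted(options, key=lambda name: priority(options[name]),
                reverse=True)
print(ranked[0])  # "6. Stakeholder surveys" tops this simple ranking
```

Even this crude composite happens to agree with the paper's own conclusion that stakeholder surveys are the top measurement priority.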

Conclusions and Recommendations

Using the evaluations presented above, how can we characterize present patterns of ESD
measurement and assessment? What sorts of discussion and issues should we focus on
most closely at Lac Carling V? Consider the following.

1. ESD measurement so far has been modest but probably adequate – what we have
done has focused heavily on the availability of electronic services. In getting the ESD
movement started, measurement has frankly not been all that important. We have enjoyed
a broad consensus that getting started has been a good thing. We have used targets to
build support and momentum, with most of the measurement activity focused on the
breadth of services and not on the depth or richness of the services.

Recently, however, measurement and assessment are becoming more important. This is
because of the growing size of ESD investments and the growing level of uncertainty
about how to manage a large array of such investments wisely. Up to now, ESD budgets
have been rather small, and we have largely been putting in place the infrastructure and
agency-by-agency projects that governments have “known how to do.” Now, however,
investments are getting bigger. Perhaps more important, the new ESD investments
require difficult “cross-boundary” work where we have less experience and knowledge
about how to succeed. Given the stakes and uncertainties, we need better feedback to
keep our ESD activities under control. Tracking the percentage of services made
available electronically is a good thing to do, but it is not nearly enough to provide the
feedback and assessment we now need for reliable progress.

2. A top measurement priority should be to develop good stakeholder surveys as a
core ESD infrastructure. These should be designed so that individually administered
surveys can be aggregated across time and across boundaries. In this way individual
agencies can measure their own performance while contributing to and benefiting from a
growing pool of data that will roll up into larger and longer-term measures of the
relationships between activities and results. This survey work should also help with the
implementation of ESD work. If we stay close to stakeholder perceptions, we give
ourselves the best possible chance for maintaining public support as we move forward.

The surveys to be developed should:
• Be carefully designed. It is easy to bias results through suggestive wording.
The design of a battery of questions to be used as a “measurement
infrastructure” should be done with care by qualified experts. It should be
done so that, to the extent possible, it could be used by programs, agencies,
jurisdictions, and even be harmonized with surveys used by public and private
sector e-service providers around the world. The questions also need to be
carefully designed so as to reach across program delivery boundaries, giving
insight into the depth of service integration as well as the breadth of online
availability.
• Hit key stakeholder groups. We will ultimately need a very large sample in
order to hit the breakdowns most useful for ESD design and implementation.
We can, however, build to this size through incremental smaller surveys if
there is consistency in the questions and procedures used. We should
ultimately include stakeholders such as: service clients (hitting diverse
demographics and services), front-line workers (for various services), field
managers, senior managers, executive and legislative overseers and staff,
taxpayers, public interest associations and groups, the press, and the general
public.
• Hit perceptions of key results. How effective are services and how effective
are electronic services in comparison to services through other channels? How
accessible? How efficient? Views and concerns about privacy? About
security? About the competency and legitimacy of government services and
the governance process associated with the move to e-government?
• Hit perceptions of key processes. What do stakeholders see happening in
terms of the size of investments in ESD? The need for innovation? The need
for cross-boundary integration?

3. A “next step” priority is to measure not only the availability of electronic services,
but also their impacts on valued outcomes related to access, efficiency, privacy,
security, and – to the extent possible – even governmental legitimacy. Better
measures of results are especially important as the investment stakes get higher and the
economic and project environments get riskier. We have garnered some enthusiastic
early support for ESD because people are sure it will improve access and efficiency. But
does it? And how can we make the case?

Some of this measurement should be done as stakeholder surveys (see above). But other
measurement will also be important. If carried out on a sample rather than a system-wide
basis, new measurements could greatly improve the feedback needed for guidance yet not
be very costly. These efforts would not collect measures from all agencies, but from a
well-chosen sample. As we discussed in the beginning of this paper, the measures chosen
by each jurisdiction will necessarily vary according to the values and
objectives of its society. Nevertheless, some possible measures could include:
• Measure access. Use samples of different functions and transactions to measure the
volume and percentage of transactions actually taking place through the new
electronic channels in comparison with traditional face-to-face and other offerings.
Are changes in access changing the total penetration rates of government services?
Are they changing the demographic patterns of service use? As a simple but possibly
telling number, do they significantly change the portion of government transactions
that occur outside of normal “government hours?”
• Measure service quality and efficiency. Design traditional time and motion studies
and/or participant observation studies to compare electronic versus other channels
with respect to: the unit cost of services to the government, the internal rate of return
(IRR), the cost to the citizen, the response time to citizen requests, and error rates.
Other indicators of cost and quality such as the migration of users from one channel
to another or the integration of service delivery channels might also prove valuable.
Taking both quality and cost into consideration, and using rigorous activity-based
costing methodologies, what can we say about changes in the unit costs of equivalent
quality services as the shift is made to electronic delivery?
• Measure service privacy and security. This should be pursued through stakeholder
surveys (perceived privacy and security), and also through other modalities. Efforts
should be made to measure changes in indicators of privacy and security as electronic
channels are utilized – for example, tracking the number and seriousness of reported
incidents of privacy and security violations or identifying the number of times privacy
and security are augmented through the development of electronic services. In
addition, efforts should be made to measure the relationship between investments in
protection and their effectiveness. Given the motivation of the parties involved – both
those committing privacy and/or security violations and those being violated – it can
obviously be difficult to get good information. But we need to improve the ratio of
reality to rhetoric as these issues get discussed.
• Measure governmental legitimacy. As with the other values at stake, surveys to gather
the perceptions of various parties can be extremely valuable. A more in-depth
probing and monitoring of changing perceptions of governmental authority and
politics will also be needed to ground new e-government initiatives in a well-
informed reflection of the will of the people. To a certain extent, values such as
governmental legitimacy are political issues and should be the focus of political
overseers. While we hope more political overseers get involved with ESD and drive
these issues, governmental legitimacy should be an issue of concern to every public
servant.
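As a sketch of the first measure above (access), the share of transactions occurring outside normal government hours could be computed directly from transaction logs. The records, channel names, and 9-to-5 definition of "government hours" below are all hypothetical:

```python
# Hedged sketch of one "measure access" indicator: the share of
# transactions completed outside normal government hours, by channel.
# All transaction records and the 9-to-5 window are invented.
from datetime import datetime

transactions = [  # (channel, timestamp) - hypothetical sample records
    ("online",  datetime(2001, 4, 20, 22, 15)),  # Friday evening
    ("online",  datetime(2001, 4, 21, 9, 30)),   # Saturday
    ("online",  datetime(2001, 4, 23, 10, 0)),   # Monday, office hours
    ("counter", datetime(2001, 4, 23, 11, 45)),
    ("counter", datetime(2001, 4, 24, 15, 20)),
]

def outside_office_hours(ts, open_hour=9, close_hour=17):
    """True if the transaction occurred on a weekend or outside the
    assumed 9-to-5 window of 'government hours'."""
    weekend = ts.weekday() >= 5  # 5 = Saturday, 6 = Sunday
    return weekend or not (open_hour <= ts.hour < close_hour)

def off_hours_share(records, channel):
    """Fraction of a channel's transactions made outside office hours."""
    hits = [ts for ch, ts in records if ch == channel]
    return sum(outside_office_hours(ts) for ts in hits) / len(hits)

print(off_hours_share(transactions, "online"))   # 2 of 3, about 0.67
print(off_hours_share(transactions, "counter"))  # 0 of 2 -> 0.0
```

A significant off-hours share for the electronic channel would be exactly the kind of "simple but possibly telling number" the access bullet calls for.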

4. Search for feedback to guide the growing emphasis on cross-boundary
collaborations and innovations that may require fundamentally new means of
governance and control. The consensus view is that we are increasingly headed into
ESD territory where the interesting projects are cross-agency, cross-jurisdictional, and
even cross-sectoral. These projects depend on coordination through collaboration and
negotiation rather than reliance on authoritative conflict resolution and decision making.
The old management methods and the old control measures may no longer work very
well. We need to learn as much as we can about how these projects are faring, and
whether or not we need to develop a new set of best practices to control such projects.
To begin, how much are we investing in cross-boundary work and what are we gaining
from these initiatives? How are strategic alliances learning to work together? When we
provide the kind of “one-stop” integration that cross-boundary work makes possible, how
valuable is the new channel in comparison to what was available before?

Note that we are now playing a rather new game that relies much more on innovation
than was true for governments in the past. If this is true, we need to measure more
assiduously than before the size of the investments we are making in innovation. Our
innovations portfolio may be the most important part of our budget, even if it is a
relatively small portion overall. We need to develop the knowledge bases that will let the
“fast followers” of government move faster and more effectively in diffusing
authentically better new practices.

None of this is readily condensed into a single measure of success. This exercise is NOT
like saying “100% E by 2003!” Rather, we need to assemble evidence to make a reasoned
assessment of how the new investments are faring. It is a measurement task, but of a
complex and often non-quantitative sort.

5. Don’t avoid the unpleasant stuff: go for more benchmarking. While the measures
described above should be extremely valuable to the public sector community as a whole,
they are often based on the kind of samples and studies that avoid uncomfortable
comparisons of one program or ministry against another. If so, they will not be enough.
We need feedback for individual programs and even individual managers as well as the
public sector community overall. To get those measures, we should focus on the kind of
comparative data that is required for a rational assessment of performance and
accountability. Benchmarking is necessary to help drive progress. While much of what
we will do together in deploying ESD can and will be done collaboratively, we also need
to make the “race” a bit more visible. Comparative data and benchmarking for ESD are
required to help overcome complacency and resistance to change. There are, of course, a
number of different ways such benchmarking could be structured. We could focus on
Canada’s progress versus other jurisdictions, provincial progress versus other
jurisdictions, municipal progress versus other jurisdictions, agency progress versus other
jurisdictions, or any combination thereof. What is important will be finding comparative
measures to help us move forward with ESD, leveraging the power of competition in an
environment that demands cooperation and collaboration.

6. Beyond the obvious next steps, maybe we need a very substantial increase in
measurement and assessment efforts? After a few more years of improving
measurement through high-priority steps like those above, we may well be ready
to put a much larger effort into measurement. After all, measurement is cheap,
especially if it buys better control.

* * * *

As we mentioned in the beginning, the purpose of this paper has not been to assess whether
Canada is succeeding with ESD, to gauge how different jurisdictions in Canada compare
with one another, or to judge how Canada compares to other jurisdictions. Instead, we
have offered a starting point for discussing how we can and should define and measure
progress on the ESD agenda. Our goal has been to encourage debate and discussion at
Lac Carling V and even to assist the group in developing specific actions to follow up
after the conference. It is with this thought that we turn our attention to Lac Carling V.

Appendix A:
Recent ESD Highlights from Canada’s Provincial and Territorial Governments

British Columbia:
- Launched BC Connects – first phase of ESD portal
- Integrating municipal governments in OneStop business registration
- Soon to deliver ability to accept credit card payments online

Alberta:
- Launched One-Window Initiative to coordinate service delivery across boundaries
- Extended Alberta wellnet to link health care community together

Saskatchewan:
- Launched citizen-centered portal re-organized by functionality
- Finalized government online strategy with goals to have all forms available
electronically by 2002 and 90% of transactions by 2004
- Implemented “shopping cart” functionality and credit card transactions

Manitoba:
- Launched online personal property registry
- Enabled students to calculate eligibility for financial assistance and apply online
- Enabled parents to calculate day care subsidy; developing a database to search for
available day care spaces

Ontario:
- Set a goal of 2003 to be a world leader in delivering services online
- Issued an RFP for an Integrated Service Delivery platform for individuals
- Launched E-Laws to provide online access to up-to-date provincial laws

Quebec:
- Passed a law mandating the modernization of public services to improve service
quality
- Dedicated resources to connect families and businesses to the information economy
- Focusing on security (especially PKI) and looking toward a "smart" health card

New Brunswick:
- Launched land title registry online
- Launched motor vehicle registrations online
- Implemented an e-commerce suite that includes “shopping cart” style functionality

Nova Scotia:
- Launched online motor vehicle renewal; includes real-time access to survey feedback
- Launched business registration and online procurement information
- Modeled Service Nova Scotia site to offer seamless citizen-centered access to
services

Prince Edward Island:
- Went live with Canada's first fully integrated justice system
- Connected homes of terminally ill patients to a nursing station at a local hospital
(the Telehospice project), allowing patients to spend more time at home during their
last days while vital data are monitored, supplemented with videoconference facilities
- Launched applications to renew motor vehicle registration and report potholes

Newfoundland:
- Launched motor vehicle registration and traffic payment transaction engine that is
integrated with government backend as well as the Royal Bank backend
- Enabled students to apply for financial aid online

Nunavut:
- Completed Phase 1 of IT strategy in which IT functions were centralized and
enterprise applications launched in a stable environment. Phase 2 will review
inventory and programs in order to prioritize future development
- Established a task force to examine broadband access in Nunavut
- Provided Inuktitut syllabics online as a first step toward making Inuktitut the working
language of government

Northwest Territories:
- Focused on developing telemedicine and distance education capacity
- Using Wire North Initiative to bring stakeholders together

Yukon:
- Redesigned web site with citizen-centered functional index
- Posted single change of address form (not yet interactive)
- Working with federal, municipal, and first nations governments to integrate telephone
listings as a first step toward stronger citizen-centered integration

Appendix B: Australia’s Eight Strategic Priorities

At the centre of the strategy is the commitment from the Prime Minister's 1997 Investing
for Growth statement: "We will deliver all appropriate Government services online via the
Internet by 2001." Arranged around this central commitment are eight strategic priorities:

Strategic Priority 1: Agencies to take full advantage of the opportunities the Internet provides
Strategic Priority 2: Ensure the enablers are in place
Strategic Priority 3: Enhance Government Online services in regional Australia
Strategic Priority 4: Enhance IT industry development impact of Government Online initiatives
Strategic Priority 5: Government business operations to go online
Strategic Priority 6: Monitor best practice and progress
Strategic Priority 7: Facilitate cross-agency services
Strategic Priority 8: Communicate with stakeholders

Proxy measures being tracked include:

Agencies to take full advantage of the opportunities the Internet provides
- percentage of services available online
- types of services online (e.g. static information, feedback, transactions)
- complexity of services available online
- methods of establishing client needs and preferences
- methods of marketing and promoting ESD
- types of information available online
- percentage of government information online (as required by law)
- percentage of government forms available online

Ensure the enablers are in place
- percentage of agencies using authentication and encryption technology
- percentage of agencies compliant with privacy guidelines
- percentage of agencies compliant with security standards
- percentage of agencies compliant with metadata standards
- percentage of agencies compliant with information publication standards
- percentage of agencies compliant with record keeping standards
- percentage of agencies compliant with W3C priority 1 standards for accessibility

Enhance Government Online services in regional Australia
- number of services targeted at rural or regional Australia
- percentage of rurally-targeted services that are integrated with other services
- complexity level of services targeted at rural areas

Enhance IT industry development impact of Government Online initiatives
- percentage of agencies that will require help from the private sector

Government business operations to go online
- percentage of government suppliers paid electronically
- percentage of payments (by volume) made electronically
- percentage of suppliers ordered from electronically

Monitor best practice and progress

Facilitate cross-agency services
- percentage of agencies delivering at least one integrated service with another agency
- percentage of online services that are integrated
- percentage of agencies that have identified opportunities for integration

Communicate with stakeholders
- qualitative list of events and activities aimed at communicating Government Online
objectives
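Most of these proxy measures are simple ratios computed over a service inventory. As a purely illustrative sketch (the inventory format, field names, and figures below are invented for illustration, not drawn from the Australian data), two of the measures might be computed like this:

```python
# Hypothetical service inventory; fields and values are illustrative only.
services = [
    {"name": "licence renewal",   "online": True,  "level": "transaction"},
    {"name": "land title search", "online": True,  "level": "static"},
    {"name": "benefit claim",     "online": False, "level": None},
    {"name": "tax filing",        "online": True,  "level": "transaction"},
]

def pct_online(inventory):
    """Percentage of services available online."""
    return 100.0 * sum(s["online"] for s in inventory) / len(inventory)

def by_complexity(inventory):
    """Count online services by type (e.g. static information vs. transactions)."""
    counts = {}
    for s in inventory:
        if s["online"]:
            counts[s["level"]] = counts.get(s["level"], 0) + 1
    return counts

print(pct_online(services))      # 75.0
print(by_complexity(services))   # {'transaction': 2, 'static': 1}
```

The measures themselves are trivial arithmetic; the hard and expensive part, as the Australian reports acknowledge, is maintaining an accurate, agency-by-agency inventory to compute them over.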

Appendix C: Ontario’s Six Objectives

Enhanced services
- timeliness, reliability, accuracy
- client-centric streamlined processing harmonized across programs/jurisdictions
- universal access, choice of channels
- keeping in tune with what clients want and need

Enhanced accountability
- clients know who to hold accountable for what
- service providers and government officials are accountable for their actions
- compliance targets and service level agreements are met or exceeded
- decisions are made quickly and transparently, with full information

Increased efficiency and effectiveness
- reduce per-unit processing time and costs
- reduce overlap/duplication
- reduce the overall costs of delivering services to/for those in need

Transformed public sector systems

Increased economic growth
- sustained strong economy
- high employment
- world-class IT and communications infrastructure
- strong knowledge base
- attract and retain world-class professionals

Enhanced relevance
- lead in third-party benchmarks
- brand recognition
- meet or exceed take-up rate targets
- be sought after to form alliances/partnerships
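The take-up rate targets under "enhanced relevance" are a ratio: the share of a service's transactions arriving through the electronic channel versus all channels. A minimal, purely hypothetical sketch (the function name and the figures are invented for illustration, not Ontario data):

```python
def take_up_rate(electronic_txns, total_txns):
    """Electronic-channel take-up as a percentage of all transactions."""
    if total_txns == 0:
        return 0.0  # avoid division by zero for services with no activity yet
    return 100.0 * electronic_txns / total_txns

# e.g. 12,000 online renewals out of 80,000 total renewals
rate = take_up_rate(12_000, 80_000)
print(f"{rate:.1f}%")  # 15.0%

target = 10.0  # an illustrative target, not an Ontario figure
print("target met" if rate >= target else "target missed")
```

A measure this simple still depends on counting transactions consistently across channels, which is why the channel-comparison data gathered by surveys like Citizens First matter.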

Selected Resources

Reports and Papers

Accenture. (2001). eGovernment Leadership: Rhetoric vs Reality – Closing the Gap.

Andersen Consulting. (2000). Connecting the Dots?

Australia, Government of. (2000). Government Online Progress Report.

Australia, Government of. (1997). Investing for Growth.

Australian National Audit Office. (1999). Electronic Service Delivery, including
Internet Use, by Commonwealth Government Agencies.

Canada, Government of. (2001). Government On-Line: Serving Canadians in a Digital
World.

Cohen, Steven and William Eimicke. (2001). The Use of the Internet in Government
Service Delivery. Prepared for the PricewaterhouseCoopers Endowment for the Business
of Government.

Council of the European Union. (2001). eEurope 2002: An Information Society for All:
Impact and Priorities. Communication from the Commission to the Council and the
European Parliament.

Council of the European Union. (2000). eEurope 2002: An Information Society for All:
Action Plan. Prepared for the Council and the European Commission for the Feira
European Council.

Deloitte Research. (2000). At the Dawn of e-Government: The Citizen as Customer.

Erin Research Inc. (1998). Citizens First.

Forrester Research (Canada). (2001). Canada’s eGovernment Blueprint.

Heath, William. (2000). Europe’s readiness for e-government. Kable and Government
Computing.

Intergovernmental Advisory Board, Federation of Government Information Processing
Councils. (2000). Citizens Expectations for Electronic Government Services.

KPMG Consulting. (2001). Evaluation and Performance Measurement Framework for
the GOL Infrastructure Program. Discussion Draft prepared for Treasury Board of
Canada Secretariat.

Momentum Research Group. (2000). Benchmarking the eGovernment Revolution: Year
2000 Report on Citizen Demand and Business Demand.

National Electronic Commerce Coordinating Council (NECCC). (2000). E-Government
Strategic Planning: A White Paper. Draft released at NECCC Annual Conference,
December 2000, in Las Vegas.

Netherlands, Government of. (2001). 25% electronic public service delivery in the
Netherlands.

Ontario, Government of. (2000). The Future is Here: A Progress Report on
e-Government in Ontario.

PricewaterhouseCoopers. (2000). Canadian Consumer Technology Study.

R.H.S. Consulting Associates, Robert H. Smith School of Business, University of
Maryland. (1999). Federal Office of Electronic Commerce: Performance Metrics for
FAFSA on the Web.

Schmidt, Faye Nella, with Teresa Strickland. (1998). Client Satisfaction Surveying:
Common Measurements Tool.

Schmidt, Faye Nella, with Teresa Strickland. (1998). Client Satisfaction Surveying: A
Manager’s Guide.

Service New Brunswick. (2000). Service New Brunswick Annual Report 1999-2000.

Stanley & Milford. (2000). maxi Review: Attitudes and Behaviour of maxi Users.
Report prepared for Multimedia Victoria.

Stokey, Edith, and Richard Zeckhauser. (1978). A Primer for Policy Analysis.

Texas, Comptroller of Public Accounts. (2001). Activity-Based Costing in Texas State
Government.

Treasury Board of Canada Secretariat. (2000). e-Government Capacity Check: Criteria.

Treasury Board of Canada Secretariat. (2000). e-Government Capacity Check: Lessons
Learned Report.

United Kingdom Cabinet Office. (2000). e.gov: Electronic Government Services for the
21st Century.

United Kingdom, Central IT Unit. (2000). Modernising Government: Benchmarking
Electronic Service Delivery.

United Kingdom, Central IT Unit. (2000). Information Age Government:
Benchmarking Electronic Service Delivery.

United Kingdom Local Government Association and the Department of the Environment,
Transport, and the Regions. (2001). e-Government Local Targets for Electronic Service
Delivery.

United Kingdom Local Government Association and the Department of the Environment,
Transport, and the Regions. (2001). e-Government Delivering Local Government
Online: Milestones and Resources for the 2005 Target.

U.S. Office of Management and Budget. (2000). Government Paperwork Elimination
Act (GPEA): Planning Guide and Model Report.

Victoria, Government of, with Simsion Bowles and Associates. (1998). Online Service
Delivery: Lessons of Experience.

Victoria, Government of. (1999). Government Online Cabinet Progress Reporting
System: Guidelines.

West, Darrell. (2000). Assessing E-Government: The Internet, Democracy, and Service
Delivery by State and Federal Governments.

Presentations

Australia, National Office for the Information Economy. (2000). TIGERS: Electronic
Framework (Signpost) Consultancy Report.

d’Auray, Michelle. (2000). Government On-Line: Serving Canadians in the Digital Age.
Presentation to North America Day delegates.

Erin Research. (2000). Citizens First 2000. Presentation to the Treasury Board of
Canada Secretariat.

EKOS Research Associates. (2000). Rethinking the Information Highway: Security,
Convergence, and the E-Consumer/E-Citizen. Presentation to the Treasury Board of
Canada Secretariat.

Milrad, Lewis H. (2001). Dealing with Security and Privacy: A Special Obligation for
Governments Online? Presentation to MuneGov 2001.

Wright, John. (2001). Canada’s Municipal eGovernment Conference and Exposition.
Presentation of Ipsos-Reid survey results to MuneGov 2001.

Interviews and Correspondence:

The authors would like to thank the following people for taking the time to offer their
insights and ideas.

Jim Alexander (Treasury Board Secretariat, Canada)
Michelle d’Auray (Treasury Board Secretariat, Canada)
Bernard Beauchemin (Sous-secrétariat à l’inforoute gouvernementale et aux ressources
informationnelles, Quebec)
Bill Drost (Information Technology Management Group, Provincial Treasury, PEI)
Steven Feindel (Service Nova Scotia and Municipal Relations)
Jim Hill (Information Services Branch, Government Services, Yukon)
Harry Hutchings (Treasury Board, Newfoundland)
Paavo Kivisto and Richard Clark (Ontario Integrated Service Delivery)
Marilyn Lustig-McEwen (Justice, Saskatchewan) and Lynn Oliver (ECD, Saskatchewan)
Lori MacMullen (CIMS NB) and Mary Ogilvie (Service NB)
Joan McCalla (Management Board Secretariat, Ontario)
John Mills (City of Edmonton)
Ardath Paxton-Mann and Mike Cowley (Small Business, Tourism and Culture, B.C.)
David Primmer (Office of Information Technology, Manitoba)
Glenn Sargant (Informatics Planning and Systems Development, Nunavut)
Louis Shallal (City of Ottawa)
Robb Stoddard (Office of the Chief Information Officer, Alberta)
Rick Wind (Information Management, Financial Management Board Secretariat, NWT)

Endnotes

1. National Electronic Commerce Coordinating Council, E-Government Strategic Planning: A White Paper. Draft released at NECCC Annual Conference, December 2000.

2. Accenture, eGovernment Leadership: Rhetoric vs Reality – Closing the Gap. April 2001.

3. EKOS Research Associates, Rethinking the Information Highway: Security, Convergence and the E-Consumer/E-Citizen. Presentation to Treasury Board of Canada Secretariat. December 2000.

4. PricewaterhouseCoopers, Consumer Technology Study. November 2000.

5. EKOS Research Associates, Rethinking the Information Highway: Security, Convergence and the E-Consumer/E-Citizen. Presentation to Treasury Board of Canada Secretariat. December 2000.

6. Ipsos-Reid, Canada’s Municipal eGovernment Conference and Exposition. Presentation by John Wright, Senior Vice President, to MuneGov 2001. February 2001.

7. This rating compares to Kiosks (84), Walk-in (71), Single Window (63), Mail (62), Fax (61), and Telephone (59). Erin Research, Citizens First 2000. Presentation to the Treasury Board of Canada. 2000.

8. EKOS Research Associates, Rethinking the Information Highway: Security, Convergence and the E-Consumer/E-Citizen. Presentation to Treasury Board of Canada Secretariat. December 2000.

9. PricewaterhouseCoopers, Consumer Technology Study. November 2000.

10. Michelle d’Auray, Government Online: Serving Canadians in the Digital Age. Presentation to delegates of North America Day. October 2000.

11. British Columbia: Information Science and Technology Agency (www.ista.gov.bc.ca/InfoSmart.htm); Alberta: One Window Initiative (www.gov.ab.ca/cio/index.html); Saskatchewan: Information Technology Office, Economic and Co-operative Development; Manitoba: Office of Information Technology (www.oit.gov.mb.ca/index1.html); Ontario: Office of the Corporate Chief Information Officer (www.cio.gov.on.ca/mbs/cio/cio.nsf/home); Quebec: Autoroute de l’information (www.autoroute.gouv.qc.ca); New Brunswick: Service New Brunswick (www.gov.nb.ca/snb/e/index.htm); Nova Scotia: Service Nova Scotia and Municipal Relations (www.gov.ns.ca/snsmr); Prince Edward Island: Information Technology Management Group, Provincial Treasury (www.gov.pe.ca/pt/itmg-info/index.php3); Newfoundland: Information Technology Management Division, Treasury Board (www.gov.nf.ca/exec/treasury/itm.htm); Nunavut: Informatics Planning and Systems Development, Department of Finance and Administration; Yukon: Information Services Branch, Government Services; Northwest Territories: Information Management, Financial Management Board Secretariat.

12. Edith Stokey and Richard Zeckhauser, A Primer for Policy Analysis. New York: W.W. Norton, 1978.

13. Policy statement from Australian Prime Minister Howard, Investing for Growth. 1997.

14. Government of Canada, Speech from the Throne to open the 36th Parliament. October 12, 1999, www.pco-bcp.gc.ca/throne99/throne1999_e.htm.

15. Hon. Chris Hodgson, Chair of Management Board Secretariat. Speech to open Showcase Ontario 2000, September 5, 2000.

16. Government of Manitoba. Statement taken from http://www.oit.gov.mb.ca/egov.html in March 2001.

17. Momentum Research Group, Benchmarking the eGovernment Revolution: Year 2000 Report on Citizen and Business Demand. 2000. Note: commissioned by NIC.

18. EKOS Research Associates, Rethinking the Information Highway: Security, Convergence and the E-Consumer/E-Citizen. Presentation to Treasury Board of Canada Secretariat. December 2000.

19. Intergovernmental Advisory Board, Citizen Expectations for Electronic Government Services. 2000.

20. Erin Research Inc., Citizens First 2000 – Summary Report. Erin Research Inc. for the Public Sector Service Delivery Council (PSSDC) and the Institute for the Public Administration of Canada (IPAC). 2000.

21. UK Electronic Service Delivery Progress Reports can be found at www.e-envoy.gov.uk/esd.htm.

22. Australia’s Progress Reports can be found at www.govonline.gov.au/projects/strategy/progress.htm.

23. Multimedia Victoria, Government Online Cabinet Progress Reporting System: Guidelines. June 1999.

24. Department of the Environment, Transport, and the Regions, Central Local Liaison Group, E-Government: Local Targets for Electronic Service Delivery. 2000.

25. Department of the Environment, Transport, and the Regions, Central Local Liaison Group, E-Government: Delivering Local Government Online: Milestones and Resources for the 2005 Target. 2001.

26. GartnerGroup, Four Phases of e-government. Research Note, 2000.

27. Deloitte Research, At the Dawn of e-Government: The Citizen as Customer. 2000.

28. KPMG, Leading the Transformation to E-Government: Seven Things You Need to Know. 2000.

29. Janet Caldow, IBM, Seven E-Government Leadership Milestones. 2001.

30. Accenture, Government Portals: Global Point of View. Presentation, 2001.

31. ONCE Corporation, Lac Carling III: Stepping-up to Client-Centric Electronic Service Delivery. 1999. The original seven-step model presented in this report has been updated in 2001 to better represent recent ESD developments.

32. Accenture, eGovernment Leadership: Rhetoric vs Reality – Closing the Gap. 2001.

33. United Kingdom Central IT Unit, Modernising Government: Benchmarking Electronic Service Delivery. March 2000.

34. United Kingdom Central IT Unit, Information Age Government: Benchmarking Electronic Service Delivery. July 2000.

35. Under the 1993 Government Performance and Results Act (GPRA), agencies must submit annual performance plans to Congress along with their budget requests. For details see OMB Circular A-11 (2000).

36. GPEA requires agencies, by October 21, 2003, to provide for (1) the option of electronic maintenance, submission, or disclosure of information, when practicable, as a substitute for paper; and (2) the use and acceptance of electronic signatures, where practicable.

37. See Section 3, Subsection 6 of Implementation of the Government Paperwork Elimination Act, OMB. Available at www.cio.gov/docs/gpea2.htm.

38. For more information about the CMT see Faye Schmidt, Client Satisfaction Surveying: Common Measurements Tool. Ottawa: Canadian Centre for Management Development, December 1998.

39. This rating compares to Kiosks (84), Walk-in (71), Single Window (63), Mail (62), Fax (61), and Telephone (59). Erin Research, Citizens First 2000. Presentation to the Treasury Board of Canada. 2000.

40. For example, see Cooper, Mark N., Disconnected, Disadvantaged, and Disenfranchised: Explorations in the Digital Divide. Report of the Consumer Federation of America. October 2000; Crandall, Robert W., “Bridging the Divide – Naturally.” The Brookings Review. 19:1. Winter 2001; and Dun Rappaport, Catherine Crawford, Bridging the Digital Divide: Strategies and Solutions for the 21st Century. Paper prepared in completion of a Master of Public Policy, JFK School of Government, 2001.

41. Kablenet.com, “Tories dismiss e-government targets.” Kablenet.com, 21 March 2001.

42. Local Government Association, Opportunity to Prosper, e-government: a revolution or business as normal. 2000.

43. Australian National Audit Office, Electronic Service Delivery, including Internet Use, by Commonwealth Agencies. 1999.

44. At the Governance in the 21st Century conference hosted by the National Academy of Public Administration in December 2000, participants explored how e-government might re-engage citizens. For more see www.govtech.net/publications/21stCentury/civic.phtml.
