A framework of the factors affecting the evolution of performance measurement systems
Mike Kennerley and Andy Neely
Centre for Business Performance, Cranfield School of Management, Cranfield, UK
Keywords Performance measurement, Development, Organizational change
Abstract The effectiveness of performance measurement is an issue of growing importance to
industrialists and academics alike. Many organisations are investing considerable resources in
implementing measures that reflect all dimensions of their performance. Consideration is
being given to what should be measured today, but little attention is being paid to the question of what
should be measured tomorrow. Measurement systems should be dynamic. They have to be modified
as circumstances change. Yet few organisations appear to have systematic processes in place for
managing the evolution of their measurement systems and few researchers appear to have explored
the question, what shapes the evolution of an organisation's measurement system? The research
reported in this paper seeks to address this gap in the literature by presenting data that describes the
forces that shape the evolution of the measurement systems used by different organisations.
Introduction
Although it has long been recognised that performance measurement has an
important role to play in the efficient and effective management of
organisations, it remains a critical and much debated issue. Significant
management time is being devoted to the questions – what and how should we
measure – while substantial research effort, by academics from a wide variety
of management disciplines, is being expended as we seek to enhance our
understanding of the topic and related issues (Neely, 1999).
Survey data suggest that between 40 and 60 per cent of companies
significantly changed their measurement systems between 1995 and 2000
(Frigo and Krumwiede, 1999). Most of these initiatives, however, appear to be
static. Although many organisations have undertaken projects to design and
implement better performance measures, little consideration appears to have
been given to the way in which measures evolve following their
implementation (Waggoner et al., 1999). It is important that performance
measurement systems be dynamic, so that performance measures remain
relevant and continue to reflect the issues of importance to the business (Lynch
and Cross, 1991).
The authors are grateful to the Engineering and Physical Sciences Research Council (EPSRC) for the award of research grant number GR/K88637, to carry out the research reported in this paper.
In order to ensure that this relevance is maintained, organisations need a process in place to ensure that measures and measurement systems are reviewed and modified as the organisation's circumstances change (Dixon et al., 1990). Yet few organisations appear to have systematic processes in place for managing the evolution of their measurement systems and few researchers appear to have explored the question – what shapes the evolution of an organisation's measurement system?
The research reported in this paper seeks to address this gap in the literature
by presenting a framework that describes the forces that shape the evolution of
the measurement systems used by different organisations.
Following this introduction the paper consists of a further six sections. The
next section discusses the literature regarding the evolution of performance
measurement systems, providing the context for the research. Descriptions of
the research methodology, the case study findings and the resultant framework
of factors affecting the evolution of performance measures are then presented.
The subsequent discussion is followed by conclusions that are drawn in the
final section.
Methodology
A multiple case study approach was used to investigate the way in which
performance measures actually evolve within organisations. The research
involved semi-structured interviews with a total of 25 managers from a range of management functions, from seven different organisations. The companies involved in the research were from the industries shown in Table I.
The interview structure was designed to investigate the key themes
identified from the literature reviewed. As such the case studies sought to
answer the following questions:
. What factors encourage the introduction of new measures, modification of existing measures and deletion of obsolete measures?
. What factors inhibit the introduction of new measures, modification of
existing measures and deletion of obsolete measures?
The companies were selected on the basis of their considerable experience in
the implementation and use of performance measures. Companies from
different industry sectors and with a wide variety of competitive and
organisational characteristics were deliberately chosen to introduce diversity
into the sample. This enabled the identification of factors affecting evolution of
measurement in a variety of different circumstances. Similarly, interviewing
managers from a number of different departments ensured that consideration
was given to the diversity of factors affecting evolution in different functional
circumstances. As a result the findings of the case studies provide a broad
understanding of the factors affecting the evolution of an organisation's
performance measures.
Company 2
Although performance measurement systems had been implemented in
company 2 for a number of years, failure to actually use new performance
measures to manage the business was seen as a major barrier to their deployment
and hence evolution. Although senior management had backed the
implementation of a balanced set of measures, the continued emphasis on
financial performance measures prevented use of the balanced measurement
system being embedded throughout the organisation. As in company 1,
company 2 used experiences of ineffective measurement practices in the past to
design a measurement system with the attributes that they considered
necessary to maintain a relevant set of performance measures in the future. To
ensure that their measures remained relevant, managers in company 2 explicitly
included a review of measures in the periodic review of business processes.
The head of business process development highlighted the importance of
having the appropriate systems to facilitate measurement activity and the
evolution of measurement systems. "New systems have been designed from scratch to be flexible enabling measures to be changed easily. The system being Web-based enables worldwide access to all information allowing information sharing. This facilitates benchmarking and the transfer of best practice. The global availability of the same reporting systems enables commonality of approach".
Furthermore he highlighted that: "reporting needs to be efficient to reduce the resources required to administer measurement, allowing resources to be dedicated to acting on the results." The system was designed to enable efficient and effective data collection and reporting, minimising the effort of measurement to ensure acceptance throughout the organisation.
According to the consultancy sales manager: "Benchmarking of performance against competitors (including those in new markets) has given a common understanding of the need to improve and where improvement should be focused. This has reduced any resistance to the change of performance measures as the need can be demonstrated." This enabled the organisation to
overcome some of the people issues that had limited the development of
performance measurement activities in the past.
Company 3
The evolution of measures was not effectively managed in company 3. "The culture at [company 3] is a barrier to the implementation of a consistent approach to measurement across the whole company." The ad hoc approach to
performance measurement that was adopted led to inconsistency in approaches
between different business units and geographical locations. The inconsistency
in measurement practices limited the comparability of performance data,
detrimentally affecting the credibility, and hence acceptance, of performance
measures. Despite attempts to change measures to reflect changing business
circumstances, managers were reluctant to use non-financial data to manage
the business. "The overriding factor affecting the acceptance of performance measurement is that it become a business issue so that it occupies the minds of managers and measures are used to manage the business" (Manager – Stationery Office Supplier). This reflects the need for managers to actively use
measures to manage the business. It was found that this would increase their
desire to ensure measures changed to remain appropriate, as their performance
would be assessed on them.
Inflexible IT systems were also found to be a major barrier to evolution. The
European customer care manager specifically noted that: "it is not possible to change the structure and content of the performance reports produced by the mainframe IT system."
Company 4
The use of performance measurement to manage the business was accepted in
company 4. However, the tendency to report too much data and produce too
many measurement reports acted as a significant barrier to evolution. The
service recovery manager stated: ``I spend too much time preparing reports for
my manager to take to board meetings. It prevents me from reviewing and
updating measures so that they remain current. Most of the reports are never
referred to, they are just a security blanket in case he is ever asked to produce Performance
the data.'' In the past key individuals had stood in the way of the use of some measurement
measures. ``This resistance was due to reluctance to provide a better systems
understanding of actual performance for which they were responsible. Removal
of the individuals has been the most successful way of eliminating the
problem'' (Service Recovery Manager).
The availability of people with the appropriate skills to analyse and redefine 1231
measures was also identified as an issue. This was particularly the case when
individuals responsible for measurement left departments or the company altogether. It was recognised that measurement practices could be developed
further by planning skills development and ensuring that the appropriate skills
were maintained in the areas they were required.
Company 4 also provided an example of the effect of the design of measures
on their use. While discussing the graphical representation of one measure, the
field service manager explained: "nobody uses this measure as they don't understand it. I would explain it to you but I don't understand it either". As a
result the measure was not seen as relevant and was not used.
Company 5
Extensive performance measurement implementation had been undertaken in
company 5. However, as in company 2, although senior management had
initiated the implementation of new measures, they failed to use the resultant
performance measurement data, in favour of traditional financial performance
measures. "The previous CEO paid lip service to the scorecard but only really focussed on the financials, hence this is where all attention was focused" (Head
of Strategic Planning). As a result the new measures were not considered to be
important at other levels of the organisation and they were not effectively used.
Measurement reverted to financial measurement and the process of evolution
was stifled. This clearly demonstrated the need for top level support for
measurement and the need for a change in mindset of management so that
measures are used to manage the business.
Company 6
Company 6 provided the best example of managing the evolution of
measurement systems. The primary factor facilitating evolution was the
availability of resources dedicated to measurement and the management of
performance measures. "The availability of a dedicated employee who is responsible for the review of measures enables gaps to be identified and the need to change existing measures as well as identifying performance measures" (Sales Order Manager).
The dedicated systems analyst ensured that measures were reviewed and
that action was taken to improve performance and ensure that measures were
changed to remain relevant. In addition, "having split responsibility and budget from operations and the IT department enables me to develop systems that would not be justified under either department individually". This ensured that systems were flexible enough to change as required. The availability of a manager dedicated to measurement, who had credibility within all areas of the business, stimulated measurement activity and helped overcome barriers to the acceptance and evolution of measurement, such as inflexible payroll structures and high staff turnover.
Company 6 highlighted the need to create the appropriate environment in
which the use of performance measures is most effective. Weekly meetings to review performance were open and honest discussions of performance, including new issues requiring measurement and identifying new areas of performance on which to focus improvement attention. "It is important to recruit and retain employees who are open to new ideas and are willing and able to implement new performance measures." "Use of neutral measures, that focus on improvement and do not apportion blame, help acceptance and adoption of measures."
Company 7
The lack of a formal review process was considered to be the main reason that
the evolution of performance measures was not managed in company 7 ("There is no process to review measures and identify whether or not they are appropriate. That is a major factor affecting whether measures change in line with organisational circumstances"). Within company 7 the leadership of the
managing director was clearly the main driver of measurement activity. "The ability and energy of the managing director drive measures and measurement. He prompts other board members to review measures and ensure that they are relevant and appropriate to the business and reflect what is important."
The availability of management time to reflect on measures was also
considered to be a major constraint. The group technical and quality director
identified that: "In previous years we have had too many measures. We need to focus on fewer important objectives". He also noted that the frequency with which measures are reviewed is dependent on the availability of management time. Similarly, the availability of management skills is also a key determinant of the ability to review and modify measures. This will affect when inappropriate measures are identified and the ability to change measures to make them appropriate. He identified the need for systems that could
accommodate a hierarchy of measures, reporting the few important measures,
but enabling analysis of the many underlying measures of the drivers of
performance.
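As an illustration only, the following Python sketch models the kind of measure hierarchy described here, in which a headline report shows the few important measures while analysis can drill down into the many underlying drivers; the measure names, values and methods are hypothetical and are not drawn from company 7.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Measure:
    # A node in a hierarchy of measures: a few headline measures at the top,
    # many underlying driver measures beneath them (names/values hypothetical).
    name: str
    value: Optional[float] = None
    drivers: list["Measure"] = field(default_factory=list)

    def headline(self) -> str:
        # Report only the top-level figure, as a summary report would.
        return f"{self.name}: {self.value}"

    def drill_down(self, depth: int = 1, level: int = 0) -> None:
        # Expand the underlying driver measures when further analysis is needed.
        print("  " * level + f"{self.name}: {self.value}")
        if level < depth:
            for driver in self.drivers:
                driver.drill_down(depth, level + 1)

# Hypothetical usage: one headline measure with two underlying drivers.
delivery = Measure("On-time delivery (%)", 92.0, drivers=[
    Measure("Order processing lead time (days)", 2.4),
    Measure("Stock availability (%)", 96.5),
])
print(delivery.headline())    # the few important measures
delivery.drill_down(depth=1)  # the many underlying driver measures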
Table II summarises the key factors that facilitate and inhibit the evolution
of performance measurement systems in each of the case study companies.
Evidence from the case study companies demonstrates the need for
companies to change their performance measures as the organisation's
circumstances change. The group technical and quality director in company 7
pointed out: "If people don't think measures are relevant they won't use them, so they won't evolve". This clearly demonstrates that in order for an
organisation to have performance measures that evolve over time, they must
Table II. Key factors that facilitate and inhibit the evolution of performance measurement systems in the case study companies
Company 1. Facilitators of evolution: senior management driving measurement activities; development of in-house IT systems; use of accepted communication media to communicate, generate feedback and involve all employees; integration of measurement with strategy development and review; consistent approach to measurement. Barriers to evolution: off-the-shelf systems insufficiently flexible; availability of skills to effectively collect and analyse data.
Company 2. Facilitators of evolution: new Web-based system developed; in-house systems provide required flexibility; measurement included in business process review; alignment of rewards to measures; need for measures to evolve considered important; common understanding of objectives and the need to improve. Barriers to evolution: senior management inertia; measures not used to manage the business; time-consuming and costly data collection.
Company 3. Facilitators of evolution: enthusiastic champion of measurement; contact with external research bodies to keep up to date with developments in measurement practices; make measurement a business issue – manage with measures. Barriers to evolution: management inertia; inflexible IT/finance systems; incompatibility of measures/inconsistent approach; culture – ad hoc measurement, no integrated approach or PM function.
Company 4. Facilitators of evolution: enthusiastic champion to kick off "measurement revolution"; the need for succession planning identified. Barriers to evolution: individual inertia/resistance to measurement; time wasted producing reports; ability to quantify performance; measures lacking credibility.
Company 5. Facilitators of evolution: top level management support is critical; user involvement in designing measures; alignment of rewards. Barriers to evolution: measurement not used to manage the business (need new mind set); accounting systems focus; inconsistent approach to measurement (due to changes in ownership and management); lack of flexible systems to collect and analyse data.
Table III. Recategorised summary of case study findings
Process – Facilitators of evolution: integration of measurement with strategy development and review (company 1); integration of measurement with business process review (2); PM "function" the focal point of measurement activity (6); forum to discuss appropriateness of measures (6); implementation of common definitions/metrics (3, 7); consistent approach to measurement across all areas of the business (1); away day to measures (6); involvement of external bodies (3); user involvement in measurement (5). Barriers to evolution: lack of proactive review process (7); inconsistent approach to measurement – over time (5), between locations/business units (3, 6, 7), no integrated measurement function (3); insufficient time to review measures – lack of management time (4, 7), too much data reported (4, 7); the need to trend measures limits ability to change (7); lack of data analysis (5, 6).
Culture – Facilitators of evolution: the need for evolution considered to be important (2, 6, 7); communication – use of accepted medium (1), feedback all actions (1), engage all employees (1); measurement integrity is encouraged – open and honest discussion of performance (6), no blame culture (6), discouragement of "gaming behaviour" (6); ongoing senior management support/champion for measurement (all companies) – continued focus on measurement (1, 6), identify and remove barriers to use/change of measures (1, 6); establish common understanding of objectives (2); integration/alignment of reward systems (2); measurement not owned by finance (6); alignment of measures and rewards (2, 5, 6). Barriers to evolution: senior management inertia (2, 3); individual inertia/resistance to measurement (4); ad hoc approach to measurement (3); lack of alignment of actions with measures (7); inappropriate use of measures/measures not used to manage the business (2, 5); rigid remuneration and union systems (6).
The case studies provided examples of internal triggers which prompted review of the relevance of current measures given changes in circumstances. Other such triggers were also identified that prompted the realisation that measures were inappropriately designed for their purpose, that use of measures prompted inappropriate behaviour or that circumstances, such as competitive requirements, changed. Once the trigger has been received then the first stage in the evolution of the measurement system is to reflect on the performance measurement system and identify whether it remains appropriate given changing organisational circumstances. This stage of the evolutionary process is known as reflection (reflect) and the research identified several barriers that prevent it from occurring in organisations, most crucially those associated with process, people, infrastructure and culture:
. Absence of an effective process. Company 7 highlighted the lack of an
effective process as the main barrier to reflection, while in both
companies 4 and 7 there was insufficient management time set aside to
reflect on performance measures.
. Lack of the necessary skills and human resources. Companies 1, 4, 6 and 7
each identified a lack of appropriate skills to analyse data and identify
inappropriate measures. Company 6 specifically highlighted that high
staff turnover caused problems in retaining people with the skills
necessary to identify which measures are inappropriate. Company 4 also
highlighted that the lack of succession planning was a barrier to
reflection.
. Inflexible systems. These were identified as barriers to reflection. In
particular, company 6 found that ERP system implementation led to the loss of analysis functionality required to investigate performance trends and the causes of performance variances.
. Inappropriate culture. Companies 4 and 6 both highlighted individuals
who were resistant to reflection on and change to measures as they did
not want measures to more effectively reflect specific dimensions of
performance for which they were responsible. Lack of alignment of
measures with rewards was also found to be a barrier to reflection in
company 7. Alignment of measures with rewards ensures that those
being measured have an incentive to reflect on measures and prompt
their evolution.
During the reflection stage, each of the constituent parts of the performance
measurement system should be critically appraised and reviewed to ensure
that they remain appropriate to the requirements of the organisation. Many
tools and techniques have been developed to help organisations design
performance measures and measurement systems. Several of these tools can be
applied to reflect on the content of an organisation's current performance
measurement system. For example, the performance measurement record sheet
(Neely et al., 1996) lists the characteristics of a performance measure, any of
which might be affected by changes in the organisation's circumstances. Many of the performance measurement frameworks that have been proposed (Kennerley and Neely, 2000) might also support reflection on the relevance of the set of measures used by the organisation. Furthermore, tools such as the Performance Measurement Questionnaire (Dixon et al., 1990) are specifically designed to help an organisation to identify the appropriateness of their measurement system.
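To make the idea of appraising each measure's characteristics concrete, the following sketch models a record-sheet-style entry together with a simple relevance check. It is a minimal illustration: the field names follow commonly cited versions of the performance measurement record sheet rather than being quoted from Neely et al. (1996), and the review logic is an assumed example.

from dataclasses import dataclass

@dataclass
class MeasureRecord:
    # Characteristics of a single performance measure, record-sheet style.
    # Field names are indicative only, not quoted from Neely et al. (1996).
    title: str
    purpose: str
    relates_to: str        # the business objective the measure supports
    target: str
    formula: str
    frequency: str
    who_measures: str
    source_of_data: str
    who_acts_on_data: str

    def review(self, current_objectives: set) -> list:
        # Flag characteristics that may no longer suit changed circumstances.
        issues = []
        if self.relates_to not in current_objectives:
            issues.append(f"'{self.title}' no longer maps to a current objective")
        if not self.who_acts_on_data:
            issues.append(f"'{self.title}' has nobody acting on the data")
        return issues

# Hypothetical usage during a periodic review of the measurement system.
record = MeasureRecord(
    title="On-time delivery", purpose="Track delivery reliability",
    relates_to="Improve customer service", target="95 per cent",
    formula="orders delivered on time / total orders", frequency="Monthly",
    who_measures="Logistics analyst", source_of_data="Order processing system",
    who_acts_on_data="Distribution manager",
)
print(record.review(current_objectives={"Reduce cost"}))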
Reflecting on the measurement system will enable required changes to be
identified and will in turn trigger modifications (modify). In addition external
triggers, such as changes in legislative or regulatory requirements, and/or
changes in ownership can lead to the imposition of new performance measures,
which will also prompt the modification stage. In turn the modification stage
will result in changes to the constituent elements of the measurement system.
Once these changes have been enacted then the modified measurement system
can be said to have been deployed (deploy) and hence the cycle of evolution can
start again. This entire evolutionary cycle is illustrated in Figure 2, which
contains a framework of the factors affecting the evolution of measurement
systems.
The key to this discussion is to recognise that the case study data collected
demonstrates that to manage effectively the evolution of performance
measures, an organisation must consider several inter-related issues:
(1) The active use of the performance measurement system is a pre-requisite to any evolution.
(2) The performance measurement system itself consists of three inter-related elements (individual measures, the set of measures and the enabling infrastructure). Each of these elements must be considered during the evolution of the performance measurement system.
(3) There are four stages of evolution – use, reflect, modify and deploy. These form a continuous cycle.
(4) Barriers exist that will prevent the evolutionary cycle from operating. These barriers can be overcome if the evolutionary cycle is underpinned by enabling factors – broadly categorised under the headings: process, people, infrastructure and culture (a minimal illustrative sketch of the cycle and its enablers follows this list). Specifically, a well designed measurement system will be accompanied by an explicitly designed evolutionary cycle with clear triggers and:
. process – existence of a process for reviewing, modifying and deploying measures;
. people – the availability of the required skills to use, reflect on, modify and deploy measures;
. infrastructure – the availability of flexible systems that enable the collection, analysis and reporting of appropriate data;
. culture – the existence of a measurement culture within the organisation, ensuring that the value of measurement, and the importance of maintaining relevant and appropriate measures, are appreciated.
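The sketch below is an illustration only, not an implementation from the paper: the stage names and the four capability headings are taken from the framework above, while the blocking logic, function names and example values are assumptions added for clarity.

from enum import Enum

class Stage(Enum):
    USE = "use"
    REFLECT = "reflect"
    MODIFY = "modify"
    DEPLOY = "deploy"

# The enabling capability headings named in the framework.
CAPABILITIES = ("process", "people", "infrastructure", "culture")

# Deployment restarts the cycle.
NEXT_STAGE = {
    Stage.USE: Stage.REFLECT,
    Stage.REFLECT: Stage.MODIFY,
    Stage.MODIFY: Stage.DEPLOY,
    Stage.DEPLOY: Stage.USE,
}

def advance(stage: Stage, capabilities_in_place: set) -> Stage:
    # Move to the next stage only if the enabling capabilities are in place;
    # otherwise the evolutionary cycle stalls at the current stage.
    missing = [c for c in CAPABILITIES if c not in capabilities_in_place]
    if missing:
        print(f"Cycle blocked at '{stage.value}': missing {', '.join(missing)}")
        return stage
    return NEXT_STAGE[stage]

# Hypothetical walk around one full cycle with all four capabilities present.
stage = Stage.USE
for _ in range(4):
    stage = advance(stage, {"process", "people", "infrastructure", "culture"})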
Discussion
The literature and case study data presented clearly show first, the importance
of managing measurement systems so that they change over time and second,
the complex range of interrelated factors that affect the evolution of
performance measurement systems. The literature highlights many of the
issues affecting the management of change within organisations. This paper
discusses many of these issues in the context of case study data relating to
performance measurement system evolution.
A considerable amount has been written about the design and
implementation of measurement systems and a number of writers have identified
the need to reflect on measures to ensure that they remain relevant as the
organisation changes. The research findings echo the themes identified in the
literature concerning the external and internal drivers of change affecting
organisations and the need for organisations to have effective processes in place
to identify these changes and when they necessitate changes to measurement
systems. However, there is little discussion in the literature of what to do once
that reflection has taken place. The data collected clearly show that the process
of managing the evolution of measurement systems consists of a number of
stages that have to date received little attention. In addition to reflection,
consideration should be given to how measures are to be modified and how
modified measures are to be deployed without embarking on a wholesale
performance measurement system redesign project.
It is also clear that for measurement systems to evolve effectively there are key capabilities that an organisation must have in place (i.e. effective processes; appropriate skills and human resources; appropriate culture; and flexible systems). The research demonstrates how lessons from different strands of literature such as the need for the appropriate resources (Greiner, 1996) and capabilities (Gabris, 1986); the appropriate culture (Tichy, 1983); willingness to change (Kotter, 1996); and relevant processes (Bourne et al., 2000; Bititci et al., 2000) can be drawn together into a structured framework.
The data indicates that organisations should consider these capabilities at
each stage of the evolutionary cycle, as they are fundamental to effective
evolution. However, little consideration is given to these capabilities in the
literature concerning the design and implementation of measurement systems.
It is the development and maintenance of these capabilities within an
organisation that will determine whether its measurement systems evolve
effectively. As such, reviewing the availability of these capabilities is an
important stage in the management of measurement systems over time. This
reflects the need to review and update measurement systems at three different
levels, i.e. the individual measure; the set of measures; and the supporting
infrastructure, and shows that these capabilities are an integral part of that
supporting infrastructure.
The framework presented provides a structured view of the factors affecting
the evolution of performance measures and measurement systems. It
conceptualises a very complex combination of factors affecting the evolution of
measurement systems into a manageable form.
Conclusions
Although the issue of development of effective performance measures has
received considerable attention from both academic and practitioner
communities, neither has satisfactorily addressed the issue of how performance
measures should evolve over time in order to remain relevant.
The research reported in this paper provides an understanding of how
measurement systems can be managed so that a dynamic and relevant set of
performance measures can be maintained, reflecting an organisation's
changing requirements. It provides an understanding of the factors, both
internal and external to the organisation, that facilitate and inhibit the
introduction of new measures, the modification of existing measures and
deletion of obsolete measures. These factors are presented in a framework that
illustrates the process, people, infrastructure and culture capabilities that an
organisation must demonstrate in order to manage the evolution of measures.
The paper discusses many issues of relevance to the growing literature in the
field of performance measurement while providing organisations with a
practical tool to help them establish an effective performance measurement
system. Ensuring that evolution of measurement systems is effectively
managed over time is important if another measurement crisis and revolution
is to be avoided.
References
Bititci, U.S., Turner, T. and Begemann, C. (2000), "Dynamics of performance measurement systems", International Journal of Operations & Production Management, Vol. 20 No. 6, pp. 692-704.
Bourne, M., Neely, A., Mills, J. and Platts, K. (1999), "Performance measurement system implementation: an investigation of failures", Proceedings of the 6th International Conference of the European Operations Management Association, Venice, 7-8 June, pp. 749-56.
Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000), "Designing, implementing and updating performance measurement systems", International Journal of Operations & Production Management, Vol. 20 No. 7, pp. 754-71.
Bruns, W. (1998), "Profit as a performance measure: powerful concept, insufficient measure", Performance Measurement – Theory and Practice: The First International Conference on Performance Measurement, Cambridge, 14-17 July.
Dixon, J.R., Nanni, A.J. and Vollmann, T.E. (1990), The New Performance Challenge – Measuring Operations for World-Class Competition, Dow Jones-Irwin, Homewood, IL.
Eccles, R.G. (1991), "The performance measurement manifesto", Harvard Business Review, January-February, pp. 131-7.
Fitzgerald, L., Johnston, R., Brignall, T.J., Silvestro, R. and Voss, C. (1991), Performance Measurement in Service Businesses, The Chartered Institute of Management Accountants, London.
Frigo, M.L. and Krumwiede, K.R. (1999), "Balanced scorecards: a rising trend in strategic performance measurement", Journal of Strategic Performance Measurement, Vol. 3 No. 1, pp. 42-4.
Gabris, G.T. (1986), "Recognizing management techniques dysfunctions: how management tools often create more problems than they solve", in Halachmi, A. and Holzer, M. (Eds), Competent Government: Theory and Practice, Chatelaine Press, Burke, VA, pp. 3-19.
Ghalayini, A.M. and Noble, J.S. (1996), "The changing basis of performance measurement", International Journal of Operations & Production Management, Vol. 16 No. 8, pp. 63-80.
Globerson, S. (1985), "Issues in developing a performance criteria system for an organisation", International Journal of Production Research, Vol. 23 No. 4, pp. 639-46.
Greiner, J. (1996), "Positioning performance measurement for the twenty-first century", in Halachmi, A. and Bouckaert, G. (Eds), Organizational Performance and Measurement in the Public Sector, Quorum Books, London, pp. 11-50.
Johnson, H.T. (1983), "The search for gain in markets and firms: a review of the historical emergence of management accounting systems", Accounting, Organizations and Society, Vol. 2 No. 3, pp. 139-46.
Johnson, H.T. and Kaplan, R.S. (1987), Relevance Lost – The Rise and Fall of Management Accounting, Harvard Business School Press, Boston, MA.
Kaplan, R.S. (1984), "The evolution of management accounting", The Accounting Review, Vol. 59 No. 3, pp. 390-418.
Kaplan, R.S. and Norton, D.P. (1992), "The balanced scorecard – measures that drive performance", Harvard Business Review, January/February, pp. 71-9.
Kaplan, R.S. and Norton, D.P. (1993), "Putting the balanced scorecard to work", Harvard Business Review, September/October, pp. 134-47.
Keegan, D.P., Eiler, R.G. and Jones, C.R. (1989), "Are your performance measures obsolete?", Management Accounting (US), Vol. 70 No. 12, pp. 45-50.
Kennerley, M.P. and Neely, A.D. (2000), "Performance measurement frameworks – a review", Proceedings of the 2nd International Conference on Performance Measurement, Cambridge, pp. 291-8.
Kotter, J.P. (1996), Leading Change, Harvard Business School Press, Boston, MA.
Lynch, R.L. and Cross, K.F. (1991), Measure Up – The Essential Guide to Measuring Business Performance, Mandarin, London.
Maskell, B. (1989), "Performance measures for world class manufacturing", Management Accounting (UK), May, pp. 32-3.
Meyer, M.W. and Gupta, V. (1994), "The performance paradox", in Staw, B.M. and Cummings, L.L. (Eds), Research in Organizational Behaviour, Vol. 16, JAI Press, Greenwich, CT, pp. 309-69.
Neely, A. (1998), Measuring Business Performance – Why, What and How, Economist Books, London.
Neely, A.D. (1999), "The performance measurement revolution: why now and where next", International Journal of Operations & Production Management, Vol. 19 No. 2, pp. 205-28.
Neely, A.D., Kennerley, M.P. and Adams, C.A. (2000), The New Measurement Crisis: The Performance Prism as a Solution, Cranfield School of Management, Cranfield.
Neely, A.D., Mills, J.F., Gregory, M.J., Richards, A.H., Platts, K.W. and Bourne, M.C.S. (1996), Getting the Measure of Your Business, Findlay Publications, Horton Kirby.
Pettigrew, A. and Whipp, R. (1991), Managing Change for Competitive Success, Blackwell, Oxford.
Scott, W.R. (1995), Institutions and Organizations: Theory and Research, Sage Publications, London.
Senge, P.M. (1992), The Fifth Discipline: The Art and Practice of the Learning Organization, Century Business Press, London.
Tichy, N.M. (1983), Managing Strategic Change: Technical, Political, and Cultural Dynamics, John Wiley & Sons, New York, NY.
Townley, B. and Cooper, D. (1998), "Performance measures: rationalization and resistance", Proceedings of Performance Measurement – Theory and Practice: The First International Conference on Performance Measurement, Cambridge, 14-17 July, pp. 238-46.
Waggoner, D.B., Neely, A.D. and Kennerley, M.P. (1999), "The forces that shape organisational performance measurement systems: an interdisciplinary review", International Journal of Production Economics, Vol. 60-61, pp. 53-60.
Wisner, J.D. and Fawcett, S.E. (1991), "Linking firm strategy to operating decisions through performance measurement", Production and Inventory Management Journal, Third Quarter, pp. 5-11.