3 Peer-Reviewed Articles


The current issue and full text archive of this journal is available on Emerald Insight at:

www.emeraldinsight.com/2056-4961.htm

Running the risk IT – more perception and less probabilities in uncertain systems

Adrian Munteanu
Faculty of Economics and Business Administration, Alexandru Ioan Cuza University, Iasi, Romania

Information & Computer Security, Vol. 25 No. 3, 2017, pp. 345-354. © Emerald Publishing Limited. DOI 10.1108/ICS-07-2016-0055

Received 19 July 2016
Revised 26 November 2016
Accepted 4 January 2017

Abstract
Purpose – This study aims to argue that in the case of quantitative security risk assessment, individuals do
not estimate probabilities as a likelihood measure of event occurrence.
Design/methodology/approach – The study uses the most commonly used quantitative assessment
approach, the annualized loss expectancy (ALE), to support the three research hypotheses.
Findings – The estimated probabilities used in quantitative models are subjective.
Research limitations/implications – The ALE model used in security risk assessment, although it is presented in the literature as quantitative, is, in fact, qualitative, being influenced by bias.
Practical implications – The study provides a factual basis showing that quantitative assessment is neither realistic nor practical in the real world.
Originality/value – A model that cannot be tested experimentally is not a scientific model. In fact, the
probability used in ISRM is an empirical probability or estimator of a probability because it estimates
probabilities from experience and observation.
Keywords Risk analysis, Probability, Risk, Risk management
Paper type Case study

1. Introduction
The study of information systems security risk management (ISRM) has been a constant
concern for both academics and practitioners for over 20 years. They have developed
standards, best practices, techniques and models classified as qualitative or quantitative.
The aim of this research is to argue that in the case of quantitative security risk
assessment, individuals do not estimate probabilities but rather “something” else. We argue
that the estimated probabilities used in quantitative models presented in research articles,
standards or best practices are subjective and not probabilistic. In most quantitative risk
assessment models, we observe an abbreviated process that ignores significant theoretical
and practical details. Therefore, the assessment of results may lead to incorrect conclusions.
The study starts from the conclusions made by Slovic (2011), who stated that the definition of risk is elusive and that risk does not exist independently of the human mind and culture. He argued that risk
was invented by people who wanted to overcome dangers in life. He denied the existence of
real or objective risk.
This research argues that quantitative assessment of risk in information systems cannot be objectively performed in an uncertain world in the absence of more information than is currently available to us.
The paper is organized as follows: in the first section, we provide the research hypotheses. The next section contains a literature review on quantitative analysis in security risk assessment. The third section presents the most commonly used quantitative assessment approach – annualized loss expectancy (ALE) – to support the hypotheses. Finally, we discuss the limitations of quantitative risk assessment.

2. Research hypotheses
The following hypotheses were formulated in this research:
H1. Even if they are defined as “quantitative”, most models are, in fact, “qualitative”.
H2. In practice, it is almost impossible to use objective probabilities for security risk assessment.
H3. Within uncertain systems, such as information systems, risk assessment is more about perception.
Each of the above-mentioned hypotheses is derived from the previous one. To prove each of
the hypotheses, we refer to the most commonly used quantitative assessment approach:
ALE. This approach is used to assess the probability of damage from different types of
attacks and to support the cost-benefit analysis. We applied this model to a real asset: 1 GB
of documents stored on a laptop with Windows XP and Office 2010. For reasons of
simplicity, our demonstration will consider only a single threat, “corruption of stored
documents”, and only a single vulnerability, “zero-day exploit”. The study actually embeds
the concepts into the ALE model.

3. Literature review on quantitativism in security risk assessment


What research articles are viewed as representative for ISRM?
There is a great amount of literature published in academic and professional journals
that must be read to conduct a reasonable and objective review of this topic. Representative
journals in the field, based on impact factor, are shown in Table I.
An inventory of all articles published between 2004 and 2015, which address the topic of
quantitative risk in information system security, is shown in Table II. The table excludes
articles published in conference proceedings and those dealing with specific information
systems, such as SCADA, cloud computing and social networks. Whole articles were
reviewed – rather than key words and abstracts.

Table I. Relevance of journals based on their impact factor

No.  Publisher   Journal (frequency)                                                Impact factor
1    IEEE        IEEE Transactions on Information Forensics and Security (monthly)  2.408
2    Elsevier    Computers and Security (bimonthly)                                 1.031
3    Springer    International Journal of Information Security (bimonthly)          0.963
4    IET         IET Information Security (bimonthly)                               0.753
5    IEEE        IEEE Security and Privacy (bimonthly)                              0.731
6    Wiley       Security and Communication Networks (monthly)                      0.720
7    ACM         ACM Transactions on Information and System Security (quarterly)    0.690
Table II. Inventory of articles published between 2004 and 2015

IEEE Security and Privacy
• May-June 2010, Hole, K.J. and Netland, L.-H., “Toward Risk Assessment of Large-Impact and Rare Events”
• Jan.-Feb. 2011, Johnson, M.E. and Pfleeger, S.L., “Addressing Information Risk in Turbulent Times”

Computers and Security (Elsevier)
• Computers and Security 24 (2005), Mariana Gerber and Rossouw von Solms, “Management of risk in the information age”
• Computers and Security 24 (2005), Theodosios Tsiakis and George Stephanides, “The economic approach of information security”
• Computers and Security 24 (2005), Bilge Karabacak and Ibrahim Sogukpinar, “ISRAM: information security risk analysis method”
• Computers and Security 31 (2012), Hyeun-Suk Rhee, Young U. Ryu and Cheong-Tag Kim, “Unrealistic optimism on information security management”
• Computers and Security 44 (2014), Jeb Webb, Atif Ahmad, Sean B. Maynard and Graeme Shanks, “A situation awareness model for information security risk management”
• Computers and Security 55 (2015), Kresimir Solic, Hrvoje Ocevcic and Marin Golub, “The information systems’ security level assessment model based on an ontology and evidential reasoning approach”
• Computers and Security 57 (2016), Alireza Shameli-Sendi, Rouzbeh Aghababaei-Barzegar and Mohamed Cheriet, “Taxonomy of information security risk assessment (ISRA)”

A common topic emerging from the articles analyzed was that quantitative risk
assessment models are more difficult to use than qualitative ones. For example, the final
article in Table II, which is a review of 125 articles published between 1994 and 2014,
concluded that quantitative models are too imprecise and can lead to failures in
appropriate risk management.
Our own research on the literature revealed that the “relevance” hypothesis used in
selecting items was not correct for at least three reasons:
(1) the scientific relevance of a journal (impact factor) may change over time;
(2) information security articles were not published only in the selected journals; and
(3) risk assessment is a subject increasingly debated in behavioral economics,
neuro-economics and cognitive psychology articles.

Paul (2008) states that there is no irrefutable scientific evidence that proves that an article
that has been published in a journal with an impact factor or article influence score under a
certain level is less important than an article with higher influence score. Specialized
literature does not provide such evidence, nor does practice. The author also argues that
scientometric indicators do not offer uncontested certification of the quality of the
disseminated knowledge.
For this reason, in the second phase, we went beyond reviewing articles published in specialized journals with an impact factor, to include analysis of conferences and of journals beyond the field of information systems. Thus, an article published by Verendel (2009) examined 90 publications that appeared between 1981 and 2008, comprising the following:
• security models: formal representation of operational security aimed to represent quantitative information of interest;
• security metrics: quantitative measurement-based indicators for various targets in operational security; and
• security frameworks: general methods for quantitative analysis of security (not necessarily specific models or metrics).
The article concludes that most of the assumptions underlying the published methods have
not been tested in the operational area. Analyzing the list of references mentioned in the
article, we found that most of the articles were published in conference or workshop
proceedings. With reference to quantitative models/probability, Richard Taylor argues that
current methods of assessing security risks are incorrect because management decisions are
based on optimistic perceptions and heuristics (Taylor, 2015).
The definition of “quantitative” also differs between standards, as can be seen in Table III.
The different approaches used in defining risk are analyzed from a historical perspective by Terje Aven. Regarding definitions of risk from a probabilistic perspective, the author concludes that they should be rejected because they only make sense for repeatable, not unique, events, and because they tie risk to subjective probabilities (Aven, 2012).

4. Research: the limitations of the annualized loss expectancy model


The ALE model was mentioned for the first time in 1974 in Federal Information Processing Standards (FIPS, 1974) Publication No. 31: Guidelines for Automating Data Processing Physical Security and Risk Management. In 2005, these guidelines were withdrawn and replaced with NIST Special Publication 800-30; the Guide for Conducting Risk Assessments is a document which makes no reference to any specific risk assessment model.
The single loss expectancy (SLE) is the total amount of revenue that is lost by an organization from a single occurrence of a risk. It is a monetary amount assigned to a single event and represents the company’s potential loss if a specific threat exploits a vulnerability. (The SLE is similar to the impact in a qualitative risk analysis.)
The exposure factor (EF) represents the percentage of loss that a realized threat could have on a certain asset.
The annualized rate of occurrence (ARO) is the number of times that the assessor reasonably expects the risk to occur during one year.
The ALE is the total amount of money that the organization will lose in one year if nothing is done to mitigate the risk.
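Expressed as code, the chain of definitions above reduces to two multiplications. The sketch below is ours, not the paper’s; it simply restates SLE = AV × EF (given in the next subsection) and the standard combination ALE = SLE × ARO that underlies the figures later shown in Table V.

```python
# Minimal sketch of the ALE chain as defined above (not code from the paper):
#   SLE = AV x EF,  ALE = SLE x ARO
def single_loss_expectancy(asset_value, exposure_factor):
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle, annual_rate_of_occurrence):
    return sle * annual_rate_of_occurrence

# The paper's 2013 example asset: AV = US$1,200, EF = 0.2, ARO = 4 (see Tables IV and V)
sle = single_loss_expectancy(1_200, 0.2)      # 240.0
ale = annualized_loss_expectancy(sle, 4)      # 960.0
print(sle, ale)
```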
We apply this model to a real asset: 1 GB of documents stored on a Lenovo Yoga 11S laptop with Windows XP and Office 2010. In our example, we consider that the threat is “corruption of stored documents” and the vulnerability is “zero-day exploit”. We will be using the concepts in the model. In real situations, the formulas must be applicable to all assets of the organization and to all threats that exploit vulnerabilities in the individual assets.

Table III. Different definitions for “quantitative”
• NIST Special Publication 800-30 Revision 1 – Guide for Conducting Risk Assessments: uses the term “Semi-Quantitative Values” for evaluation scales of probabilities; it is the highest standard from a language perspective.
• ISO 27005 – Risk management – Risk assessment techniques: uses only qualitative scales in ranking probabilities.

4.1 H1 demonstration
As described in Landoll and Landoll (2005), Bojanc and Blazic (2008), Mano (2013) and Gibson (2014), the SLE is calculated by multiplying the asset value (AV) by the exposure factor (EF):

SLE = AV × EF

The formula introduces the first qualitative element. While for AV we can use objective information obtained from accounting (Munteanu et al., 2008), for EF we do not have such a basis. The only study we identified was published by the SANS Institute in 2002 (Tan, 2002), and it, too, uses a subjective/qualitative method to estimate EF:
“Start off with 100 per cent for the starting exposure factor and answer each of the following questions:
(1) Does the system under attack have any redundancies/backups/copies?
    • Subtract 30 per cent if the answer is YES.
(2) Is the system under attack behind a firewall?
    • Subtract 10 per cent if the answer is YES.
(3) Is the attack from outside?
    • Subtract 20 per cent if the answer is YES.
(4) What is the potential rate of attack? (10 per cent damage/hour vs 10 per cent damage/min)
    • Subtract 20 per cent if the answer is less than 20 per cent damage/hr.
    • Subtract 40 per cent if the answer is less than 2 per cent damage/hr.
(5) What is the likelihood that the attack will go undetected in time for a full recovery?
    • Subtract 10 per cent if the probability of being undetected is less than 20 per cent.
    • Subtract 30 per cent if the probability of being undetected is less than 10 per cent.
(6) How soon can a countermeasure be implemented in time, if at all?
    • Subtract 30 per cent if the countermeasure can be implemented within ½ hour.
    • Subtract 20 per cent if the countermeasure can be implemented within 1 hour.
    • Subtract 10 per cent if the countermeasure can be implemented within 2 hours.”
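For illustration, the worksheet just quoted can be transcribed directly into code. The sketch below is our own rendering of Tan’s rules; the parameter names are invented, and the final call shows one combination of answers (out of many) that yields the EF of 0.2 used later in Table IV; the paper does not say which answers produced that estimate.

```python
# Sketch of the SANS worksheet (Tan, 2002) quoted above: start at 100 per cent
# and subtract the stated percentages. Parameter names are ours, not Tan's.
def exposure_factor(has_backups, behind_firewall, attack_from_outside,
                    damage_rate_per_hour, pct_undetected, countermeasure_hours):
    ef = 100
    if has_backups:
        ef -= 30
    if behind_firewall:
        ef -= 10
    if attack_from_outside:
        ef -= 20
    if damage_rate_per_hour < 2:        # per cent damage per hour
        ef -= 40
    elif damage_rate_per_hour < 20:
        ef -= 20
    if pct_undetected < 10:             # per cent chance of staying undetected
        ef -= 30
    elif pct_undetected < 20:
        ef -= 10
    if countermeasure_hours <= 0.5:
        ef -= 30
    elif countermeasure_hours <= 1:
        ef -= 20
    elif countermeasure_hours <= 2:
        ef -= 10
    return max(ef, 0) / 100

# One of many answer combinations that yields the 0.2 used in Table IV.
print(exposure_factor(True, True, True,
                      damage_rate_per_hour=10, pct_undetected=50,
                      countermeasure_hours=4))    # 0.2
```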
In Table IV, we present calculations applied to the asset taken as an example.
For these reasons, we consider the following:
• there is no method/formula clearly describing the calculation of EF (Sendi et al., 2016);
• EF cannot be expressed as a simple percentage of material exposure of an asset, because the exposure is in turn dependent on many technical and economic factors, in a one (asset)-to-many (threats) relationship;
• SLE is the estimate of what a company would lose each year assuming that EF remains unchanged throughout the year;
• we have no information on whether the software a company uses contains vulnerabilities; and
• we merely assume it, and recognize the vulnerability only after the release of the exploit.

4.2 H2 demonstration
How many times could the laptop have been affected by a “ransomware” (CryptoLocker) event in 2013? Presently, it is easy to ask such a question; in 2013, we did not have such information, because the threat was only discovered at the end of that year (Godin, 2013). Where could we find information on the likelihood of such events? One possible answer would be the experience of an expert, because there is little actuarial data (Bojanc and Blazic, 2008). No technologies were prepared to provide protection against such threats: security solutions such as advanced threat detection were at an early stage. In our demonstration, we assume the existence of this type of threat for the year 2013 with an annual rate of occurrence equal to 4 and, for 2015, an ARO of 12 (Table V).
Is it possible for the same asset, whose accounting value depreciates over time, to have different losses? The calculations in Table V demonstrate that it is possible, because there is no tool to identify an objective ARO: if there is no historical data in the organization, should we rely on data provided by the manufacturers? In addition, due to bias, these calculations do not allow a comparative analysis.
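The figures in Tables IV and V below can be reproduced in a few lines (a minimal sketch using the assumed EF of 0.2 and the assumed AROs of 4 and 12):

```python
# Sketch reproducing Tables IV and V: same asset, same assumed EF, different AV and ARO.
assumptions = {
    2013: {"av": 1_200, "ef": 0.2, "aro": 4},
    2015: {"av": 650,   "ef": 0.2, "aro": 12},
}
for year, a in assumptions.items():
    sle = a["av"] * a["ef"]
    ale = sle * a["aro"]
    print(year, sle, ale)    # 2013: 240.0, 960.0   2015: 130.0, 1560.0
```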

Table IV. SLE calculation
Asset: 1 GB of documents stored on a Lenovo Yoga 11S laptop with Windows XP and Office 2010
Year   AV (US$)   EF    SLE (US$)
2013   1,200      0.2   240
2015   650        0.2   130

Table V. ALE calculation
Asset: 1 GB of documents stored on a Lenovo Yoga 11S laptop with Windows XP and Office 2010
Year   AV (US$)   EF    SLE (US$)   ARO   ALE (US$)
2013   1,200      0.2   240         4     960
2015   650        0.2   130         12    1,560

4.3 H3 demonstration
Developed by Daniel Kahneman, the winner of the Nobel Memorial Prize in Economics, prospect theory states that people make decisions based on the potential value of losses and gains rather than the final outcome, and that people evaluate these losses and gains using certain heuristics and not probabilities (Kahneman, 2011). He describes two different ways in which the brain forms thoughts:
(1) System 1: Fast, automatic, frequent, emotional, stereotypic and subconscious; and
(2) System 2: Slow, effortful, infrequent, logical, calculating and conscious.

The two systems could be mapped onto Worlds 2 and 3 of the “Three Worlds of Knowledge” epistemological model of Popper (1980):
(1) World 1: the world of physical objects and events, including biological entities;
(2) World 2: the world of mental objects and events; and
(3) World 3: objective knowledge.

Citing theories in cognitive psychology and neuroscience, Paul Slovic indicates that there
are two fundamental ways in which human beings comprehend risk:
(1) the “analytic system” that uses algorithms and normative rules, such as the
probability calculus, formal logic and risk assessment; and
(2) the “experiential system” that is intuitive, fast, mostly automatic and not very
accessible to conscious awareness.

Back in 1921, Keynes published A Treatise on Probability. Keynes shows that probability can be calculated to the precision of a single-number answer only in certain cases. In many more cases, it can only be described in terms of what he calls an “interval estimate” between
points (Keynes, 1921).
Information systems are non-linear socio-economic systems. This is because the behavior
of these systems is not repeatable, and qualitative properties are therefore more important
(Korn and Takats, 2013).
When we discuss threats, we have no information about how many are random and how many are targeted. For random threats, the uncertainty is irreducible because we are unable to
obtain information from measurements, tests or experiments on unknown threats.
To make judgments about uncertain events, three types of approach can be used as in
Tversky and Kahneman (1974). These may be summarized as follows:
(1) representativeness – in which people are asked to judge the probability that an
event belongs to one or other class or type;
(2) scenario-based – in which people are asked to assess the likely frequency of an
occurrence; and
(3) anchor-based – in which an assessment is made about the likelihood of an event
differing from an anchor point (for example – more or less frequently than once
a month).

All three heuristics could be applied to the ALE model.

5. Discussion
Hayek’s assertion made in 1974 related to economics is also applicable to all sciences. Hayek
stated: “the credit which apparent conformity with recognized scientific standards can gain
for seemingly simple but false theories may [. . .] have grave consequences.”
We find that the results of fundamental research in ISRM are not reflected in international standards or best practices. NIST and ENISA (the European Network and Information Security Agency) no longer reference any quantitative models. Although ALE is mentioned as part of the calculation of the return on security investment (ROSI), there is a warning that what is included in any calculation must depend on the organization’s objectives (ENISA, 2012).
There is an irrefutable need for risk estimation. But applying models that claim to be quantitative without a scientifically correct basis leads to inaccurate estimates. It is the role of scientists to confirm the actual validity of models through practical experiment.
As an example, consider attempting to assess risks using ALE for a medium-sized company with 200 employees and 500 tangible assets. Even if we had historical data available about threats, vulnerabilities and incidents, the problems are far from trivial, as a single vulnerability can affect many assets and a single threat can exploit many vulnerabilities. Furthermore, it is impossible to know historically how many threats have exploited a vulnerability and how many were controlled, because the information is simply not available for threats and vulnerabilities that may be known now but were not known in the past. As a result, quantitative risk analysis would take so long that it would never catch up with the technological, sociological and business changes that continually affect the company.
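A back-of-the-envelope calculation illustrates the scale of the problem. All counts below are hypothetical (the paper only fixes the number of assets at 500), but any plausible values lead to the same conclusion:

```python
# Back-of-the-envelope only: hypothetical counts for the 500-asset company above.
assets = 500
threats_per_asset = 20        # hypothetical
vulns_per_threat = 10         # hypothetical
minutes_per_estimate = 5      # hypothetical expert effort per (asset, threat, vulnerability) triple

triples = assets * threats_per_asset * vulns_per_threat
workdays = triples * minutes_per_estimate / 60 / 8
print(triples, round(workdays))   # 100000 triples, ~1042 eight-hour days of estimation
```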

6. Conclusion
In this article, we show as simply as possible that the ALE model used in security risk assessment, although it is presented in the literature as quantitative, is in fact qualitative, being influenced by bias. We worked with three hypotheses, which aimed to prove the following:
(1) even if they are defined as “quantitative”, most of the models are, in fact,
“qualitative”;
(2) in practice, it is almost impossible to use objective probabilities for security risk
assessment; and
(3) within uncertain systems such as information systems, risk assessment is more
about perception.

A model that cannot be tested experimentally is not a scientific model. In fact, the
probability used in ISRM is an empirical probability or estimator of a probability because it
estimates probabilities from experience and observation. But in that case, probabilities must
be carefully determined on the basis of adequate statistical samples which in most cases do
not exist. Risk assessment actually involves decision-making. But decisions are made using
alternatives and alternatives need distinctions.

References
Aven, T. (2012), “The risk concept-historical and recent development trends”, Reliability Engineering
and System Safety, Vol. 99, pp. 33-44.
Bojanc, R. and Blazic, B.J. (2008), “Towards a standard approach for quantifying an ICT security
investment”, Computer Standards & Interfaces, Vol. 30 No. 4.
ENISA (2012), “Introduction to return on security investment helping CERTs assessing the cost of (lack
of) security”, available at: www.enisa.europa.eu/activities/cert/other-work/introduction-to-
return-on-security-investment/at_download/fullReport
FIPS (1974), Archived FIPS Publications, available at: http://csrc.nist.gov/publications/PubsFIPSArch.html
Gibson, D. (2014), Managing Risk in Information Systems, Jones & Bartlett Publishers, Burlington, MA.
Godin, D. (2013), “You’re infected – if you want to see your data again, pay us $300 in Bitcoins”, Ars Technica, available at: http://arstechnica.com/security/2013/10/youre-infected-if-you-want-to-see-your-data-again-pay-us-300-in-bitcoins/
Kahneman, D. (2011), Thinking, Fast and Slow, Macmillan, London.
Keynes, J.M. (1921), “A treatise on probability”, available at: http://archive.org/stream/
treatiseonprobab007528mbp#page/n9/mode/2up
Korn, J. and Takats, A. (2013), Propagation of Uncertainty in Socio-Economic Systems, Cited in K. Ellis,
Amanda.
Landoll, D.J. and Landoll, D. (2005), The Security Risk Assessment Handbook: A Complete Guide for
Performing Security Risk Assessments, CRC Press, Boca Raton, FL.
Mano, P. (2013), Official (ISC)2 Guide to the CSSLP CBK, 2nd ed., CRC Press, Boca Raton, FL.
Munteanu, A., Fotache, D. and Dospinescu, O. (2008), “Information systems security risk assessment: harmonization with international accounting standards”, International Conference on Computational Intelligence for Modelling Control & Automation, Vienna.
Paul, R.J. (2008), “Measuring research quality: the United Kingdom Government’s research assessment exercise”, European Journal of Information Systems, Vol. 7, pp. 324-329,
doi: 10.1057/ejis.2008.31
Popper, K.R. (1980), The Logic of Scientific Discovery, Routledge & Kegan Paul, London.
Sendi, S.A., Barzegar, R.A. and Cheriet, M. (2016), “Taxonomy of information security risk assessment
(ISRA)”, Computers & Security, Vol. 57, pp. 14-30.
Slovic, P. (2011), The Perception of Risk, cited in Kahneman D – Thinking Fast and Slow, Farrar, Straus
and Giroux, New York, NY.
Tan, D. (2002), Quantitative Risk Analysis Step-By-Step, SANS Institute, available at: http://web.iiit.ac.
in/bezawada/audit/CS/QualitativeRiskAnalysis.pdf
Taylor, R.G. (2015), “Potential problems with information security risk assessments”, Information
Security Journal: A Global Perspective, Vol. 24 Nos 4/6, pp. 177-184, doi: 10.1080/
19393555.2015.1092620.
Tversky, A. and Kahneman, D. (1974), “Judgment under uncertainty: Heuristics and biases”, Science,
New Series, Vol. 185, No. 4157.
Verendel, V. (2009), “Quantified security is a weak hypothesis: a critical survey of results and
assumptions”, NSPW ‘09 Proceedings of the 2009 Workshop on New Security Paradigms
Workshop, ACM, New York, NY.

Further reading
Gerber, M. and von Solms, R. (2005), “Management of risk in the information age”, Computers &
Security, Vol. 24 No. 1, pp. 16-30.
Gregory, J., Mears-Young, B.R., Ragsdel, G. and Ellis, K. (2013), Critical Issues in Systems Theory and
Practice, Springer Science & Business Media, 29 June.
Hole, K.J. and Netland, L.-H. (2010), “Toward risk assessment of large-impact and rare events”, IEEE
Security and Privacy, Vol. 8 No. 3.
Johnson, M.E. and Pfleeger, S.L. (2011), “Addressing information risk in turbulent times”, IEEE Security
and Privacy, Vol. 9 No. 1.
Karabacak, V. and Sogukpinar, I. (2005), “ISRAM: information security risk analysis method”,
Computers & Security, Vol. 24 No. 2.
Rhee, H., Ryu, Y.U. and Kim, C. (2012), “Unrealistic optimism on information security, management”,
Computers & Security, Vol. 31 No. 2.
Slovic, P., Finucane, M., Peters, E. and MacGregor, D.G. (2004), “Risk as analysis and risk as feelings: some thoughts about affect, reason, risk, and rationality”, Risk Analysis, Vol. 24 No. 2.
Solic, K., Ocevcic, H. and Golub, M. (2015), “The information systems’ security level assessment model
based on an ontology and evidential reasoning approach”, Computers & Security, Vol. 55,
pp. 100-112.
Tsiakis, T. and Stephanides, G. (2005), “The economic approach of information security”, Computers &
Security, Vol. 24 No. 2.
Von Hayek, F.A. (2014), “Prize lecture: the pretense of knowledge”, available at: www.nobelprize.org/
nobel_prizes/economic-sciences/laureates/1974/hayek-lecture.html (3 February 2016).
Webb, J., Ahmad, A., Maynard, S.B. and Shanks, G. (2014), “A situation awareness model for
information security risk management”, Computers & Security, Vol. 44, pp. 1-15.

Corresponding author
Adrian Munteanu can be contacted at: adrian.munteanu@feaa.uaic.ro

Computers & Security (2004) 23, 213-228
doi:10.1016/j.cose.2003.09.006

www.elsevier.com/locate/cose

The effect of intrusion detection management methods on the return on investment

Charles Iheagwara*
Information Technology Security Department, Una Telecom, Inc., 4640 Forbes Boulevard, Suite 200, Lanham, MD 20706, United States

* Tel.: +1-301-459-7674; fax: +1-202-659-2810. E-mail address: iheagwarac@aol.com

Received 9 May 2003; revised 15 September 2003; accepted 24 September 2003

Keywords: Intrusion detection systems; Return on investment; Cost-effectiveness; Cost-benefit analysis

Abstract: This paper examines how implementation methods, management methods, and Intrusion Detection System (IDS) policy affect Return on Investment (ROI). The paper will seek to demonstrate the value associated with a well thought out implementation and effective lifecycle management of IDS technology and will culminate in a case study with a number crunching exercise to calculate the ROI for an IDS deployment by a hypothetical financial company named UTVE, Inc. on risk. The paper also discusses general IDS types and expands on the impact that the logical location of a company's critical networked assets could have on the risk equations. To this end, the Cascading Threat Multiplier (CTM) is introduced to expand on the Single Loss Expectancy (SLE) equation. Also, implementation and management costs based on various support profiles and commonly accepted risk equations are reviewed. Finally, a formula for calculating ROI for security, otherwise commonly known as Return on Security Investment (ROSI), is devised.

© 2004 Elsevier Ltd. All rights reserved.

Introduction

An IDS is a security system that monitors computer systems and network traffic and analyzes that traffic for possible hostile attacks originating from outside the organization and also for system misuse or attacks originating from inside the organization. By providing information to site administration, an IDS allows not only for the detection of attacks explicitly addressed by other security components (such as firewalls and service wrappers), but also attempts to provide notification of new attacks unforeseen by other components. Intrusion detection systems also provide forensic information that potentially allows organizations to discover the origins of an attack.

Currently there are two basic approaches to intrusion detection. The first approach is to define and characterize correct static form and/or acceptable dynamic behavior of the system and then to detect abnormal behavior by defining statistical relations. This is called anomaly detection. It relies on being able to define the desired form or behavior of the system and then to distinguish between that definition and undesirable form or anomalous behavior. While the boundary between acceptable and anomalous forms of stored code and data can frequently be precisely defined, the boundary between acceptable and anomalous behavior is much more difficult to define.

The second approach, called misuse detection, involves characterizing known ways to penetrate a system, usually described as a pattern, and then monitoring for the pattern by defining rule-based relations. The pattern may be a static bit string, for example, a specific virus bit string insertion. Alternatively, the pattern may describe a suspect set or sequence of events.

Intrusion detection systems have been built to explore both approaches: anomaly detection and misuse detection. In some cases, they are combined in a complementary way in a single intrusion detector. There is a consensus in the community that both approaches continue to have value.

As a matter of practical reality, organizations evaluate the effectiveness of IDS implementations from both technical and economic perspectives. Thus, the overall evaluation of any IDS implementation is based on a wide range of criteria, especially when a choice has to be made as to what is the right IDS product.

Practically speaking, a very important but often neglected facet of intrusion detection is its cost-effectiveness, or cost-benefit trade-off. An educated decision to deploy a security mechanism such as IDS is often motivated by the needs of security risk management. The objective of IDS is therefore to provide protection to the information assets that are at risk and have value to an organization. An IDS needs to be cost-effective because it should cost no more than the expected level of loss from intrusions. This requires that an IDS should consider the trade-off among cost factors, which at the minimum should include development cost, the cost of damage caused by an intrusion, the cost of manual or automatic response to an intrusion, and the operational cost, which measures constraints on time and computing resources.

It should also be noted that it is not always necessary to justify the cost of an organization's IDS deployment because the implementation might be undertaken as part of standard due care.

The performance of IDS for many organizations is not just measured in the ability of the IDS to capture or prevent attacks but in its value when expressed in economic terms. This is more so because when choosing a security product, companies tend to justify their investments based on both economic returns and technical performance.

In the selection of an IDS product, performance is measured using such factors as scalability, availability, ROI and the total cost of the system relative to the price of the system the IDS is protecting, just to mention a few.

A positive ROI of an IDS is dependent upon an organization's deployment strategy and how well the successful implementation and management of the technology helps the organization achieve the tactical and strategic objectives it has established.

For organizations interested in quantifying the IDS's value prior to deploying it, their investment decision will hinge on their ability to demonstrate a positive ROI. The ROI has traditionally been difficult to quantify for network security devices, in part because it is difficult to calculate risk accurately due to the subjectivity involved with its quantification. Also, business-relevant statistics regarding security incidents are not always available for consideration in analyzing risk.

In considering an implementation of an IDS technology, a positive ROI can be understood by analyzing the difference between Annual Loss Expectancy (ALE) without IDS deployment and the ALE with IDS deployment, adjusted for technology and management costs. The ultimate initial goal, then, should be to prove that the value proposition (re: a benefit in the form of a quantifiable reduction in ALE) in implementing and effectively managing the IDS technology is greater than the implementation and management costs associated with deploying the IDS technology.

The insights gained from previous research studies that describe proven techniques to implement the IDS technology could be helpful. However, it is important to note that there are no known research studies on the ROI of IDS. Related works border on IDS performance and cost models, none of which integrated or established a link between the technical, operational and cost/economic factors that serves as a gauge for justifying IDS deployment.

The related works are fundamental studies on IDS performance (Iheagwara and Blyth, 2002; Iheagwara et al., 2003; Richards, 1999) that treat the relationship between deployment techniques and attack system variables and the performance of the IDS; and models on cost-benefit/sensitivity analysis (Irvine et al., 1999; Lee et al., 1999) for intrusion detection deployment.

Richards (1999) evaluates the functional and performance capabilities of the industries' leading commercial type IDS. In the areas tested, the performance of the IDS was rated based on their distinctive features, which were characterized into different performance indexes. The research work represented a new direction for IDSs in that it moved the focus away from scientific concepts research to performance evaluation of the industries' best products. However, the study was limited to a small proto design isolated to a non-switched network, which did not reveal the impact of packet switching on the accuracy and ability to capture attack packets in their entirety.

Iheagwara and Blyth (2002) expand this effort to an evaluation study of the effect of deployment techniques on IDS performance in switched and distributed systems. They demonstrated that monitoring techniques could play an important role in determining the effectiveness of the IDS in a switched and distributed network.

Iheagwara et al. (2003), in a comparative experimental evaluation study of intrusion detection system performance in a gigabit environment, examine the system benefits of using a single Gigabit IDS sensor instead of multiple Megabit sensors for a wide range of defined system attacks, network traffic characteristics, and for their contexts of operational concepts and deployment techniques.

As mentioned above, cost-benefit model and analysis studies of IDS deployments are relatively few. Lee et al. (1999) study the problem of building cost-sensitive intrusion detection models. For intrusion detection, Irvine et al. (1999) define auditing of network control functions in intermediate nodes, and rule-based network intrusion systems in the total subnet, as the mechanisms. They also discuss the costs of those security services and mechanisms.

In contrast to the above, the focus of this paper is the return on investment of IDS products and an examination of how implementation methods, management methods, and IDS policy affect the ROI. The paper will seek to demonstrate the value associated with a well thought out implementation and effective lifecycle management of IDS technology and will culminate with a number crunching exercise to calculate the ROI for an IDS deployment by a hypothetical brick and mortar wholesale hardware supply company named UTVE, Inc. on risk.

The rest of the paper is organized as follows. The current IDS implementation is discussed in the following section. Then in the next sections, the management and costs structures, the Cascading Threat Multiplier (CTM) and a discussion on the effects of proactive and reactive management techniques are given. This is followed by a case study on the ROI and finally a conclusion section is presented.

Current IDS implementation

In enterprise systems, IDS implementation requires deployment of the IDS either at the computer system that is the putative target or placement on a network level where traffic can be evaluated or where information aggregated from various hosts can give insight into coordinated attack scenarios.

Hence it is important to maximize the implementation through effective deployment techniques. Ptacek and Newsham (1999) and Iheagwara and Blyth (2002) conduct studies to evaluate the effect of deployment techniques on the performance of the IDS. The studies demonstrate that the IDS can be very effective if optimally deployed, or it could just be another waste for the company if improperly managed. Since IDS effectiveness in detecting intrusions depends as much on the deployment technique, a significant change in the approach to the implementation of intrusion detection is needed for improvements.

Of interest are IDS product implementation technology and the architecture that has so often been used to evaluate their effectiveness. The next two sections discuss the technologies and evaluation architectures.

Technologies

Intrusion detection is an overlay of two separate and different technologies: Network IDS (NIDS) and Host-based IDS (HIDS) systems. The primary advantage of NIDS is that it can watch the whole network or any subsets of the network from one location. Therefore, NIDS can detect probes, scans, and malicious and anomalous activity across the whole network. These systems can also serve to identify general traffic patterns for a network as well as aid in troubleshooting network problems. When enlisting auto-response mechanisms, NIDS can protect independent hosts or the whole network from intruders. NIDS does, however, have several inherent weaknesses. These weaknesses are its susceptibility to generate false alarms, as well as its inability to detect certain attacks, called false negatives. NIDS also is not able to understand host-specific processes or protect from unauthorized physical access. HIDS technology overcomes many of these problems. However, HIDS technology does not have the benefit of watching the whole network to identify patterns like NIDS does. A recommended combination of host and network intrusion detection systems, in which an NIDS is placed at the network border and an HIDS is deployed on critical servers such as databases, Web services and essential file servers, is the best way to significantly reduce risk.

Implementation architecture

Generally speaking, most (host- and network-based) intrusion detection systems have common architectures, meaning that most host/network systems work as agents reporting to a central console. The present implementation architecture is built on the concepts of the qualitative and quantitative operational functionality of the IDS. In Fig. 1 the IDS architecture is presented in terms of the quantitative and qualitative evaluation modules. In the quantitative evaluation module the main components (detection engine, filters, alerting and configuration facilities) are used for functional and performance testing of the IDS. The IDS detection engine's (sensor's) job is to watch the network and detect attacks, a role that is performed by the packet-processing engine. To do this, the sensor looks at every packet on the network it is watching. The busier the network, the more packets there are to watch. If the sensor cannot keep up, it will start to miss (or drop) packets. In the case an attack spans multiple packets, the sensor holds the packets, assembles them and makes a determination on whether there is an attack. The extent and scope of accomplishing the above roles is a gauge of the effectiveness of the IDS, and that is why IDS performance is evaluated based on the ability of the processing engine to effectively filter and reassemble packets at any given network throughput.

[Figure 1. The standard IDS architecture: quantitative evaluation modules (functional testing – attack set detection, configuration, alerting & logging and reporting facilities; performance testing – filter efficiency, packet reassembly, packet throughput) and qualitative evaluation modules (product usability – user interface, O/S vs appliance, product maturity, company focus, price), layered above the network wire.]

The quantitative architecture includes attack set detection, configuration, alert triggering, logging and reporting facilities modules that define the ability of the IDS to accurately characterize the operational setup of the environment, and customizable utilities.

The qualitative architecture includes a module that defines the product usability effectiveness of the IDS based on certain usability features such as the ease of the user interface (ease of use, ease of configuration, ease of filter customization); integration and interoperability with operating systems and existing network infrastructure; product maturity; company focus and price.

The associated management and costs structures of IDS implementation are presented in the following section.

Management and costs structures

Deployment cost

The associated cost of HIDS deployments can vary depending on vendor and software versions. A good baseline is that agents can cost between $500 and $2000 each and consoles may cost in the $3000-$5000 range (Kevin and Kinn, 2002). This does not include OS, hardware or maintenance costs. Network intrusion detection systems can be deployed as stand-alone hosts with a possible management interface, or as distributed sensors and a management console. Generally speaking, commercially available sensors run in the $5000-$20,000 range (Kevin and Kinn, 2002) depending on vendor, bandwidth and functional capabilities. Management consoles can be free or can cost several thousand dollars depending on the vendor. This does not necessarily include hardware or back-end databases. IDS usage requires human interaction at the end point because the IDS will generate pertinent information and data, but this serves no purpose without subsequent examination of the data. This will certainly require allocating a skilled staff for IDS management, log analysis, etc. If this is not the case, then the investment will fail to pay off.

The total cost of an IDS deployment depends on implementation costs combined with the costs for managing the technology. Some standard implementation and management methods common to IDS deployments include using a Managed Security Services Provider (MSSP), utilizing a single in-house employee or technician, or enabling 24 × 7 × 365 multi-shift coverage in-house with a skilled technical staff. Of course the size of the organization and its associated IT budget (or lack thereof) factor in to how the IDS technology will be deployed and managed. The generalized cost structure used for discussions in this paper is shown in Table 1.

Table 1 Cost of individual components

Expense                                    Value
Network IDS                                $10,000
Host IDS                                   $1,000
NIDS/HIDS management station               $5,000 (for some products)
Maintenance                                15% of the cost of the IDS
MSSP network IDS management per year       $24,000 ($2K per month)
MSSP host IDS management per year          $6,000 ($500 per agent per month)
Engineer cost                              $75,000 ($60,000 salary plus $15K benefits & administration)
Group manager cost                         $100,000 ($80,000 salary plus $20K benefits & administration)

Comparison of aggregate costs of different implementation schemes

Based on this generalized cost structure in Table 1, let us now consider the aggregate costs of three different IDS deployments. Tables 2 and 3 represent implementation (purchase) costs combined with life cycle management costs over a three-year period. The three scenarios include management by a single skilled in-house technician, management in which there are five shifts of skilled technicians providing 24 × 7 × 365 coverage, and management provided by an MSSP. It is very important to understand that full-service MSSPs will provide 24 × 7 × 365 coverage just like the multi-shift internal coverage provides. For completeness, there will be a review of two different IDS deployments (one small and one medium) and the cost structure of implementing and managing them.

Table 2 Implementation and management cost of one network IDS and two host IDS

                                        Single support   24 × 7 × 365 multi-shift support   MSSP support
Technology cost ($)                     24,650           24,650                             24,650
Management cost ($)                     225,000          1,425,000                          108,000
Total cost ($)                          249,650          1,449,650                          132,650
Average cost per year ($)               83,217           483,217                            44,217
Average cost per device per year ($)    27,739           161,072                            14,739

Table 3 Implementation and management cost of 15 network IDS and 15 host IDS

                                        Single support   24 × 7 × 365 multi-shift support   MSSP support
Technology cost ($)                     N/A              268,250                            268,250
Management cost ($)                     N/A              1,425,000                          1,350,000
Total cost ($)                          N/A              1,693,000                          1,618,250
Average cost per year ($)               N/A              564,417                            539,417
Average cost per device per year ($)    N/A              18,814                             17,981

From the numbers it is evident that in smaller IDS deployments the value proposition of MSSP support is very strong relative to internal 24 × 7 × 365 multi-shift support. In larger IDS deployments, the cost differential between internal (highly skilled) multi-shift coverage and MSSP coverage diminishes due to economies of scale on the internal multi-shift coverage side. Single support coverage is not a realistic option to consider when contemplating a deployment of 30 security devices. Also, this cost model does not take into account the proprietary tools development necessary to manage several different types of technology (if that were the case) effectively.
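As a cross-check, the aggregate figures in Table 2 follow from the component prices in Table 1 once a three-year lifecycle and 15% annual maintenance on the purchase price are assumed. The sketch below is ours; the exact decomposition (for example, a single management station) is an assumption consistent with, but not stated in, the paper.

```python
# Rough cross-check of Table 2 from the component prices in Table 1, assuming a
# three-year lifecycle and 15% annual maintenance on the purchase price.
# The decomposition (e.g. one management station) is our assumption, not the paper's.
YEARS = 3
MAINTENANCE_RATE = 0.15

def technology_cost(n_nids, n_hids, n_consoles=1):
    purchase = n_nids * 10_000 + n_hids * 1_000 + n_consoles * 5_000
    return purchase * (1 + MAINTENANCE_RATE * YEARS)

def totals(tech, mgmt_per_year, n_devices):
    total = tech + mgmt_per_year * YEARS
    per_year = total / YEARS
    return total, per_year, per_year / n_devices

tech = technology_cost(n_nids=1, n_hids=2)                     # 24,650
scenarios = {
    "single support": totals(tech, 75_000, 3),                  # one engineer
    "multi-shift":    totals(tech, 5 * 75_000 + 100_000, 3),    # five shifts + group manager
    "MSSP":           totals(tech, 24_000 + 2 * 6_000, 3),      # MSSP rates from Table 1
}
for label, (total, per_year, per_device) in scenarios.items():
    print(f"{label:15s} total={total:>11,.0f}  per year={per_year:>9,.0f}  "
          f"per device/year={per_device:>8,.0f}")
```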

The concept of cascading threat multiplier

The main concepts of risk management and related equations are given in Appendix 3. In the world of new technologies, the interplay of technological processes, policies and risk management introduces complexities into risk management and requires the development and introduction of a new concept, the Cascading Threat Multiplier (CTM), used to accurately calculate the ROI for any acquired or developed technology.

Consequently, the introduction of a new concept, the Cascading Threat Multiplier (CTM), into the mix will help in any analytical discussion that helps in distilling a meaningful ROI calculation in order to determine the effectiveness of deploying IDS technology into a given network.

The Cascading Threat Multiplier (CTM) is a multiplying factor that will be included into our expanded definition of Single Loss Expectancy (SLE). CTM is somewhat subjective and is introduced mainly for the purpose of adding a little more "flavor" to SLE. CTM factors in the importance of other critical assets tied (re: networked) to the specific asset being analyzed in the SLE calculation. It also coaxes risk analysts to think in broader terms and to look at the bigger picture when considering the risks associated with the compromise of a given asset. The formula for CTM is as follows:

CTM = 1 + ((UEA × EFs) / AV)

In this formula, Underlying Exposed Assets (UEA) is measured in dollars. These are the assets that are now exposed due to the compromise of a specific asset. Asset Value (AV) is identical to the calculation described elsewhere in this paper. The Secondary Exposure Factor (EFs) is related to the percentage loss on the UEAs. It is very similar to the Exposure Factor (EF), as described in the standard equation in Appendix 3, with a few minute differences.

It should be noted that the CTM formula is relevant only to the threat of unauthorized disclosure, as would typically be of primary concern to a military or diplomatic agency. It does not address interruption in service, as would for instance be very important for an Internet merchant. Likewise, it does not address data corruption, as would for instance be very important for a bank.

The primary reason for introducing EFs is to factor in the importance of an asset's logical location within a network. For example, if the asset is a Web server that is in a true Demilitarized Zone (DMZ) and has no access to the network or to any other corporate servers, EFs would be low, since it is unlikely that an attacker can use this device to further compromise the network. But if the asset is on the same broadcast domain as other servers (such as e-mail, DNS and FTP), or there is no access control between the asset and other servers, then EFs will be higher. Finally, if the asset is on a network that has access to the rest of the network, the Secondary Exposure Factor (EFs) will be very high. Examples of this would include hosts that offer some public services but are terminated within the internal network, or hosts that have valid Secure Shell (SSH) keys to all hosts. SSH is a protocol used to provide strong authentication and secure communications over unsecured channels.

It is important to consider what assets are easily (or even not so easily) accessible from a specific networked asset once that asset is compromised. When a given asset is compromised and used as a staging point for attacks on other assets inside and outside a company's network, it could have potentially devastating consequences for the organization. If an attack is staged from the compromised asset to another asset outside the organization, even when the owner was not directly involved in the malicious activity, they can and probably will be held accountable. One can envision the UEA factor of SLE representing some portion of a trusted business partner's assets. It is easy to imagine the negative business impact the offending organization would encounter if one of its compromised assets were used as a staging ground to compromise and damage its business partner's assets.

What is the risk, quantified in dollars, of not considering a business partner's assets when performing a valuation exercise on your company's assets, ones that, if compromised, may enable access to more sensitive data and systems? The CTM concept provides the analytical framework to closely scrutinize the assets under an organization's control, assign more comprehensive valuations to those assets, and to more accurately measure the impact that compromising these assets could have on the organization.

As a practical example, it is assumed that a Web server has been compromised and used by a malicious person to stage attacks on other networked assets containing critical data valued at 10 times the amount (in dollars) of the data contained on the compromised Web server. As the perpetrator hopscotches his way from asset to asset, penetrating deeper and deeper into the network, he may finally gain access to critical data on a vulnerable asset deep inside the company's network. The CTM for the Web server would be calculated as follows, with a summation (re: best guess) that the Secondary Exposure Factor (EFs) of the Underlying Exposed Asset(s) (UEA) is 70%: CTM = 1 + ((10 × 0.7) / 1) = 8. The CTM has increased the SLE for the compromised Web server by a factor of 8. The white arrow originating from the compromised Web server should be followed to better visualize this concept as shown in Fig. 2.

[Figure 2. Cascading threat multiplier.]

Tying the CTM concept back into the SLE calculation, a new definition of Single Loss Expectancy can be expressed as:

SLE = EF × AV × CTM

A thorough risk management exercise should factor in the CTM concept by executing a more comprehensive valuation methodology that includes more subjective, intangible factors in the Asset Value (AV) variable calculation. As mentioned above, goodwill (i.e. business and consumer loyalty built on trust) and opportunity costs (i.e. choosing not to consider the effect that a compromised asset can have on other assets) are somewhat analogous to the CTM concept when these intangibles are factored into the Asset Valuation (AV) used in the SLE calculation.

The importance of capturing intangible value, and understanding the risks associated with jeopardizing the value, is one of the more challenging aspects of risk and return analysis. Introducing the CTM concept into the traditional SLE calculation will make the capture of the intangible aspects of asset valuation a little less daunting of a task.
going to be proactive or reactive as security events
a task.
Section "The effects of proactive vs. reactive management on risk" is a discussion of proactive and reactive management methodology and a demonstration of how the chosen methodology affects the analysis of risk. This will set up the framework for the calculation of IDS ROI and will culminate in a case study in section "A case study on IDS ROI calculation modular approach" to demonstrate the efficacy of the new concept (CTM). Finally, all the numbers will be tied together to demonstrate the devised technique for calculating the ROI for UTVE's deployment of one network-based IDS and two host-based IDSs.

The effects of proactive vs. reactive management on risk

Independent of implementation and management costs, the method by which the devices are managed can have a serious effect on the ROI. As a result, the key question to answer is: is the system going to be proactive or reactive as security events are detected? Table 4 depicts the normal event flow in each method. A proactive implementation response is automated by the system, while a reactive implementation response is manually driven with the help of enlisted personnel.

Table 4. Proactive and reactive management methods

Method      System actions           Personnel actions                   Follow-up information
Reactive    Log / alert / respond    Respond / analyze / eradicate       Forensics and evidence
Proactive   Respond / log / alert    Analyze / eradicate if necessary    Forensics and evidence

By examining the Annual Loss Expectancy (ALE = ARO x SLE, where SLE = exposure factor x asset value x cascading threat multiplier), the variables that are affected by each of these two management methods can be determined. In a reactive design, where personnel must be engaged to respond to each event, the exposure factors (primary [EF] and secondary [EFs]) will be affected. In a proactive design there will be similar benefits to the exposure factors (i.e. a reduction) and, in addition, the ARO will be influenced in a beneficial way as well. It is useful to apply the concept of primary and secondary mitigation windows to demonstrate the impact of threat vs. time. As illustrated in Appendix 1, the primary mitigation
window affects ARO while the secondary mitigation window affects the exposure factor and the cascading threat multiplier. An effective way of impacting ARO is through automated response.
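The sensitivity described here can be sketched directly from the ALE definition. The figures below are illustrative, reusing the assumptions from the earlier sketch and the conservative 50% ARO and 25% EF/EFs reductions adopted later in the case study; treating the 50% ARO reduction as still applying once incident response is added is an assumption suggested by the Appendix 4 row labels rather than something the text states outright.

# Sketch: how the two mitigation windows feed back into ALE (illustrative values).
def ale(aro, av, ef, uea, efs):
    ctm = 1 + (uea * efs) / av           # cascading threat multiplier
    sle = ef * av * ctm                  # single loss expectancy
    return aro * sle                     # annual loss expectancy

base = dict(av=2_000, ef=0.75, uea=20_000, efs=0.75)

ale1 = ale(aro=3.0, **base)                                   # no IDS
ale2 = ale(aro=1.5, **base)                                   # auto-response: ARO cut by 50%
ale3 = ale(aro=1.5, av=2_000, ef=0.75 * 0.75,                 # plus incident response:
           uea=20_000, efs=0.75 * 0.75)                       #   EF and EFs cut by 25%

print(f"ALE1=${ale1:,.0f}  ALE2=${ale2:,.0f}  ALE3=${ale3:,.0f}")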
Auto-response can take many forms. On host-based IDS this is sometimes called shielding, where a specific process is terminated. Network-based IDS generally employs TCP resets or shunning. TCP resets effectively kill one specific session based on suspicious activity, but still allow other activity from that same IP. Shunning, on the other hand, changes firewall rules or router access lists and effectively denies all traffic from that host for a specific period of time. In essence, shielding will protect a single host from one process, resets will protect a host from a specific session, and shunning will protect the entire network from a specific host for a pre-determined amount of time.

The accuracy of automated response can vary tremendously. It is dependent on the skill level of the engineers managing the devices. If the engineers are only moderately skilled, then auto-response will not be very effective, which may adversely affect the ROI of the IDS deployment. This adverse effect may manifest itself in the form of a loss of productivity from network-related problems due to improperly implemented auto-response, as well as the additional fallout related to a false sense of security throughout the company.

With skilled engineers managing the devices, auto-response can be very accurate and effective. The data generated over a period of 30 days from the network of NetSolve, Incorporated (Kevin and Kinn, 2002) will be used to illustrate the accuracy of automated response. If Code Red and Nimda activities are included, the activity was malicious in 99.96% of the attacks where automated response was used to mitigate the threat. Excluding large-scale worms, the attacks were malicious in 95.8% of auto-response uses. Of the 4.2% of the traffic that was not malicious, not all of it was desirable: some of it was peer-to-peer programs, on-line gaming, chat and other undesirable traffic that triggered alarms. The percentage of traffic that was denied and business related was very small. It should be noted that many of these devices provide numerous different techniques for ensuring that very little, if any, legitimate traffic is denied through the use of automated response.

The most recent statistics (http://www.silicondefense.com/software/acbm/speed_of_snort_03_16_2001.pdf) were used to determine how effective the device is in recognizing attacks. In this test the worst NIDS detected 67 of 109 attacks, or 61.5%, while the best detected 94 of 109 attacks, for an 86.2% detection rate. Even the worst case, the 61.5% detection rate, was achieved out of the box (http://www.nss.co.uk/Articles/IntrusionDetection.htm), and it was reported that it would not be difficult to improve this with some custom signatures and tuning.

The above could be interpreted to mean that the worst IDS tested can still detect at least 61.5% of attacks. Realistically, that number should be closer to 70% when a skilled engineer or technician manages the device. The auto-response feature, when properly used, can be a very effective method of reducing the ARO. This provides some general numbers, which can be plugged into the equations for calculating a ROI for UTVE.

A case study on IDS ROI calculation modular approach

The lack of established literature on a suitable management approach that can maximize the IDS ROI mandates the use of a case study approach that permits the in-depth exploration of the benefits of performing an ROI analysis to maximize the management techniques of intrusion detection systems. From these, it is possible to glean some general concepts about IDS ROI and determine the viability of the management approach that will enhance the maximization. By developing the examples, it is also hoped to develop a possible method of reasoning about IDS ROI more generally. The case study will be presented in the context of events and risk analysis in a hypothetical company called UTVE, Inc.

Framework for risk analysis and ROI computation

In order to prepare for the IDS ROI computational model, it will be useful to set up a hypothetical company called UTVE and, through a case study,
present the threat and incidence scenarios to calculate the ROI based on the effective implementation and lifecycle management of HIDS and NIDS technologies. There is a need to articulate a holistic approach and, at the same time, to introduce some new concepts for analyzing risk. In the analytical discussion leading up to the calculation of the ROI, commonly accepted formulas and definitions associated with asset valuation, exposure, threat, vulnerability and loss expectancy will be used. The Cascading Threat Multiplier (CTM), an additional factor added to the mix, enables the expansion of the widely accepted risk assessment calculation for Single Loss Expectancy (SLE) where, traditionally, SLE = Exposure Factor (EF) x Asset Value (AV).

In order to stress the importance of the intangible considerations that will help to apply a holistic approach for quantifying risk and calculating a meaningful ROI, the concepts of goodwill and opportunity costs should be considered when performing valuation exercises on company assets. Although intangible factors inherently introduce subjectivity into risk and return analysis, it is nonetheless an important step to consider intangibles before one can arrive at a more meaningful calculation of the ROI.

It is worth mentioning here that, in general, it may be safe to assume that organizations tend to undervalue certain data assets if they have not fully taken into account (or bothered to understand, for that matter) how these assets relate to the "big picture". It is simple human nature to take the path of least resistance when given a choice. But that is a very dangerous path to take for anyone attempting to arrive at an accurate assessment of the value of the data assets residing on their network. Understanding the tangible costs and benefits of an asset is much easier than understanding, or even considering for that matter, the intangible costs and benefits associated with that same asset. Clarifying this understanding is a challenge, and one that will be addressed throughout the rest of the paper as the IDS ROI is calculated in the case study for UTVE, Inc.

Ultimately, the framework is the use of hypothetical events and data derived from such events to develop a process model for the computation of IDS ROI. The threat events and the incidence analysis are given in the context of risk analysis.

Methodology

The methodology used in the case study takes a pragmatic approach towards the issue of calculating the IDS ROI. First, the enterprise business, IT infrastructure, business relations and security practices are described. This is followed by a discussion of the threats and attacks that compromised the security of the business. In this case, a series of defined and designated attacks to compromise the system are mimicked. Each attack exploits a specific vulnerability in the enterprise network system. Next, the attacks, compromises and contributing factors are analyzed and the sources of the security breach delineated. Part of the analysis is the recommendation of the necessary safeguards to forestall future attacks, in this case the deployment of intrusion detection systems. Based on the results of the analysis, the Annual Loss Expectancy (ALE) is quantified, and the analytical techniques that are pertinent to the IDS ROI, together with their relevance in developing a model for calculating the ROI for IDS deployment in business settings, are described. The study will culminate in a discussion of the best management and most effective techniques for deploying the IDS to maximize the ROI.

The UTVE enterprises

UTVE's remote offices (Toronto, Manchester, Notting, Sony) are connected via private T1 lines to the corporate office, with no Internet outlet. Employees of the firm who require access to company data while out of the office use the VPN over the Internet.

In order to successfully conduct business with UTVE enterprises, customers and business associates need to have consistently reliable telephone, fax, e-mail, Internet, file, print and database access, whether in a remote office or in the corporate office. UTVE sales associates must also have the same system reliability and availability while remotely accessing the corporate systems over the VPN (Virtual Private Network). Remote users are primarily sales associates and trusted business associates who connect to UTVE's VPN over some sort of broadband technology. The remaining associates need VPN access while traveling, which is typically dial-up access.

VPN attack and risk analysis

The UTVE network was compromised when a malicious attacker gained access to UTVE's data and network through the VPN tunnel that was established with one of its business associates. Because of this, UTVE disconnected its VPN connection with
the business associate (ACME) and temporarily halted any business transactions with ACME until the issue is resolved.

After an extensive network and operational analysis of the incident, several key operational deficiencies were uncovered that revealed several security flaws in the UTVE network. The description of the compromised assets and underlying exposed assets is given in Appendix 2.

The general replacement cost for UTVE's asset categories is given in Table 5. It is assumed that the valuation exercise was undertaken separately, before the ROI calculations, which in itself is not the best approach. The replacement cost is the cost of acquiring an asset, as itemized in the asset category, in the event of loss.

Table 5. UTVE's asset valuation

Asset category              Replacement cost ($)
Accounts payable            816,907.00
Applications                50,000.00
Cash accounts               500.00
Communications hardware     10,000.00
Communications software     4,000.00
Databases                   15,000.00
Facilities                  500,000.00
Hardware                    120,000.00
Office equipment            300,000.00
Personnel                   5,000,000.00
Procedures                  530,000.00
Security                    100,000.00
Supplies and consumables    80,000.00
System software             10,000.00

To quantify and characterize the loss associated with the VPN attack, a quantitative risk analysis was conducted. The analysis revealed that data integrity was affected by the intrusion. The various incidents that could be associated with the intrusion at UTVE enterprise are shown in Table 6. The ALE is based on an ARO of three (3.0).

In Table 6, the SLE and ALE are presented for each incident. The ALE is generated by multiplying the SLE for the incident by the ARO of the threat. The overall ALE for a threat is the sum of the ALEs for each of the associated incidents, shown as the total of the third column. The percentage of this total represented by the ALE for each incident is indicated in the fourth column. Also shown for each threat is a bar chart that provides a visual presentation of the relative magnitudes of the ALE for each incident. Pie charts are also provided to indicate the percentage of each threat ALE that is accounted for by each incident used in its calculation.

The incidence class is the grouping of the various losses, in the event of a threat materializing, according to their form. For example, the direct loss is the loss from the associated asset, and modification is the loss in monetary value due to modification of data, application, system, etc. The concepts and definitions for ALE and SLE are given in Appendix 3 and the plots are represented in Figs. 3 and 4, respectively.
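The arithmetic behind Table 6 can be reproduced directly from the per-incident SLE values and the ARO of three; the sketch below (not part of the original paper) does so. Small differences from the published ALE and percentage figures are due to rounding of the SLE values.

# Sketch: reproducing the Table 6 figures (SLE x ARO = ALE, then percentages of the total).
ARO = 3.0
sle_by_incident = {                      # per-incident SLE, in dollars (from Table 6)
    "Direct loss, procedures": 18_145,
    "Direct loss, security": 3_596,
    "Direct loss, applications": 2_058,
    "Modification, databases": 513,
    "Direct loss, databases": 427,
    "Direct loss, system software": 416,
    "Direct loss, communications software": 166,
    "Modification, applications": 12,
    "Modification, system software": 2,
    "Modification, communications software": 1,
}

ale_by_incident = {k: sle * ARO for k, sle in sle_by_incident.items()}
total_ale = sum(ale_by_incident.values())            # overall ALE for the threat

for incident, ale in ale_by_incident.items():
    print(f"{incident:<40} ALE=${ale:>9,.0f}  ({100 * ale / total_ale:4.1f}% of total)")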

Table 6. Losses from compromised data integrity with an ARO of three (3)

Incident class                              SLE ($)    ALE ($)    % of total ALE
Direct loss, procedures                     18,145     54,435     71.6
Direct loss, security                       3,596      10,788     14.2
Direct loss, applications                   2,058      6,174      8.1
Modification, databases                     513        1,540      2.0
Direct loss, databases                      427        1,282      1.7
Direct loss, system software                416        1,248      1.6
Direct loss, communications software        166        499        0.7
Modification, applications                  12         37         0.0
Modification, system software               2          7          0.0
Modification, communications software       1          3          0.0

Figure 3. Data integrity loss - ALEs (bar chart of the per-incident ALE values listed in Table 6, in dollars).

Figure 4. Data integrity loss - SLEs (bar chart of the per-incident SLE values listed in Table 6, in dollars).
Risk mitigation

As a result of the risk analysis, the following mitigation measures were recommended:

- Guidelines should be created for the use of UTVE's remote access facilities. For example, access privileges are generally granted only to managers, team leaders, associates responsible for overnight/weekend support, and sales staff.
- Employees or others requiring remote access to the network should have the approval of senior management, as well as of the operations department.
- There should be only one method of connecting to the network from a remote location. The IT department should have the ability to turn remote access off at any time.
- Proper authentication and data encryption mechanisms (e.g. VPNs) should be put in place.
- The scope and techniques of data encryption should be expanded.
- As part of the more stringent security measures, UTVE needs to implement IDS technology to monitor the content of each connection. The recommendation is that a host-based IDS be run on the Web servers and a network-based IDS be run at the border.
- Remote access should be limited to only those needing it for business, with technical and management safeguards.

In section "A case study on IDS ROI calculation modular approach", the ROI is calculated using risk management data and IDS implementation costs (for both the single support and MSSP support schemes) at UTVE for the different IDS deployment options discussed in section "Comparison of aggregate costs of different implementation schemes".

Calculation of ROI

Procedurally, once the Asset Valuations (AV and UEA) and Exposure Factors (EF and EFs) have been calculated, the SLE and the ARO are then calculated. In general, the ARO is computed based on an analysis of the annual frequency of threats. A distinction has to be made between the two types of ARO: the site-specific ARO and the national ARO. Based on the analysis and assumptions in each scenario, the computations for AV and UEA, EF and EFs, and ARO are shown in Appendix 2.

For each threat, an ARO is derived by analyzing available national data. The derived ARO values developed from national data are not as applicable as AROs developed with site-specific data. Site-specific data are defined as information gathered directly on or from the site itself, such as the data represented in Table 7. Historically recorded data on previous threat occurrences that can generally be collected from any specific site are: maintenance logs, documentation on system operations and system failures, air-conditioning
and power failure, and component mean-time-between-failure reports, etc. The following guidelines should be followed when determining the ARO value for a given threat:

- When possible, the ARO value for a given threat is developed from site-specific/resident data. This requires gathering site-resident data as needed to calculate the mean and standard deviation for any specific threat ARO. Examples of threats that are best represented by site-specific data are: air-conditioning failures, power outages, operator errors, user input errors, system crashes, and theft.
- When it is not practical to gather the site-specific data required to calculate the ARO value for a particular threat, the standard ARO value can be used.

An example of local site statistics on the rate of occurrence of known attacks is given in Table 7. These are actual numbers based on network attacks that NetSolve, Inc. manages (Ptacek and Newsham, 1999).
Table 7. Average attack occurrences per network

Attack                    Per-network attempts (April)    Per-year attack attempts    Scenario
General Cmd.Exe           9,492                           113,904                     1
Root.exe backdoor         1,869                           22,428                      1
Ida overflow              105                             1,260                       1
SSH attacks               2                               2.0                         2
DNS bind attacks          7                               8.0                         2
FTP attacks               3                               4.0                         2
Apache chunked            6                               7.0                         2
IOS HTTP unauthorised     3                               36                          3
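The scenario 1 rows of Table 7 appear to be annualized simply by multiplying the April counts by twelve; the sketch below (an illustration, not part of the paper) reproduces that step as one way of turning site-specific observations into ARO inputs. The scenario 2 and 3 rows do not follow the same pattern and are therefore left out.

# Sketch: annualizing site-specific attack counts into candidate ARO inputs.
# The scenario 1 rows of Table 7 appear to be monthly (April) counts scaled by 12.
monthly_attempts = {            # per-network attack attempts observed in April
    "General Cmd.Exe": 9_492,
    "Root.exe backdoor": 1_869,
    "Ida overflow": 105,
}

annualized = {attack: count * 12 for attack, count in monthly_attempts.items()}
for attack, per_year in annualized.items():
    print(f"{attack:<20} {per_year:>8,} attempts/year")   # 113,904 / 22,428 / 1,260, matching Table 7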
In Appendix 3, each variable and risk equation used in the ROI calculations for the UTVE IDS deployment is itemized. A review of the Appendix shows how the traditional ROSI equation has been tied back to the ALE containing the CTM factor, i.e.

ROSI = R - ALE,

where the commonly accepted

ALE = (R - E) + T

is now replaced with

ALE = ARO x SLE, and SLE = AV x EF x CTM.

The support costs ($83,217/year for single support coverage and $44,217/year for MSSP support coverage) used to calculate these ROIs were taken

from Table 3. The results of the calculations are shown in Appendix 4. The use of the auto-response scheme produces a far better ROI in both NIDS and HIDS deployments. A conservative estimate of a 50% reduction in ARO is facilitated by the utilization of auto-response. The 25% reduction in both exposure factors (EF and EFs) should also be considered a conservative estimate in the IDS deployment with the auto-response and prompt incident response scheme.

The benefits of better IDS management are reflected in the reductions in the values of the variables in the highlighted cells under ARO, EF and EFs in Appendix 4. The overall effect is visible in the increase in the ROI values for the UTVE IDS deployment for both the single in-house support and MSSP support schemes.

Auto-response affects the primary mitigation window, which has a direct impact on partially reducing the Annual Rate of Occurrence (ARO). This is illustrated in Appendix 4, where a beneficial conservative reduction in ARO of 50% (highlighted in yellow (web version) in the "IDS w/Auto-Response" rows for each of the three scenarios) is attained. Incident response affects the secondary mitigation window, which impacts the Exposure Factor (EF) and Secondary Exposure Factor (EFs), which in turn impact the Cascading Threat Multiplier (CTM). This is also illustrated in Appendix 4, where a beneficial conservative reduction in EF and EFs of 25%, respectively (highlighted in yellow (web version) in the "IDS w/Auto-Response & Incident Response" rows for each of the three scenarios), is attained.

These reductions have positive effects on the IDS ROI. Once the aggregate annualized savings (ALE1 - ALE2 or ALE1 - ALE3) from IDS deployment exceed the support costs associated with the deployment, a positive ROI should materialize. In the case of UTVE, the two ROIs (ROI1 and ROI2) for each support profile are as follows:

- single support with IDS using auto-response (ROI1) = 4%;
- single support with IDS using auto-response and incident response (ROI2) = 36%;
- MSSP support with IDS using auto-response (ROI1) = 81%; and
- MSSP support with IDS using auto-response and incident response (ROI2) = 155%.

These ROIs are based on the aggregate annualized savings from deploying and effectively managing the IDS technology and the resulting impact the IDS technology could reasonably have on the combined effect of the three compromise scenarios described above (Appendix 4).

In the final analysis, attainment of a better ROI depends on good management practice, especially the use of highly skilled engineers or technicians who have a sound understanding of the technology, including its inherent strengths and weaknesses, to manage the IDS technology. It is also a reasonable assumption that a single in-house engineer or technician could support an IDS deployment of one NIDS and two HIDSs. On the other hand, it is unrealistic to assume that one person can support this highly dynamic technology on a continual 24/7/365 basis with active auto-response and real-time incident response for every security event. Multi-shift internal support, as well as Managed Security Service Provider (MSSP) support, is the preferred way of providing definitive 24/7/365 support and real-time incident response.

Conclusions

The importance of using intrusion detection as a means of risk management has been pointed out by several researchers. This work in ROI modeling for IDSs has benefited from the insightful analysis of real-world experiences demonstrated in the case study and draws on research in intrusion detection systems using knowledge gained from security risk management.

The contributions made by this paper are the development and introduction of a new concept, the Cascading Threat Multiplier (CTM), and the model framework used to accurately calculate the ROI for any acquired or deployed IDS technology.

To effectively analyze and calculate the IDS ROI, there is a need for a sound understanding of the environment where the IDS is deployed including, at a minimum, the business practice, network architecture and asset values. Equally, a good analysis of system vulnerabilities and associated threats should be addressed within the framework of a sound security policy and risk mitigation techniques.

Finally, this paper has demonstrated that a positive IDS ROI is attainable with an effective deployment technique and an optimal management approach.

Acknowledgements

I would like to acknowledge the following people who have contributed to this work in one manner or another: Professor Andrew Blyth, School
of Computing, University of Glamorgan, Wales, UK, for his constructive suggestions; Professor Mukesh Singhal, Department of Computer Science, University of Kentucky, Kentucky, USA, for his editorial assistance; and Kevin Timm and David Kinn, Security Engineers at NetSolve, Inc., Austin, USA, for providing some of the data in Table 7. To these people, I am extremely grateful.
Appendix 1. Threat mitigation window

Appendix 2. Calculations for asset valuations, exposure factors and annual rate of occurrence

Scenario One
Compromised asset (AV) and underlying exposed assets (UEA): NT 4.0 Web server (AV); NT Domain (UEA).
Considerations in assigning estimates for AV and UEA, EF and EFs, and ARO: Cost of lost productivity/revenue from downtime? Cost of compromised underlying data/assets? Cost of rebuilding the Web server? Potential cost of compromise of NT domain resources?
AV = $2,000; EF = 75; UEA = $20,000; EFs = 75; ARO = 3

Scenario Two
Compromised asset (AV) and underlying exposed assets (UEA): UNIX-based Web server (AV); old internal database containing inventory data/pricing for customers and suppliers (UEA).
Considerations in assigning estimates: Cost of lost productivity and revenue from downtime? Cost of loss of trust or confidence of UTVE's online customers? Cost of compromised data and assets? Cost of rebuilding the Web server? Immediate cost of fulfilling current orders to satisfy customers?
AV = $2,000; EF = 50; UEA = $50,000; EFs = 50; ARO = 2

Scenario Three
Compromised asset (AV) and underlying exposed assets (UEA): Router; primary supplier / UTVE's network.
Considerations in assigning estimates: Cost of supply interruption from the primary supplier? Cost of loss of trust or confidence of the primary supplier? Potential cost of compromised data? Cost for UTVE to replace ACME as a supplier? Difference in credit terms for a new supplier (compared to the highly favorable terms that UTVE currently enjoys with ACME)? Difference in pricing between a normal (new) supplier and ACME's pricing? Potential cost of litigation if ACME determines a UTVE employee is at fault? Cost of UTVE compromise? Potential cost of liability for attacks directed at other non-partner networks?
AV = $3,000; EF = 75; UEA = $200,000; EFs = 50; ARO = 1

Appendix 3. ROI variables and risk equations

Asset Value: AV = hardware + comm. software + proprietary software + data
Exposure Factor: EF is the estimation of the exposure of the initially compromised asset
Underlying Exposed Assets: UEA is the estimation of the $ value of the assets behind the initially compromised asset
Secondary Exposure Factor: EFs is the estimation of the exposure of the UEAs
Cascading Threat Multiplier: CTM = 1 + ((UEA x EFs) / AV)
Single Loss Expectancy: SLE = EF x AV x CTM
Annual Rate of Occurrence: ARO is an estimated number, based on available industry statistics or experience
Annual Loss Expectancy without IDS: ALE1 = SLE x ARO
Annual Loss Expectancy with IDS using auto-response: ALE2 = conservative 50% reduction of ARO when the IDS is managed skillfully with auto-response
Annual Loss Expectancy with IDS using auto-response & incident response: ALE3 = conservative 25% reduction of EF & EFs when the IDS is managed skillfully with auto-response and incident response
Annual Cost (T) of IDS Technology and Management: T
Annual Recovery Cost (R) from Intrusions without IDS: R = ALE1
Annual Dollar Savings (E) gained by stopping intrusions with IDS: E = ALE1 - (ALE2 or ALE3)
Traditional Return on Security Investment (ROSI) equation: ROSI = R - ALE, where ALE = (R - E) + T
UTVE IDS ROI with auto-response: ROI1 = ALE1 - ((ALE1 - (ALE1 - ALE2)) + T)
UTVE IDS ROI with auto-response & incident response: ROI2 = ALE1 - ((ALE1 - (ALE1 - ALE3)) + T)
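Taken together, the Appendix 3 equations can be exercised end to end, as in the sketch below (an illustration, not the Appendix 4 calculation itself). The inputs are loosely patterned on Appendix 2, scenario one, the single-support cost quoted in the text is used for T, and the 50% ARO reduction is assumed to still apply when incident response is added. With a single toy scenario the savings do not cover the support cost; the paper's positive ROIs rest on the aggregated savings across all three compromise scenarios.

# Sketch of the Appendix 3 risk equations, applied to one hypothetical scenario.
def ctm(av, uea, efs):
    return 1 + (uea * efs) / av                 # Cascading Threat Multiplier

def sle(av, ef, uea, efs):
    return ef * av * ctm(av, uea, efs)          # Single Loss Expectancy with CTM

av, ef, uea, efs, aro = 2_000, 0.75, 20_000, 0.75, 3.0
T = 83_217                                      # annual cost of IDS technology and management (single support)

ale1 = sle(av, ef, uea, efs) * aro                                  # without IDS
ale2 = sle(av, ef, uea, efs) * (aro * 0.5)                          # auto-response: ARO reduced 50%
ale3 = sle(av, ef * 0.75, uea, efs * 0.75) * (aro * 0.5)            # plus incident response: EF, EFs reduced 25%

roi1 = ale1 - ((ale1 - (ale1 - ale2)) + T)      # simplifies to ALE1 - ALE2 - T (a dollar figure)
roi2 = ale1 - ((ale1 - (ale1 - ale3)) + T)      # simplifies to ALE1 - ALE3 - T

print(f"ALE1=${ale1:,.0f}  ALE2=${ale2:,.0f}  ALE3=${ale3:,.0f}")
print(f"ROI1=${roi1:,.0f}  ROI2=${roi2:,.0f}")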

Appendix 4. IDS ROI for different management schemes



References

Available from: http://www.nss.co.uk/Articles/IntrusionDetection.htm.
Available from: http://www.silicondefense.com/software/acbm/speed_of_snort_03_16_2001.pdf.
Iheagwara C, Blyth A. Evaluation of the performance of IDS systems in a switched and distributed environment. Comput Netw 2002;39:93-112.
Iheagwara C, Blyth A, Singhal M. A comparative experimental evaluation study of intrusion detection system performance in a gigabit environment. J Comput Secur January 2003;11(1).
Irvine C, Levin T. Toward a taxonomy and costing method for security metrics. In: Proceedings of the Annual Computer Security Applications Conference, Phoenix, AZ; December 1999.
Kevin T, Kinn D. CTM. Technical Report, NetSolve, Inc., Austin, USA; 2002.
Lee W, Fan W, Miller M, Stolfo S, Zadok E. Toward cost-sensitive modeling for intrusion detection and response. North Carolina State University; 1999.
Ptacek TH, Newsham TN. Insertion, evasion, and denial of service: eluding network intrusion detection. Secure Networks, Inc.; 1999.
Richards K. Network based intrusion detection: a review of technologies. Comput Secur 1999;18:671-82.

Charles Iheagwara is the Director of IT Security Services at Una Telecom, Inc. in Lanham, Maryland, USA. In this position, he oversees all IT security projects for private and government clients. He is also one of the Principal Investigators for corporate research geared towards studies on enterprise-wide security solutions, including translating IT security academic and laboratory research into business solutions.

He is also an External Research Fellow at the School of Computing, University of Glamorgan, Wales, where he specializes in Intrusion Detection Systems research and development. He has published numerous technical and scientific papers in the field in refereed journals, conference proceedings and research newsletters.

He is also an Adjunct Professor of Computer Science in the School of Business Studies at Bowie State University, Maryland, USA, where he teaches graduate-level information assurance courses.
A Framework for Information Technology Outsourcing Risk Management

Benoit A. Aubert, HEC Montréal, CIRANO
Michel Patry, HEC Montréal, CIRANO
Suzanne Rivard, HEC Montréal, CIRANO

Abstract

This paper takes stock of several studies on Information Technology outsourcing risk. A definition of risk is offered, and an illustration from five case studies is used to show how risk can be managed. Results show that an active risk management approach can reduce risk exposure substantially while enabling the organizations to still reap the benefits associated with outsourcing.

ACM Categories: K.6
Keywords: Outsourcing, Risk, Risk Management, Information System, Contract Design

Introduction

"You'll never have all the information you need to make a decision - if you did, it would be a foregone conclusion, not a decision" (Mahoney, 1988, p. 156).

Information Technology (IT) outsourcing entails a number of decisions regarding a variety of issues, be they the choice of the activities to outsource and of those to keep in-house, the selection of a service provider, or the identification of the most appropriate way to manage an outsourcing contract. As stated by Mahoney (1988), managers never have all the information they need to decide on these issues. Yet, research on IT outsourcing has provided a number of models and tools that are aimed at decreasing the level of uncertainty or that help in dealing with the uncertainty that necessarily clouds the decision. Such is the case, for instance, of the studies that examine the characteristics of the activities that are good candidates for outsourcing and of those that should be kept in-house (Aubert et al., 1996); that compare outsourcing to insourcing, evaluating if, and how, in-house services could be reorganized to provide firms with benefits similar to those of outsourcing (Lacity & Hirschheim, 1995); or that attempt to identify those characteristics of a vendor-client partnership that will be conducive to outsourcing success (Lee & Kim, 1999).

This paper develops this strand of research in its adoption of a risk management perspective on outsourcing decisions, and in its proposition of a risk assessment framework. As shown through a series of case studies, the risk assessment framework contributes to reducing decision-making uncertainty in that its use can help to better anticipate - and sometimes alleviate - problems potentially associated with IT outsourcing, and to select contract mechanisms that are most appropriate for the types of activities to be outsourced.

Acknowledgement

The authors are grateful to the anonymous reviewers for their thorough comments and most useful suggestions. This research was supported by Fonds FCAR (Canada). An earlier (abridged) version of this paper was published as "Managing IT Outsourcing Risk: Lessons Learned" in Information Systems Outsourcing: Enduring Themes, Emergent Patterns and Future Directions, Hirschheim, R.A., Heinzl, A., and Dibbern, J. (eds.), Springer-Verlag, Berlin, Heidelberg, New York, 2002:155-176.
We begin by drawing upon past research to provide a synthesis of the main lessons of IT outsourcing risk and risk management. The paper first presents various definitions of risk found in the literature, discusses the definition adopted here, and applies it to the context of IT outsourcing risk. Drawing from the IT outsourcing, Transaction Cost, and Agency Theory literature, it then presents a framework of IT outsourcing risk exposure. Finally, using five case studies, key elements pertaining to outsourcing risk and risk management are discussed.

Risk Defined

Risk and risk management have been studied in a variety of domains, such as Insurance, Economics, Management, Medicine, Operations Research, and Engineering. Each field addresses risk in a fashion relevant to its object of analysis and hence adopts a particular perspective. Since it is essential that the conceptualizations of risk and of risk management adopted in a given study be consistent, authors ought to clearly state their perspective. This section reviews the main conceptualizations of risk and of risk management found in various fields, and then presents the perspective we opted for in our research on IT outsourcing risk and risk management. Subsequently, the key concepts from Transaction Cost Theory and Agency Theory relevant to the assessment of IT outsourcing risk are introduced, followed by a presentation of the risk assessment framework we propose.

Risk as an Undesirable Event

In some situations, risk is equated to a possible negative event. Levin and Schneider (1997) define risks as "... events that, if they occur, represent a material threat to an entity's fortune" (p. 38). Using this definition, risks are the multiple undesirable events that may occur. Applied in a management context, the "entity" would be the organization. Given this perspective, risks can be managed using insurance, therefore compensating the entity if the event occurs; they can also be managed using contingency planning, thus providing a path to follow if an undesirable event occurs. This definition of risk is analogous to the concept of risk as a possible reduction of utility discussed by Arrow (1983).

Risk as a Probability Function

Some fields, instead of focusing on negative events, are primarily concerned with the probability of occurrence of an event. For example, medicine often focuses solely on the probability of a disease's occurrence (e.g., heart attack), since the negative consequence is death in many cases. It would be useless to focus on the consequence itself since it is irreversible. Odds of occurrence are the key element. Data are used to determine which factors can influence those probabilities (heredity, smoking habits, cholesterol level, etc.). In its definition of sentinel events (occurrences involving death or serious injury), the Joint Commission on the Accreditation of Healthcare Organizations uses "risk" as "the chance of serious adverse outcome" (Kobs, 1998). Life insurance adopts this approach and uses mortality tables to estimate probabilities. In this context, a "good risk" will be a person with a low probability of dying within a given period (and hence, for the insurance company, a low probability of having to pay a compensation) and a "bad risk" would be a person with a high probability of dying within the period.

Risk as Variance

Finance adopts a different perspective of risk, where risk is equated to the variance of the distribution of outcomes. The extent of the variability in results (whether positive or negative) is the measure of risk. Risk is defined here as the volatility of a portfolio's value (Levine, 2000). Risk management means arbitrating between risk and returns. For a given rate of return, managers will prefer lower volatility but would be likely to tolerate higher volatility if the expected return was thought to be superior. Portfolio managers therefore aim to build a portfolio that is on the efficient frontier, meaning it has "the highest expected return for a given level of risk, and the lowest level of risk for a given expected return" (Schirripa & Tecotzky, 2000, p. 30).

Risk as Expected Loss

Other fields, such as casualty insurance, adopt a perspective of risk as expected loss. They define risk as the product of two functions: a loss function and a probability function. Car insurance is a good example. In the eventuality of an accident, there is a loss function that represents the extent of the damage to the car, which can range from very little damage to the total loss of the car. There is also a probability function that represents the odds that an incident will occur. The expected loss (risk) is the product of these two functions (Bowers et al., 1986).

Endogenous and Exogenous Risk

Another important distinction in risk analysis is the notion of endogenous versus exogenous risk. Exogenous risks are risks over which we have no control and which are not affected by our actions. Earthquakes or hurricanes are good examples of exogenous risks. Although we have some control over the extent of damage by selecting construction standards, we have no control over the occurrence of
such natural events. Endogenous risks, on the other hand, are risks that are dependent on our actions. A car accident is an example of risk where a strong portion is endogenous. While a driver has no control over other drivers (the exogenous portion), the probability of an accident is strongly influenced by the driver's behavior and ability (endogenous). The driver also controls part of the loss function, by deciding to drive an expensive car or a cheap car. This could explain why there is always a deductible amount with car insurance, to ensure that the driver will behave in a way that will minimize the endogenous portion of the risk. By being made responsible for a portion of the damages, the driver is enticed to act with caution.

Risk management tools take into account whether risk is endogenous or exogenous. In finance, for example, risk is considered exogenous. The methods used to manage risk are concerned with diversification, insurance, and allocation of assets. There is no direct action that managers can take to reduce the probability of a given event. In engineering or medicine, a portion of the risk is always endogenous. Risk management takes this into account. Patients are informed of the portion they control and are offered healthier diets and lifestyles; employees are provided with security guidelines and actions are taken to reduce directly the probability of undesirable consequences.

IT Outsourcing Risk Exposure

In their study of managerial perspectives on risk and risk taking, March and Shapira (1987) posit that managers do not equate the risk of an alternative with the variance of the probability distribution of possible outcomes, and they do not treat uncertainty about positive outcomes as an important aspect of risk. Rather, to them, the potential positive outcomes constitute the attractiveness of an alternative, while risk is associated with its negative outcomes. That is, risk is perceived as a "danger or hazard". March and Shapira also emphasize the fact that, to managers, the magnitude of the loss due to a negative outcome is salient.

In order to take into account these two aspects of the managerial perspective, we adopt the notion of risk exposure, which is defined as a function of the probability of a negative outcome and the importance of the loss due to the occurrence of this outcome:

RE = Σi P(UOi) * L(UOi)

where P(UOi) is the probability of an undesirable outcome i, and L(UOi) is the loss due to the undesirable outcome i (Boehm, 1991; Teece et al., 1994). Therefore, we consider simultaneously the potential losses associated with an outsourcing contract and the probability function of such losses.

While, theoretically, risk exposure can be computed and a value of risk established in dollar terms, in practice it is more useful to map the risk exposure on a two-dimensional plane (the usefulness of this representation will be discussed in the risk management sub-section). Therefore, the loss associated with a given undesirable outcome is evaluated and the likelihood of realization of the outcome is estimated. Instead of multiplying the two, a point is mapped on a plane where the likelihood of realization of an undesirable outcome is measured along the x-axis and the magnitude of the loss if that outcome occurs is measured along the y-axis. Figure 1 illustrates the mapping of risk exposure for three undesirable outcomes (UO).
Figure 1. Risk Exposure Map (y-axis: magnitude of loss due to UOi; x-axis: probability of undesirable outcome i (UOi); quadrants labeled Low Risk, Medium Risk, and High Risk, with risk exposure points plotted for UO1, UO2 and UO3).
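As a concrete illustration of the risk exposure calculation and the Figure 1 mapping, the short sketch below computes RE for three hypothetical undesirable outcomes and assigns each to a quadrant. The outcomes, probabilities, losses and quadrant cut-offs are invented for illustration and are not taken from the case studies.

# Sketch: risk exposure RE = sum of P(UOi) * L(UOi), plus a Figure 1 style quadrant mapping.
# The undesirable outcomes, probabilities and losses below are hypothetical illustrations.
undesirable_outcomes = {
    "UO1: costly contractual amendments": (0.30, 150_000),   # (probability, loss in $)
    "UO2: service debasement":            (0.70, 900_000),
    "UO3: lock-in / switching costs":     (0.15, 650_000),
}

risk_exposure = sum(p * loss for p, loss in undesirable_outcomes.values())
print(f"Total risk exposure: ${risk_exposure:,.0f}")

def quadrant(p, loss, p_cut=0.5, loss_cut=500_000):
    """Map a (probability, loss) pair to the Figure 1 quadrants; the cut-offs are arbitrary here."""
    if p >= p_cut and loss >= loss_cut:
        return "High Risk"
    if p < p_cut and loss < loss_cut:
        return "Low Risk"
    return "Medium Risk"

for name, (p, loss) in undesirable_outcomes.items():
    print(f"{name:<38} P={p:.2f}  L=${loss:>9,}  -> {quadrant(p, loss)}")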
To evaluate the risk exposure for a given outsourcing contract, it is therefore essential to identify the array of potential undesirable outcomes that could occur with respect to the outsourcing arrangement, the magnitude of the losses incurred should each of the undesirable outcomes materialize, as well as the probability of occurrence of such outcomes. In any situation, several undesirable outcomes may occur. The magnitude of loss due to a given undesirable outcome can be approximated either via quantitative analysis (for instance, by evaluating the sales lost due to disruption of service to customers) or via qualitative assessment of the organizational impact of each negative outcome (by using Likert scales to assess the importance of the impact of the undesirable outcome).

While in certain circumstances the probability of occurrence of an undesirable outcome can be estimated on the basis of past performance characteristics of the object under study (Linerooth-Bayer & Wahlstrom, 1991), in several areas probabilities are often difficult, if not impossible, to assess on the basis of past performance (Barki et al., 1993). Consequently, several risk assessment methods adopt the approach of approximating the probability of undesirable outcomes by identifying and assessing factors that influence their occurrence (Anderson & Narasimhan, 1979; Boehm, 1991; Barki et al., 1993). In a software development context, for instance, Barki et al. have identified such factors, which belong to five broad categories: technological newness, application size, software development team's lack of expertise, application complexity, and organizational environment. The degree to which each factor is present in a software project will contribute to the increased probability of the occurrence of an undesirable outcome (here, project failure). Once this list is drawn, risk management methods try to simultaneously reduce the loss related to the undesirable event itself (such as penalties compensating for delays in the system delivery) and reduce the probability of occurrence of such an event, by reducing the level of the risk factors (for example, by carefully selecting team members). While the definition of risk is not explicit about probability distribution, these probabilities (taking the form of factors) are taken into account when the risk evaluation is performed. Therefore, risk factors can be seen as the drivers of undesirable outcomes, and P(UOi) = f(RFj), where the probability of an undesirable outcome i is a function of its influencing risk factors j.

Once this list of factors is drawn and assessed, managers try to reduce the probability of occurrence of an undesirable outcome by reducing the level of the risk factors. For example, when a person quits smoking, the person knows she is reducing the probability of illness, even if the exact impact on the probability is unknown to her.
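The paper leaves the functional form of f in P(UOi) = f(RFj) open. Purely as an illustration, and not as the authors' method, the sketch below turns Likert-scale assessments of a few Table 1 risk factors into a rough probability estimate using a normalized weighted average.

# Illustrative only: one way to turn risk factor assessments into a rough P(UOi).
# The paper does not prescribe a functional form for f; a normalized weighted average
# of 1-5 Likert ratings is used here purely as an example.
def probability_from_factors(ratings, weights=None):
    """ratings: risk factor levels on a 1-5 Likert scale; returns a value between 0 and 1."""
    factors = list(ratings.values())
    weights = weights or [1.0] * len(factors)
    weighted = sum(w * r for w, r in zip(weights, factors)) / sum(weights)
    return (weighted - 1) / 4          # map the 1-5 scale onto 0-1

# Hypothetical assessment for the outcome "cost escalation" (factor names from Table 1).
ratings = {
    "lack of client experience with contract management": 4,
    "measurement problems": 3,
    "lack of supplier experience with the activity": 2,
}
print(f"Estimated P(cost escalation) = {probability_from_factors(ratings):.2f}")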
The risk assessment framework proposed here relies mostly on Transaction Cost and Agency theories. These economic theories tackle directly the problems related to contracting and provide both a roadmap to potential undesirable outcomes and their corresponding drivers.

Reference Theories

In order to determine the list of undesirable events and their associated risk factors, a first group of elements was deduced from Transaction Cost and Agency Theories. These undesirable events and their corresponding factors are presented in the following section. All these elements are later summarized in Table 1 (Components of IT outsourcing risk exposure). In addition to this first list extracted from theory, a few items were identified by reviewing other (scientific and practitioner) sources describing outsourcing consequences. These are also included in Table 1.

Agency Theory and Transaction Cost Theory

Fundamentally, outsourcing is a contract in which a client relies on a supplier for a given service, instead of depending on internal provision. With outsourcing, the client relies on the market rather than on employment contracts. In a principal-agent framework, the client is the principal, while the supplier is the agent, performing a series of tasks for the principal.

Agency Theory. The presence of private information lies at the root of opportunism. Any information possessed by one party that is not verifiable by the other party is "private." When private information and conflicting interests are joined, agency problems develop. In the absence of complete contracts, the parties will try to reduce the importance of agency problems by a better alignment of objectives or by reducing the asymmetry (Holmstrom, 1979).
Table 1. Components of IT outsourcing risk exposure (adapted from Aubert et al., 2001, Table 1, p. 2)

Unexpected transition and management costs (Cross, 1995; Earl, 1996; Nelson et al., 1996). Factors leading to outcome:
- Lack of experience and expertise of the client with the activity (Earl, 1996; Lacity et al., 1995; Sappington, 1991)
- Lack of experience of the client with outsourcing (Earl, 1996)
- Uncertainty about the legal environment

Switching costs, including lock-in, repatriation, and transfer to another supplier (O'Leary, 1990). Factors leading to outcome:
- Asset specificity (Williamson, 1985)
- Small number of suppliers (Nam et al., 1996)
- Scope
- Interdependence of activities (Langlois & Robertson, 1992)

Costly contractual amendments (Earl, 1996). Factors leading to outcome:
- Uncertainty (Alchian & Demsetz, 1972; Barzel, 1982)
- Technological discontinuity (Lacity et al., 1995)
- Task complexity

Disputes and litigation (Aubert et al., 1999a; Lacity & Hirschheim, 1993). Factors leading to outcome:
- Measurement problems (Alchian & Demsetz, 1972; Barzel, 1982)
- Lack of experience and expertise of the client and/or of the supplier with outsourcing contracts (Earl, 1996; Lacity et al., 1995)
- Uncertainty about the legal environment
- Poor cultural fit

Service debasement (Lacity & Hirschheim, 1993). Factors leading to outcome:
- Interdependence of activities (Aubert et al., 1997; Langlois & Robertson, 1992)
- Lack of experience and expertise of the supplier with the activity (Earl, 1996)
- Supplier size (Earl, 1996)
- Supplier financial instability (Earl, 1996)
- Measurement problems (Alchian & Demsetz, 1972; Barzel, 1982)
- Task complexity

Cost escalation (Lacity & Hirschheim, 1993; Lacity et al., 1995). Factors leading to outcome:
- Lack of experience and expertise of the client with contract management (Earl, 1996; Lacity et al., 1995)
- Measurement problems (Alchian & Demsetz, 1972; Barzel, 1982)
- Lack of experience and expertise of the supplier with the activity (Earl, 1996)

Loss of organizational competency (Dorn, 1989; Earl, 1996; Lacity et al., 1995). Factors leading to outcome:
- Scope of the activities
- Proximity to the core competency (Prahalad & Hamel, 1990)
- Interdependence of activities (Langlois & Robertson, 1992)

Hidden service costs (Lacity & Hirschheim, 1993). Factors leading to outcome:
- Complexity of the activities
- Measurement problems (Alchian & Demsetz, 1972)
- Uncertainty (Barzel, 1982)
Agency Theory is concerned with the client's problem of choosing an agent (the supplier, in our case), and of motivating and coordinating the agent's decisions and behavior with those of the organization, under the constraint of information asymmetry. In addition, the agent is presumed risk averse, and thus unwilling to be rewarded solely on his performance (Nilakant & Rao, 1994).

Agency Theory generally distinguishes three main villains: moral hazard, adverse selection, and imperfect commitment (Aubert et al., 2003). Moral hazard stems from the fact that it is impossible for a principal to observe an agent's behavior at no cost. Since the client cannot tell, and since the supplier knows this, the supplier can always blame poor performance on circumstances beyond its control. Cheating, shirking, free-riding, cost padding, exploiting a partner, or simply being negligent are everyday instances of moral hazard. Adverse selection will develop when the principal cannot observe the characteristics of the agent. Failure to deal adequately with adverse selection will make it very difficult for the client to choose the right supplier. The last potentially damaging manifestation of opportunism is imperfect commitment. For instance, clients and suppliers may
be tempted to renege on their promises and commitments, arguing unforeseen events like changes in requirements (costly contractual amendments) (Sappington, 1991).

The client wants the supplier to perform its tasks as required. However, writing and enforcing complete contracts is a utopia. The agency costs include the cost of writing and enforcing contractual agreements and the residual loss resulting from inadequate coordination or motivation. Agency Theory tackles the important issue of designing efficient contractual agreements (Eisenhardt, 1989).

One key negative outcome suggested by Agency Theory is the management costs associated with the contract. These management costs will be caused by information asymmetry. The asymmetry (lack of knowledge on the part of the client) can be estimated by evaluating the experience and expertise of the client with the activities to be outsourced. Experienced and knowledgeable clients will suffer from less asymmetry than naïve ones.

Agency Theory also suggests that measurement problems will facilitate the manifestation of moral hazard. This would lead to cost escalation and hidden service costs. High uncertainty and complexity would increase the probability of the appearance of hidden service costs, since they make cost assessment more difficult.

Agency Theory is concerned with the monitoring abilities of the principal. These abilities can be acquired through experience. As described by Sappington (1991), principals can compare multiple contracts and can learn about performance over time. This suggests that lack of experience with outsourcing would be a risk factor. It would increase the likelihood of cost escalation and management costs.

Transaction Cost Theory. According to Transaction Cost Theory, the market and the internal organization of a firm are seen as alternative mechanisms to regulate a transaction (Coase, 1937). A party will select the mechanism which costs less: total costs include production and transaction costs. One of the advantages of using the market is that it often provides for lower production costs because of economies of scale and scope. On the other hand, using the market entails certain transaction costs: finding the appropriate prices, inquiring about the quality of the other party, negotiating, establishing guarantees and bonds, etc. These activities are required because Transaction Cost Theory recognizes that humans have bounded rationality and are opportunistic (Williamson, 1985). This implies that parties will conclude bargains with imperfect information and will try to take advantage of any asymmetry. The potential magnitude of the transaction costs will depend on the specificity of the assets, the uncertainty surrounding the transaction, and the frequency of the transaction.

Asset specificity refers to the degree to which an asset can be redeployed without sacrificing its productive value if the contract is to be interrupted or prematurely terminated (Williamson, 1985). Because the "next best use" value of a specific asset is much lower, the investor would lose part of its investment if the transaction were not completed. The specificity of an asset creates a lock-in situation where a party could extract a quasi-rent from the contracting party by threatening to withdraw from the transaction (Teece, 1986).

Asset specificity is therefore a risk factor. When an outsourcing contract is signed, investments in specific assets will tie the client and the supplier together. This creates a situation where one or very few suppliers can provide service to the client. Small-number bargaining is also a risk factor. Contracts of a large scope would also increase lock-in. The larger the scope of the contract, the more difficult is the transfer to a second supplier. These factors can lead to high switching costs because of the lock-in created between the two parties.

A concept akin to asset specificity is the level of interdependency of an activity with the other activities of the firm. The more interdependent the activities, the higher the switching costs (Langlois & Robertson, 1992). The interdependent character of the activities is also linked to the quality of service.

The usual manner in which asset specificity constraints are resolved calls for the use of long-term contracts (Joskow, 1987). However, uncertainty may preclude the implementation of long-term contracts at a reasonable cost. For a market to be efficient, parties must be able to predict with enough certainty the activities to be performed in a contract and to measure the value of the elements exchanged. This often proves false. Transactions are conducted with a certain level of uncertainty and are subject to measurement problems (Barzel, 1982). Uncertainty may preclude contract agreement since the parties cannot predict what will be needed in the future and, consequently, cannot write a contract (Williamson, 1985; 1989). This is a serious limit to the use of long-term contracts (in order to resolve asset specificity constraints).

Uncertainty about future needs is a risk factor. An indicator of the level of uncertainty is the complexity of the task (Aubert et al., 1996). High uncertainty will force parties to renegotiate the contract when changes occur. These costly changes are an undesirable outcome.
According to Perry (1989), a contract is incomplete if “it fails to specify performance obligations for the parties in all states of nature, or fails to specify the nature of the performance itself” (p. 221). Most, if not all, IT contracts will have some level of incompleteness since nobody can foresee all possible states of nature. Reducing the level of incompleteness implies refining the contract. This is costly. Crocker and Masten (1991) even show that there is an optimal level of incompleteness for a contract.

Therefore, the presence of measurement problems is another risk factor. These measurement problems can lead to potential haggling, disputes, and eventually litigation. Severe measurement problems might also prevent contractual agreement when it becomes impossible to know if performance is attributable to one party's action or to externalities (Alchian & Demsetz, 1972). When it is impossible to measure the marginal contributions of each party to a transaction, trying to compensate the workers according to their individual productivity is futile (Alchian & Demsetz, 1972). This problem is often settled, in an incomplete contract, by substituting measurement of effort for measurement of output (Cheung, 1983). This suggests that measurement problems can also lead to inferior performance (service debasement).

When transaction costs are too high, it might be cheaper to purchase the residual rights over the activities in exchange for a salary (Grossman & Hart, 1986). This contract enables one party (the employer) to choose, in the future, the actions appropriate to the context (Simon, 1991).

Finally, in some circumstances, a firm will prefer to bear the cost of the risk associated with specific investment or uncertainty rather than to invest in order to internalize a single transaction. Internal organization is efficient only for frequent transactions (Williamson, 1985). The design and setting-up costs of a governance structure are fixed costs: only recurring or particularly significant transactions will make bearing these costs economically sound.

Undesirable outcomes. Agency (AT) and Transaction Cost (TCT) Theories share similar theoretical foundations. To explain contractual arrangements, they both include elements like imperfect measurement, bounded rationality, opportunism, and cheating and shirking behaviors. However, they differ in several ways. For instance, TCT explicitly considers the role of irreversible investments in the transaction. AT considers the risk aversion of the agents. TCT is more concerned with the institutional arrangement (market or firm) while AT looks at the governance mechanisms used, independently of the institutional setting. The extension and refinement of incentive considerations in TCT (which AT offers) provides a better institutional model (see Hennart, 1993 for an instructive discussion). Taken together, AT and TCT suggest seven undesirable outcomes (summarized in Table 1, along with their associated risk factors). First, both theories recognize that no contract is complete or perfect and that unforeseen events or naïve actions can lead to unexpected transition and management costs. Both also recognize that incompleteness can eventually lead to costly contractual amendments. TCT specifically addresses the problem of specific assets, small number bargaining, and the associated switching costs problems. TCT also gives special attention to disputes and litigation. Haggling, legally or administratively, increases transaction costs. Both theories moved away from the optimization premises of classical economics and recognize that there will be inefficiencies in production, translated as service debasement in the context of outsourcing. Agency Theory, with its special attention on moral hazard, puts more emphasis on cost escalation problems.

Other Elements

Agency and Transaction Cost Theories enabled the identification of seven undesirable outcomes: unexpected transition and management costs, switching costs, costly contractual amendments, disputes and litigation, service debasement, cost escalation, and hidden service costs. Their main associated factors were also identified.

Once these factors and undesirable outcomes were established, the outsourcing literature describing outsourcing outcomes and risks was reviewed to identify the elements that might not be included in the Transaction Cost and Agency Theories. This review also provided support for the stylized negative outcomes suggested by Agency and Transaction Cost Theories. Moreover, it helped us identify new elements and new links between factors and outcomes.

One negative outcome that is not discussed in Transaction Cost and Agency Theories is the loss of organizational competency. Outsourcing often means departing from knowledge possessed by the employees transferred to the supplier. Essential skills can be lost if outsourced activities are too close to the core business of the firm (Prahalad & Hamel, 1990). Because of a reliance on external provision, the organization will stop nurturing some skills. When these are close to the core competency of the organization, such loss might threaten future organization action (Roy & Aubert, 2000). The larger the scope of the contract, and the more interdependent these activities are with the other activities of the firm, the higher the likelihood of a loss of organizational competency.

The review of literature unearthed other factors (not predicted by Transaction Cost and Agency Theories) associated with the undesirable outcomes. Quality of service is reputed to be highly dependent on the supplier's characteristics (size, stability and expertise) (Earl, 1996). The lack of experience and expertise of the supplier is also associated with cost escalation.

There are also specific types of uncertainty, distinct from requirement uncertainty. Uncertainty about the legal environment can lead to unexpected management costs (Cross, 1995), while technological discontinuities can entail costly contractual amendments (Lacity & Hirschheim, 1993). Finally, the lack of experience of the partners with outsourcing and the lack of cultural fit are associated with higher chances of disputes.

We group the risk factors into three main categories: those pertaining to the principal, to the agent, and to the transaction itself. The undesirable outcomes, with their associated factors, are summarized in Table 1. The outcomes are presented with the factors having the strongest effect on the likelihood of each outcome. This does not imply that other factors cannot have an influence on a given outcome; it simply means that these are the critical factors. Negative outcomes suggested in Agency and Transaction Cost Theories receive support through examples in the outsourcing literature.

Research Method

In order to explore the usefulness of the proposed risk assessment framework, illustrated in Figure 1 and detailed in Table 1, data from five case studies were used.

Case Selection

Case selection was based on site availability and potential contribution to the research project. Cases were therefore selected for the new insights they could provide on risk assessment and risk management and there is no claim that they are representative of an industry or organizations in general. All cases describe large outsourcing contracts with significant risks for the client organization. Some were successful, some were not. Some were well-managed, some were not. Case 1 was self-selected. Indeed, the IS manager of an insurance company whom we approached in the context of a large-scale survey invited us to interview him in order to obtain a detailed and complete description of an outsourcing deal gone awry. Since it is seldom that failures can be documented, the case was considered as a precious source of data for our study of IT outsourcing risk management. Case 2's peculiarity is that it offered the possibility to compare, within the same firm, two different system development projects that were considered candidates for outsourcing. Case 3's unique contribution is the proximity to core competency of the activity supported by the application whose development and maintenance were considered for outsourcing. Case 4 was selected because the firm was known to have a long IT outsourcing experience, including that of contract renewal. It was thought that studying successive contracts would add to our understanding of risk assessment and risk management. Finally, Case 5's distinguishing feature was the fact that this large firm had concluded contracts with three different suppliers. Finding out how these contracts were designed and how this multiple sourcing arrangement played a role in outsourcing risk management was the main motivation for selecting the case.

Data Collection

In addition to presenting the components of IT outsourcing risk, the case studies were aimed at improving our understanding of how risk was managed. Risk management is defined here as the use of different mechanisms to help reduce the level of risk exposure. A given mechanism can either reduce the losses associated with the undesirable outcomes, lower the expected probability of occurrence of such outcomes, or achieve both at the same time. Consequently, rich data were sought not only on the components of IT outsourcing risk, but also on the risk management mechanisms that had been used in each case.

Several data collection means were used. For Cases 1 and 5, each organization's leading information systems manager acted as the key informant. Data collection was done on site, starting with semi-structured interviews with the IS manager. These interviews enabled the researchers to gather substantial information and get an overview of the situation. Available documentation was collected. After the first analysis of this information was completed, a series of written questions was sent to each respondent to follow up on queries raised during the analysis of the first interview. Respondents returned their written answers when it was convenient for them. Later, phone interviews were conducted with the respondents to gather any details that were still missing in the case descriptions. Complete case descriptions were written and were submitted to the IS managers for approval. Data collection for the other three cases involved several respondents, from over half a dozen in Cases 2 and 3 to approximately 15 in Case 4. These respondents were managers who had been involved in the decision-making process, and managers in charge of overseeing the contracts. In Case 4, some user managers were also interviewed. Numerous documents were made available to the researchers,
including requests for proposals, supplier proposals, contracts and addenda, materials from presentations, and audit reports. In these three cases, the contents of write-ups were also approved by some – or all – of the interviewees.

Data Analysis

On the basis of these data, each researcher independently rated the risk factors and the magnitude of the potential loss associated with each of the undesirable outcomes listed in Table 1. The impact of each outcome was assessed on a 1 (very low) to 7 (very high) scale. The probability of occurrence of each outcome was estimated (on a 1 to 7 scale) by first evaluating each of the risk factors associated with the outcome and then by averaging the values of these factors. As well, risk management mechanisms were identified, where they had been used, and their impact on either the probability or the severity of the occurrence of an undesirable outcome was assessed. While perceptual measures were used, the researchers attempted to substantiate each score with facts, taken from the case descriptions. As well, items from formal measures that had been validated in a former study were used to identify the dimensions of each factor that had to be taken into account in the assessment (e.g., measures from Aubert et al., 1996b). The individual assessments were then discussed among the researchers, and a consensus was reached. A risk exposure map, similar to that of Figure 1, was then drawn. This risk exposure map summarizes the results of the assessment, since it plots, for each undesirable outcome, the estimated probability (i.e., the mean value of the risk factors related to this undesirable outcome, as per Table 1) and the magnitude of loss, should the undesirable outcome occur. A detailed write-up of this final assessment was done as in Cases 2, 3 and 4. In the first two cases, the write-up was formally presented to most of the managers who had participated in the interviews. In Case 4, it was submitted to three of these managers. In all three cases, managers agreed with the researchers' assessments. Only in Case 2 were the managers surprised that the degree of risk they had perceived through “gut feeling” was different from what was presented to them. As will be discussed in the results section, even in this case, managers agreed with the individual assessments that had been made of each risk factor and each potential loss.
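
Read as a procedure, the assessment described above can be expressed in a few lines of code. The sketch below is a minimal illustration in Python, not code from the study: the ratings of the risk factors are averaged into the probability estimate, the impact is rated on the same 1 (very low) to 7 (very high) scale, and a risk management mechanism lowers one or both values to give the residual exposure. Summarizing exposure as the product of the two ratings is an assumption made here for illustration only; the researchers plotted the two values on the risk exposure map rather than combining them, and the outcome, ratings and reduction amounts below are invented.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Outcome:
    name: str
    factor_ratings: list[float]   # one 1-7 rating per associated risk factor
    impact: float                 # magnitude of loss, 1-7

    @property
    def probability_proxy(self) -> float:
        # The estimated "probability" is the mean of the factor ratings.
        return mean(self.factor_ratings)

def exposure(outcome: Outcome) -> float:
    # One way to summarize the two coordinates of the map in a single number;
    # the study itself plots probability against magnitude instead.
    return outcome.probability_proxy * outcome.impact

def apply_mechanism(outcome: Outcome, prob_reduction: float = 0.0,
                    impact_reduction: float = 0.0) -> Outcome:
    # A mechanism may lower the probability proxy, the impact, or both;
    # ratings are kept on the 1-7 scale.
    residual_factors = [max(1.0, r - prob_reduction) for r in outcome.factor_ratings]
    residual_impact = max(1.0, outcome.impact - impact_reduction)
    return Outcome(outcome.name, residual_factors, residual_impact)

if __name__ == "__main__":
    lock_in = Outcome("lock-in", factor_ratings=[6, 5, 7], impact=6)
    print(exposure(lock_in))                               # exposure before any mechanism
    residual = apply_mechanism(lock_in, prob_reduction=2)  # e.g., adding a second supplier
    print(exposure(residual))                              # residual exposure

On a 1 to 7 scale the product ranges from 1 to 49, which is enough to order outcomes and to draw the arrow from the initial point to the residual point on the map.
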
The Five Cases

The cases illustrate five different dimensions of IT outsourcing risk management. The first case illustrates problems and stresses the importance of managing different activities in different ways. The second case shows the importance of using a formal measure of risk and of explicitly assessing risk. The third illustrates how risk evaluation can influence the supplier selection process. The fourth exemplifies how risk management is the result of a series of compromises, since reducing the impact of a risk factor often increases the impact of another one. Finally, the fifth case is an epitome of risk management; it is an example in which the client had the resources and the knowledge to negotiate a contract with an impressive array of risk management mechanisms.

Each case is presented as follows. First a short description of the case's salient features is provided, referring to the case risk exposure map. As well, when such mechanisms were applied, the effect of risk management mechanisms is illustrated. This is done using arrows that originate at the exposure level, for a given undesirable outcome prior to the use of the risk management mechanism; and end at the residual level of risk exposure.

Case 1: Risk Is Complex and Should Be Managed Accordingly

INS is a North-American insurance company. After evaluating that its information systems needed a major upgrade, INS turned to the market to find a supplier for its needs. It selected a software package produced by VND, a software vendor not established in North America, who also offered to manage INS' data centre. VND saw this contract as an opportunity to enter the North American market. Both parties anticipated a win-win association. The contract involved few performance measures for IS operations. In the case of the software package, which required some tailoring, a general agreement was reached in which 80% of the functionalities in a complete online system would be delivered for a fixed price, and the remaining functionalities would be delivered later for a price to be negotiated. The contract was thus largely incomplete and made room from the outset for further negotiations between the parties.

In this case, the undesirable outcome with the largest magnitude of potential loss was that of reduced quality. Since all activities in the insurance business rely on information systems, any deficiency would be very costly. As one respondent said: “If we cannot configure a product on the system, we cannot sell it.” The three other severe potential losses were cost escalation, contractual amendments, and lock-in. The insurance market is highly competitive and increases in IT costs would harm the competitiveness of INS' insurance rates.

Figure 2. Insurance Company

As shown in Figure 2, the intensity of risk exposure for the software tailoring component of the contract was different from that of the risk exposure associated with the day-to-day operations (although they were managed through the same contract and the same governance mechanisms). For software tailoring, the risk exposure was generally higher, mainly due to the value of the risk factors. Client and supplier had limited experience with outsourcing. Activities were not unduly complex, but software tailoring was subject to severe measurement problems (the exact elements to be delivered were vaguely defined at best). Operations were easier to measure, and they were also subject to less uncertainty than software tailoring. While there were numerous suppliers willing to operate the data centre, only a few had the knowledge to develop and implement the software.

Outsourcing of the data centre did not provide the anticipated benefits. No gains were realized on the operation, and performance deteriorated. VND began to haggle over the definitions of the service level clauses. On the software tailoring side, client and supplier began arguing constantly about what constituted the basic system and the extra functionalities, what was to be delivered for the fixed price, and what might be developed for an additional compensation. After long negotiations, the contract was finally terminated.

Insight. In this case, no formal risk analysis was done before signing the agreement. Ex post evaluation of the contract reveals some key elements. First, potential problems can arise even when suppliers and clients behave in good faith at the time of contract signature. Risk analysis could have helped anticipate some of these problems. Second, such an analysis would have revealed that operations and software tailoring had very different characteristics and should have been managed differently. More measures should have been defined for the IS operations. As for software tailoring, richer and more flexible mechanisms, like the exchange of information and of employees, for example, should have been established in order to reduce information asymmetry. Finally it might have been advantageous to outsource each group of activities to different suppliers to introduce a third party opinion on each supplier's work, thus reducing the risk of lock-in and instilling a measure of competition.

Case 2: Managers' Attitudes Toward Risk

The second case is that of GVDL, a large insurance company. Two system development outsourcing decisions and the resulting contracts were analyzed (Aubert et al., 1999a). The first project was the Y2K conversion of the legacy system. The effort required for migrating all the systems through the millennium was estimated at more than 25,000 person-days. The second project was called the Application development partnership project. The client had decided to stop awarding contracts to many different suppliers and had chosen to select a single (or
maybe a few) application partners that would invest time and resources in understanding the company and its needs. The results of the risk assessment for these two outsourcing decisions are presented in Figure 3 (arrows represent reduction of risk level through contract mechanisms).

Figure 3. GVDL

From Figure 3, it seems quite clear that project 1 (Y2K) was less risky than project 2. While many potential undesirable outcomes (service debasement, lock-in and cost escalation) had high values for project 1, the probabilities were generally very low, with the exception of cost escalation, which was fairly probable. Project 2 was riskier. Items 3, 6, and 7 have mid-range values on the factor axis (probabilities), and the losses associated with both lock-in and contractual amendments would be very high. As a result, the organization took several measures to lower the risk exposure associated with both contracts.

In the case of the Y2K contract, protection against lock-in was sought through sequential contracting. By splitting the work to be done in many sequential steps, the client ties the duration of the contract to verifiable performance on the one hand, and leaves open the possibility of walking out of the relationship if things were to take a bad turn on the other hand. In the case of service debasement, the main mechanism used by the client to reduce the probability of occurrence was the inclusion of an important penalty for underperformance. This penalty was equal to five times the total value of the contract. Doing so elicits greater effort from the supplier and serves as a type of insurance, thus reducing the monetary value of the consequences. Finally, in view of cost escalation, the client secured guaranteed rates and the parties agreed ex ante on the evaluation method and relied on a detailed inventory of the various components, languages, platforms, size, complexity, testing environments, interactions with other systems, etc. As shown by the arrows drawn in Figure 3, the potential losses associated with lock-in, service debasement and cost escalation were accordingly substantially reduced.

In the second contract, the risk exposure stemming from lock-in was reduced in two ways. The first one implied multiple sourcing: three suppliers were selected to work concurrently, which seriously curtails the probability of the client being locked in. Renegotiation problems and costly contractual amendments were handled through the separation of assignments in addenda. This enabled the partners to actually modify their contract without costly renegotiations. It is an ongoing modification process that is included in the contract (sequential contracting). Residual risk was thus greatly reduced.

Among the interesting facts found in this case are managers' perceptions. As shown in Figure 3, it is quite clear that project 2 was riskier than project 1, whether one considers risk exposure before or after risk management mechanisms are introduced. This result greatly surprised the firm's managers. Their initial impression was that risk exposure was much greater with their Y2K project than with the partnership one. They agreed with the results presented in Figure
3 and realized that their evaluation was inaccurate. Their mistaken evaluation is coherent with remarks made by March and Shapira (1987). Managers often perceive risk because some potential losses are perceived to be high, failing to recognize that the probabilities of such losses (in project 1) are dim.

Another explanation resides in the time frame of the two projects. Consequences from problems with project one were almost immediate (January 2000). On the other hand, project 2 was a long term venture and many potential undesirable outcomes would only unfold in a 2- to 5-year horizon. This might explain why project 2 was perceived as less risky. The risks involved were not recognized because they were too distant.

Insights. This case teaches us three main lessons. First, conducting a formal assessment of risk exposure, and explicitly mapping the risk exposure associated with a contract enables efficient risk management. Managers can immediately target the elements presenting high risk exposure and implement risk management mechanisms. Second, explicitly charting risk exposure offers a remedy to possible biases in managers' perceptions. In this case, managers failed to recognize potential threats that would not materialize in the immediate future. Their evaluation of events with very low probabilities was also biased. Third, by comparing projects and ordering them more accurately in terms of risk exposure, the organization can manage its outsourcing portfolio more effectively, thus ensuring that efforts in risk management are allocated where they are the most profitable.

Case 3: Risk Management and Contract Negotiation

The third case takes place in a large organization that employs over 15,000 people. This company had an unprofitable division it wanted to sell. To entice buyers to take this division, the company offered an outsourcing contract for another service. The company evaluated that the outsourcing contract could be very profitable and would lure suppliers to make a joint offer to buy the unprofitable division along with taking the contract. The company negotiated with a first supplier (see Figure 4). The price negotiated for both the sale of the division and the outsourcing contract seemed adequate.

The contract was ready for signature. The only step left was the evaluation by the internal risk management group. The evaluation came as a shock to top management. The contract contained little protection for the client, and the selected supplier was far from ideal (even if it was a very large firm). The activities supported by the system were core and any disruption could lower the quality of service drastically. Loss of competency was also feared because of the closeness to the core business. The reluctance of the risk analysis group was so strong that the evaluation went all the way to the CEO. The deal was finally cancelled.

Figure 4. Large Corporation

A few months later, negotiations were in progress with a different supplier. The new supplier was more experienced with outsourcing and with the activities under consideration than the first one, which lowered the value of some of the risk factors. The contract to be signed was also significantly different. Since little could be done to reduce losses (the activities themselves were too close to the company's core business for the losses to be significantly altered should a problem occur), all efforts were made to reduce the likelihood of occurrence of the negative outcomes. Risk factors were assessed and mechanisms were introduced: sequential contracting to reduce lock-in and likelihood of contractual amendments; the setting up of an arbitration structure to settle differences in points of view; and a one-year benchmark period to develop a complete set of measures, which would reduce the likelihood of hidden costs.

Insights. This case illustrates two elements. First, formal risk analysis helps companies sign better contracts with more appropriate service providers. In this case, both contracts were comparable in terms of prices and outcomes (selling the unprofitable division). However, the contract that was finally signed had a much better chance of delivering the anticipated benefits. The client used the information from the first risk analysis exercise to better select the second service provider. The other interesting element is the illustration of the endogenous component of IT outsourcing risk exposure. In this case, after receiving the analysis from the risk management group, the division manager used several tactics to prevent the first deal. From a distance, his behavior could be interpreted in two ways: by warning everyone and pointing out the high risk involved, he was serving the organization well and protecting himself at the same time. He knew that top management really wanted the deal to be signed (mostly because it wanted to get rid of the unprofitable division). However, if the outsourcing deal went sour, the division responsible for the activities supported by the outsourced system would suffer and he would be blamed later on. By preventing the first contract and, later, going ahead with a better one, with a different supplier, the division manager reduced the risks for the overall organization, for his division, as well as for himself.

Case 4: Risk Management as a Series of Compromises

The fourth case study was conducted in a large firm in the energy sector. This case illustrates how risk management and learning can eventually transform risk into a “choice” rather than a “fate”. This firm has extensive outsourcing experience, and a history of risk evaluation and management. The company employs more than 60,000 people worldwide. Two contracts are compared. The first one, labeled A, involved the outsourcing of IT activities of a large business unit of the company, while the second contract - B - concerned the whole IT organization (head office and divisions).

First Contract. The first contract was the firm's first major outsourcing venture, covering data centre management, telecommunications, maintenance, and systems development. Risk exposure was high (see Figure 5). Because of the extent of the contract, hidden service costs were found to be the major threat. The main feature of the contract, in terms of risk management, was to rely on a consortium of three vendors to supply the services. Also, the contractual framework enabled the company to renegotiate several clauses annually, further reducing this risk. Disputes and litigation, costly contractual amendments, and loss of organizational competency were next in order of importance in terms of risk exposure. The firm recognized that disagreements would probably arise both between the suppliers themselves, and between the client and its suppliers. The client consequently tried to reduce the impacts of disputes and litigation but found out that European antitrust laws prevented the three suppliers from joining in a formal alliance as originally planned. Given the type of contract selected (multiple sourcing), contractual amendments and contract renegotiation would presumably be limited by the portfolio of activities of each supplier, thus limiting the extent of changes. Loss of innovative capacity (competency) was considered the biggest potential loss resulting from moving so many staff out of the organization, especially since the company had decided to become a knowledge organization. Again, the consortium was the means whereby the firm could reduce this risk because several suppliers would give access to a broader array of innovative services (and knowledge) than a single supplier. However, no supplier would have the big picture of the industry and the technology portfolio.

Second Contract. In 1998, the company changed its outsourcing strategy radically and decided that a single supplier should replace the multiple sourcing strategy it had previously adopted, thus replacing a fragmented assortment of suppliers by a single strategic partner. It evaluated that only two suppliers in the world were capable of providing services on such a scale. Therefore, a costly lock-in situation could easily develop. To alleviate the potential problems due to the lock-in situation, the client included a one-year notice of termination, to help reduce the impact of a potential lock-in.

Figure 5. Energy

Other potential undesirable outcomes were cost escalation and transition costs. The factors linked to cost escalation suggested a low probability of occurrence. The company had extensive expertise and experience with outsourcing, the supplier was very experienced with the activities included in the contract, and was very competent in managing contractual relationships. The most threatening factor was the presence of measurement problems. One of the tools to reduce them would be systematic benchmarking. Transition costs could also bring severe penalties. They would come with service deterioration and business disruption. Transferring activities to the supplier presented different risks in different regulatory situations (different countries). To reduce the transition-related problems, the client increased the planning efforts in a wide variety of aspects. Interestingly, the overall cost of transition was not necessarily reduced, but the unexpected part of it was.

Insights. This case provides two lessons. First, it is clear that learning occurred through the management of the first contract, which translated into both lower probabilities for the undesirable consequences and better risk management strategies in the second contract. Many of the contractual choices were made with less naivety. Managers were more realistic about potential loopholes in the arrangements and were more aware of the limits of contracts. A key decision in the second contract was to remove software development from the agreement. Development activities are more uncertain, more specific, and more complex than operations. By keeping them outside the portfolio of outsourced activities, the managers reduced the probabilities of occurrence of several undesirable consequences.

The second lesson is the notion that risk is a choice. The case showed that risk profiles can be seen as compromises. A given risk management mechanism could lower one type of risk while increasing another one. For example, when the client decided in the second contract to deal with a single supplier, risks related to measurement problems were less probable. However, this was done at the expense of an increase in the risk of lock-in. As managers become more aware of the control they have on the risk profile of their IT outsourcing strategy, they should bear more responsibility over the outcomes.

Case 5: Risk Management and Contract Design

The last lessons come from the outsourcing contract of Niagara (name changed) (Aubert et al., 1999b). Niagara is a large Canadian Crown corporation, employing more than 50,000 people, with an annual income of over $5 billion. It concluded a complex outsourcing arrangement with three suppliers. When it decided to outsource its IT services, the organization was extensively developing new software (using over 1,000 full-time employees) and having a hard time doing so.

Figure 6. Niagara

Although Niagara recognized that IT could radically change the way it did business, the corporation felt that IT and software development were not within its core competency. The organization had problems hiring and retaining IT people. It was dealing with a vast number of consultants, without taking advantage of the consultants' distinctive skills. Finally, Niagara felt that some of the software solutions developed were innovative and could be sold to other similar organizations in the world. However, it had neither the skills nor the infrastructure to do so. Selling software was not its business.

The level of risk exposure associated with outsourcing all IT services, as intended, was high (see Figure 6). Lock-in was the most important threat. Because of the sheer size of the contract, lock-in could be very costly. The probability of a lock-in was also high, mostly because of the highly specific nature of the software developed, and the limited number of suppliers that could handle such a large contract. Hidden costs were also to be feared. The complexity of the activities, the number of different systems to integrate, and the scope of the contract made hidden costs a likely menace. Similarly, cost escalation and costly contractual amendments would lead to severe losses. In the case of such a large contract, it would be tempting for a supplier to argue higher than expected costs and renege on the promised fees. Changes to any contract would also be probable because of the wide variety of services and the level of innovation in the field.

Niagara had some precious resources when considering outsourcing. Most notably, the organization had a long tradition of measurement. Every activity in the organization was measured, and the organization had impressive charts and data about the resources required for developing or operating software. There existed measurement guidelines for all types of applications, based on the vast number of projects done by the organization or sub-contracted. The organization also had enough internal data to benchmark potential suppliers.

Niagara finally signed an original contract, integrating several risk management mechanisms. It decided to rely on a multiple sourcing strategy and retained three suppliers. Each one was responsible for a given portfolio of activities and had an area of responsibility.

Hence, the portfolios partially overlapped. As a result, this outsourcing strategy placed the three outsourcers in the unusual situation of having to cooperate and/or compete on almost every project. Each service provider had a group of activities allotted to them. For any new project, Niagara asked one of its three suppliers for a cost estimate. This estimate was compared to internally prepared estimates and, if acceptable, the contract could be given without further delay. If unsatisfactory, Niagara could ask the other suppliers to bid. External bids could also be sought. The three suppliers were chosen because they had a much better knowledge of the organization. Yet, they still had to remain honest to retain their share of the overall IT activities.

Another element of interest was the outside deals. The suppliers had the infrastructure to sell, outside of Canada, the software developed. In fact, the outsourcing deal created a partnership between Niagara and its suppliers which allowed them to do so. At the time the case was written, the technology was being transferred to eight countries. Neither the client nor the outsourcers would have had the capacity to market the technology alone. The reputation of Niagara and the skills of the suppliers were essential elements in the success of the joint sales abroad. These external deals were extremely attractive for the outsourcers. While they were a source of revenue, they also served as great goal alignment mechanisms between Niagara and its suppliers, reducing potential “cultural” differences at the same time. They acted as a bond, guaranteeing satisfactory service to the client.

The competition between the suppliers reduced the expected losses associated with hidden costs and cost escalation, as did benchmarking. Before undertaking a new software project, key indicators such as cost per milestone, total development cost, elapsed time, and total cost minus fixed assets were used to assess it. These parameters were clearly specified ex ante so the suppliers knew how they were being evaluated. Activities were measured on a regular basis, graphing the number and types of problems, their category according to security level, and their overall impact. Also, by separating the portfolio into three parts, any cost escalation due to opportunistic behavior of a supplier would be limited to a third of the overall portfolio.

Cost escalation was also limited by the use of countervailing incentives. In their dealings with Niagara, the outsourcers were responsible for the maintenance of the systems they had developed. Consequently, they had a strong incentive to develop efficient systems, so as to minimize their maintenance efforts. Linking two stages of production can provide an incentive for an agent to perform in the principal's interest. When two stages of production are not independent, an agent may be motivated to perform better if it is responsible for both stages. By putting extra effort into the first stage, it will reduce the effort required at the subsequent stage. Inversely, by shirking during the first stage, it will increase the effort required later. As a result, the agent cannot claim to have made an excessive effort at both stages.

Monitoring was used extensively to reduce the risk of service debasement. Each deliverable done by a supplier had to be approved by one of the two other suppliers. Once a piece of work was approved, the supplier approving the work became responsible for its judgment (and for handling the costs related to problems). An interesting result of this type of arrangement was that Niagara automatically obtained a third party view of each supplier's work.

Insights. This case teaches two lessons. First, risk can be managed and efficient contract design can drastically reduce residual risk. In many ways, Niagara was able to implement several of the features that the energy producer (see Case 4 above) wanted to include in its first contract. Because the regulatory regime in Canada is different from the European one, there was no obstacle to such contract design. One key element of this contract is that risk is not eliminated; it is mostly transferred to the suppliers. They become responsible for many of the potential undesirable outcomes that can occur. They are positioned in a way that makes them guardians of the other suppliers on behalf of Niagara. Such risk taking is unusual for the suppliers. When Niagara proposed this agreement to several potential suppliers, many declined to bid. This further reduced the number of potential suppliers, which explains why the probability of lock-in increased (while the potential loss decreased because of the dividing of the portfolio into three parts).

The other lesson is that size does matter. This sophisticated contract would not have been possible if the portfolio of activities had been smaller. The suppliers agreed to enter into this relationship because they expected to make money. They accepted to shoulder more risk than they usually do because they anticipated greater benefits. Each one dedicated approximately 350 employees to the contract with Niagara. Moreover, the outside deals were a powerful incentive. These other contracts made the relationship with Niagara especially precious and guaranteed the client that suppliers would not threaten this relationship. All that machinery is economically justifiable only if the size of the contract is significant.

Conclusion

The research showed that the combination of insights from Transaction Cost and Agency Theories into the structure provided by the risk framework enabled interesting predictions about likely contract outcomes, and suggested possible improvements or alternatives to contract structures. The case analyses also provided support for the usefulness of the reference theories.

The cases reported here illustrate how the proposed risk framework helps to understand the components of risk exposure of an outsourcing project and the mechanisms of risk management. First, risk analysis helps anticipate problems and select appropriate contract types that take into account the characteristics of the activities considered for outsourcing. Moreover, this framework allows for the correction of some of the managers' biases. It is difficult to compare alternatives that are associated with both different probability distributions and different loss functions. The human mind can only deal with a limited number of scenarios and a formal analysis ensures that all key elements are taken into account. Organizing the information in a structured framework facilitates the managers' evaluation.

The cases generated several results. First, the use of the list of risk factors was found to be a useful means of providing information about the probability of occurrence of the undesirable outcomes. This supports the idea that risk is measurable and that contracting strategies can be adjusted accordingly. Several of these contracting strategies were illustrated with the five cases.

This is probably the most interesting result from the case studies. Looking at the cases, it is possible to suggest appropriate strategies for each undesirable outcome. In order to deal with unexpected transition and management costs, companies adopted extensive planning (case 2) and transferred some of these costs, through contract structure, to their suppliers (case 5). It is interesting to note, with respect to this outcome, that the management mechanism did little to reduce the absolute cost but focused more on correctly anticipating such costs, or transferring them to another party. To avoid lock-in, multiple suppliers were used (cases 2 and 5), which prevented the client from becoming the prisoner of any single one. These two cases also showed that costly contractual amendments could be managed by using sequential contracts. These contracts, redefined over time within a general agreement, enabled the parties to deal with the inherent uncertainty of long term arrangements. Disputes and litigation were managed through different mechanisms. Arbitration was adopted in one case (3), formal culture evaluation (to reduce the likelihood of a dispute) was done in another (case 4), while efforts on measurement were made in a third instance (case 5). These efforts, clarifying the contract assessment, also reduced the probability of disputes. Service debasement was managed through the use of detailed measurement and benchmarking (case 4), paired with penalties (case 2). These enabled the client to adequately ensure that the supplier delivered the promised quality. To prevent increased costs of services, measurement was used (case 2), along with internal competition (cases 2 and 5). This competition enticed the suppliers to lower their prices in order to secure a significant share of the business. From the cases, it seems that little can be done to prevent the loss of competency. No mechanisms proved effective to reduce this risk. It appears that the only way to prevent such loss is to carefully select the activities to outsource. Not outsourcing core activities might be the only manner in which a party can secure its core competency. Finally, hidden costs were handled efficiently in case 3, where the client imposed a one-year transition period over which client and supplier defined the adequate levels of performance and their measure. The client kept a retracting clause over the transition period, allowing it to return to internal governance if it was not satisfied with the supplier's performance.

This suggests that outsourcing risk is largely a matter of choice: it is very much an endogenous risk. Managers clearly have a choice between different sourcing strategies, between outsourcing and doing the work internally, and between numerous contracts for any given activity. When selecting any one of them, they should be aware of what they are selecting, and what they are discarding. Risk exposure, once made explicit, transforms the unexpected into an option selected consciously. These selections are always compromises. Most risk management mechanisms involve reducing some types of risk while increasing others, or accepting to pay a fee to reduce a given risk. The comparison of the risk reduction introduced by the management mechanisms with the cost of such mechanisms (for example when a company decides to deal with two suppliers instead of one) makes the assessment of the real value of the management mechanism easier.

One limit of this study is that the cost of these mechanisms is not made explicit. Because the scale on which the consequences are evaluated is not in absolute dollars, and because the cost of each mechanism is not in dollar terms either, it is difficult to evaluate if each of the management mechanisms implemented was worth its cost. The reduction in the risk exposure has to be compared with the incurred fixed costs.
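
The comparison called for above becomes mechanical once both sides are expressed in dollars. The fragment below is a hedged illustration with invented figures rather than data from the cases: the value of a mechanism is the reduction in expected loss it buys, net of its fixed cost.

def expected_loss(probability: float, loss: float) -> float:
    # Risk exposure in absolute terms: probability of the outcome times its dollar loss.
    return probability * loss

# Hypothetical figures for a lock-in outcome and a second-supplier mechanism.
before = expected_loss(probability=0.30, loss=4_000_000)   # 1,200,000
after = expected_loss(probability=0.10, loss=4_000_000)    # 400,000
mechanism_cost = 500_000                                    # fixed cost of dual sourcing

net_value = (before - after) - mechanism_cost
print(net_value)   # 300,000: with these invented numbers the mechanism is worth its cost
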

Another limit is the lack of consideration for the risk aversion profile of the managers. If managers are risk averse, they will be inclined to adopt more risk management mechanisms than might be required. By omitting this consideration, we presumed, when evaluating the cases, that expected value of risk, as risk exposure is defined, could be applied directly, without considering the managers' utility functions.

These limitations suggest an interesting path for research. First, it would be interesting to refine the evaluation in order to express, in dollar terms, both the potential losses and the cost of the management mechanisms. This would enable a very formal assessment of the scenarios and a risk/benefit evaluation of each management mechanism. Once such a measure is available, it would be possible to use simulation to measure the risk profile of the managers and to formally evaluate their preferences between different scenarios.
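
One way to make this research path concrete is a small simulation: draw losses for two competing scenarios, pass them through a utility function whose curvature encodes the manager's risk aversion, and compare expected utilities. The sketch below only illustrates the idea, with invented loss distributions and an assumed exponential utility; it is not part of the study.

import math
import random

def utility(wealth: float, risk_aversion: float = 1e-6) -> float:
    # Exponential (CARA) utility; a larger coefficient means a more risk-averse manager.
    return -math.exp(-risk_aversion * wealth)

def expected_utility(loss_draws, wealth: float = 10_000_000) -> float:
    # Average utility of final wealth over the simulated loss draws.
    return sum(utility(wealth - loss) for loss in loss_draws) / len(loss_draws)

random.seed(1)
# Scenario A: frequent, moderate cost overruns. Scenario B: a rare but severe lock-in loss.
scenario_a = [random.gauss(800_000, 100_000) for _ in range(10_000)]
scenario_b = [5_000_000 if random.random() < 0.15 else 0.0 for _ in range(10_000)]

print(sum(scenario_a) / len(scenario_a))   # expected loss of A, close to 800,000
print(sum(scenario_b) / len(scenario_b))   # expected loss of B, close to 750,000
print(expected_utility(scenario_a) > expected_utility(scenario_b))   # True under this utility

With the numbers used here the second scenario has the lower expected loss, yet the risk-averse utility prefers the first; eliciting exactly this kind of preference is what such a simulation would be designed to do.
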
In conclusion, once risk exposure is made explicit, and the possible compromises rendered clear to the managers, risk becomes a lot more manageable. In fact, the cases presented and analyzed here also suggest that assessing and managing outsourcing risk can pay off: it leads to lower residual risks or to greater performance through better contract design. One caveat to keep in mind is that the costs associated with these risk reduction measures were not assessed in the cases. Risk management is generally a complex exercise and its conclusions are often far from precise. Yet, it provides valuable information and increases the quality of the decision-making process. Not surprisingly, organizations with a lot of resources, awarding larger contracts, will have more flexibility when managing their risk portfolio and greater possibilities to reduce their risk exposure.

References

Alchian, A. A. and Demsetz, H. (1972). “Production, Information Cost, and Economic Organization,” American Economic Review, Vol.62, No.5, pp. 777-795.
Anderson, J. and Narasimhan, R. (1979). “Assessing Implementation Risk: A Methodological Approach,” Management Science, Vol.25, No.6, pp. 512-521.
Arrow, K. (1983). “Behaviour Under Uncertainty and Its Implications for Policy,” in Stigurn, B., and Wenslop, F. (Eds.), Foundations of Utility and Risk Theory with Applications, Dordrecht, Holland: Reidel Publishing Company, pp. 19-34.
Aubert, B. A., Dussault, S., Patry, M., and Rivard, S. (1999a). “Managing the Risk of IT Outsourcing,” Proceedings of the Thirty-Second Hawaii International Conference on System Sciences, Hawaii, 10 pages (CDROM).
Aubert, B. A., Patry, M., and Rivard, S. (1997). “The Outsourcing of IT: Autonomous Versus Systemic Activities,” 28th Annual Meeting of the Decision Sciences Institute, San Diego, CA, pp. 809-812.
Aubert, B. A., Patry, M., and Rivard, S. (1999b). “Impartition des Services Informatiques au Canada : Une Comparaison 1993-1997,” in Poitevin, M. (Ed.), Impartition Fondements et Analyse, Québec, Presses de l'Université Laval, pp. 203-220.
Aubert, B. A., Patry, M., Rivard, S., and Smith, H. (2001). “IT Outsourcing Risk Management at British Petroleum,” Proceedings of the Thirty-fourth Hawaii International Conference on Systems Sciences, Hawaii, 10 pages (CDROM).
Aubert, B. A., Patry, M., and Rivard, S. (2003). “A Tale of Two Outsourcing Contracts - An Agency Theoretical Perspective,” Wirtschaftsinformatik, Vol.45, No.2, pp. 181-190.
Aubert, B. A., Rivard, S., and Patry, M. (1996). “A Transaction Cost Approach to Outsourcing Behavior: Some Empirical Evidence,” Information and Management, Vol.30, pp. 51-64.
Aubert, B. A., Rivard, S., and Patry, M. (1996b). “Development of Measures to Assess Dimensions of IS Operation Transactions,” Omega, International Journal of Management Science, Vol.24, No.6, pp. 661-680.
Barki, H., Rivard, S., and Talbot, J. (1993). “Toward an Assessment of Software Development Risk,” Journal of Management Information Systems, Vol.10, No.2, pp. 203-225.
Barzel, Y. (1982). “Measurement Cost and the Organization of Markets,” Journal of Law and Economics, Vol.25, No.1, pp. 27-48.
Boehm, B.W. (1991). “Software Risk Management: Principles and Practices,” IEEE Software, Vol.8, No.1, pp. 32-41.
Bowers, L. N., Gerber, U. H., Hickman, C. J., Jones, A. D., and Nesbit, J. C. (1986). Actuarial Mathematics, Itasca: The Society of Actuaries.
Cheung, S. (1983). “The Contractual Nature of the Firm,” Journal of Law and Economics, Vol.26, No.1, pp. 1-21.
Coase, R. (1937). “The Nature of the Firm,” Economica, Vol.4, No.16, pp. 396-405.
Crocker, K. and Masten, S. (1991). “Pretia ex Machina? Prices and Process in Long-Term Contracts,” Journal of Law and Economics, Vol.34, No.1, pp. 69-99.
Cross, J. (1995). “IT Outsourcing: British Petroleum's Competitive Approach,” Harvard Business Review, No.3, pp. 95-102.
Dorn, P. (1989). “Selling One's Birthright,” Information Week, Vol.241, pp. 52.

Earl, M. J. (1996). “The Risks of Outsourcing IT,” Sloan Management Review, Vol.37, No.3, pp. 26-32.
Eisenhardt, K. (1989). “Agency Theory: An Assessment and Review,” Academy of Management Review, Vol.14, No.1, pp. 57-74.
Grossman, S. and Hart, O. (1986). “The Costs and Benefits of Ownership: A Theory of Vertical and Lateral Integration,” Journal of Political Economy, Vol.94, pp. 691-719.
Holmstrom, B. (1979). “Moral Hazard and Observability,” Bell Journal of Economics, Vol.10, No.1, pp. 74-91.
Joskow, P. L. (1987). “Contract Duration and Relationship-Specific Investments: Empirical Evidence from Coal Markets,” American Economic Review, Vol.77, No.1, pp. 168-185.
Kobs, A. (1998). “Sentinel Events – A Moment in Time, A Lifetime to Forget,” Nursing Management, Vol.29, No.2, pp. 10-13.
Lacity, M. C. and Hirschheim, R. (1993). Information Systems Outsourcing, New York: John Wiley & Sons.
Lacity, M. and Hirschheim, R. (1995). Beyond the Information Systems Bandwagon: The Insourcing Response, Chichester: Wiley.
Lacity, M. C., Willcocks, L. P., and Feeny, D. F. (1995). “IT Outsourcing: Maximize Flexibility and Control,” Harvard Business Review, Vol.73, No.3, pp. 84-93.
Langlois, R. N. and Robertson, P. L. (1992). “Networks and Innovation in a Modular System: Lessons from the Microcomputer and Stereo Component Industries,” Research Policy, Vol.21, pp. 297-313.
Lee, J. N. and Kim, Y. G. (1999). “Effect of Partnership Quality on IS Outsourcing: Conceptual Framework and Empirical Validation,” Journal of Management Information Systems, Vol.15, No.4, pp. 29-61.
Levin, M. and Schneider, M. (1997). “Making the Distinction: Risk Management, Risk Exposure,” Risk Management, Vol.44, No.8, pp. 36-42.
Levine, E. (2000). “Defining Risks,” CA Magazine, Vol.133, No.3, pp. 45-46.
Linerooth-Bayer, J. and Wahlstrom, B. (1991). “Applications of Probabilistic Risk Assessments: the Selection of Appropriate Tools,” Risk Analysis, Vol.11, No.2, pp. 239-248.
Mahoney, D. (1988). Confessions of a Street-Smart Manager, New York: Simon & Shuster.
March, J. and Shapira, Z. (1987). “Managerial Perspectives on Risk and Risk-Taking,” Management Science, Vol.33, No.11, pp. 1404-1418.
Nam, K., Rajagopalan, S., Rao, H. R., and Chaudhury, A. (1996). “A Two-Level Investigation of Information Systems Outsourcing,” Communications of the ACM, Vol.39, No.7, pp. 37-44.
Nelson, P., Richmond, W., and Seidman, A. (1996). “Two Dimensions of Software Acquisition,” Communications of the ACM, Vol.39, No.7, pp. 29-35.
Nilakant, V. and Rao, H. (1994). “Agency Theory and Uncertainty in Organizations: An Evaluation,” Organization Studies, Vol.15, No.5, pp. 649-672.
O'Leary, M. (1990). “The Mainframe Doesn't Work Here Anymore,” CIO, Vol.6, No.6, pp. 77-79.
Perry, M. K. (1989). “Vertical Integration: Determinants and Effects,” in Shmalensee, R., and Willig, R. (Eds.), Handbook of Industrial Organization, Amsterdam: North-Holland, pp. 183-255.
Prahalad, C. V. and Hamel, G. (1990). “The Core Competence of the Corporation,” Harvard Business Review, Vol.68, No.3, pp. 79-91.
Roy, V. and Aubert, B. A. (2000). “A Resource Based View of the Information Systems Sourcing Mode,” Proceedings of the 33rd Hawaii International Conference on Systems Sciences, Maui, Hawaii, 10 pages (CDROM).
Sappington, D. (1991). “Incentives in Principal-agent Relationships,” Journal of Economic Perspectives, Vol.5, No.2, pp. 45-68.
Schirripa, F. and Tecotzky, N. (2000). “An Optimal Frontier,” The Journal of Portfolio Management, Vol.26, No.4, pp. 29-40.
Simon, H. A. (1991). “Organizations and Markets,” Journal of Economic Perspectives, Vol.5, No.2, pp. 25-44.
Teece, D. J. (1986). “Firm Boundaries, Technological Innovation and Strategic Management,” in Thomas, L. (Ed.), The Economics of Strategic Planning, Lexington, MA: Lexington Books, pp. 187-199.
Teece, D. J., Rumelt, R., Dosi, G., and Winter, S. (1994). “Understanding Corporate Coherence, Theory and Evidence,” Journal of Economic Behavior and Organization, Vol.23, pp. 1-30.
Williamson, O. E. (1985). The Economic Institutions of Capitalism, New York: The Free Press.
Williamson, O. E. (1989). “Transaction Costs Economics,” in Shmalensee, R., and Willig, R. (Eds.), Handbook of Industrial Organization, Amsterdam: North-Holland, pp. 136-178.

About the Authors

Benoit A. Aubert is Professor and Director of Research at HEC Montreal and Fellow at the CIRANO (Center for Interuniversity Research and Analysis on Organizations). His main research areas are outsourcing, ERP implementation, and risk management. He also published papers on trust, ontology, and health care information systems. He is currently investigating the links between corporate strategy and outsourcing.

Michel Patry is Professor at the Institut d'économie appliquée of HEC Montréal and CEO of CIRANO (Center for Interuniversity Research and Analysis on Organizations). A specialist in industrial organization, Dr Patry's recent work covers the areas of outsourcing and delegated management, the economics of IT, the analysis of regulation and contracts, and the impact of regulation on productivity.

Suzanne Rivard is Professor and holder of the Chair in Strategic Management of Information Technology at HEC Montréal. Her research interests encompass ERP implementation, outsourcing, software project risk management, and strategic alignment. She published in Communications of the ACM, Data Base, Information and Management, Journal of Information Technology, Journal of Management Information Systems, MIS Quarterly, Omega, and others.

