Information systems security and human behaviour

Denis Trček, Roman Trobec, Nikola Pavešić & J. F. Tasič (2007) Information systems security and human behaviour, Behaviour & Information Technology, 26:2, 113-118, DOI: 10.1080/01449290500330299
Behaviour & Information Technology, Vol. 26, No. 2, March – April 2007, 113 – 118
†Department of Digital Communications and Networks, Jožef Stefan Institute, Ljubljana, Slovenia
‡Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia
Until recently, most of the effort for providing security in information systems has been focused on technology. However, it has turned out in recent years that human factors play a central role. Therefore, to ensure appropriate security in contemporary information systems, it is necessary to address not only technology-related issues, but also human behaviour and organisation-related issues, which are usually embodied in security policies. This paper presents a template model, which is intended to support risk management for information systems and which concentrates on human factors. The model is based on business dynamics, which provides the means for qualitative and quantitative treatment of the above-mentioned issues.
And this is where business dynamics comes in (Sterman 2000). It has its origins in the field of engineering, from where the basic ideas have been tailored to the needs of management sciences. Business dynamics addresses people, processes, material and information flows by emphasising the importance of feedback loops. According to business dynamics, these feedback loops are the major cause of the behaviour of systems. Further, this methodology also provides the means to address non-linear system dynamics, which govern many real-life phenomena.

The system modelling starts with a qualitative approach. Through iteration stages, the model is formed with the use of causal loops. Therefore, these models are also called causal loop diagrams; they consist of variables connected by appropriate links. A link has positive polarity if an increase of the causal variable results in an increase of the consecutive variable. If the output variable is decreased by an increase of the input variable, the polarity is negative.

Among the variables there is a distinct set called levels or accumulators. Levels play a special role in the system. First, they are the source of inertia. Second, they constitute a kind of primitive memory within the system – an aggregate of past events. Third, they serve as absorbers and decouple inflows from outflows (flows, or rates, are another important kind of variable; they are coupled with levels). Thus, when a certain variable has the above-mentioned properties, it has to be addressed explicitly as a level.

Having a causal loop diagram, one gets a holistic view of the system, which enables a better understanding of the basic principles of its functioning. Such a model can be further developed, or better, enhanced into a quantitative one. This means that concrete relationships between variables are established by the introduction of appropriate equations. The equations often have to contain translation parameters or scaling factors to tune the system so that it behaves in a way that real input data closely match values of the real output data.

3. Derivation of the model

As described above, business dynamics methodology is of general applicability for socio-technical systems. IS certainly fall into this domain, and it makes sense to think about them in a systemic way when designing and implementing such systems (Dutta 2003). Additionally, IS security with an emphasis on human factors is an even more appropriate area for the use of business dynamics, although its use in this particular field is rather new and rare – an example of a pioneering attempt is given in Gonzalez (2003).

In the next section we will present a model for security policy management with an emphasis on human behaviour. The model includes essential variables, such as risks, perception of risks (i.e. adaptation processes), intensity of security-related tasks and security policy levels. Further, the model is simulated in a deterministic way, i.e. no variables of a stochastic nature are used. This is done intentionally to better present the main patterns and behaviour of the system. Of course, stochastic variables can be used and sensitivity analysis performed, which is especially necessary for risk analysis.

3.1 Modelling human perception

The central point is the modelling of human behaviour and real risks, where human perceptions are the most challenging part. They are often modelled as adaptive learning processes with exponential smoothing, which is also common in economic models.

In adaptive expectations, the rate of change of a perceived value X* is given by a first-order, linear differential equation dX*/dt = (X - X*)/D, where X is the reported input variable (current value), while D is a time constant, also called the adjustment time. It has been proven that adaptive expectations frequently outperform other methods of forecasting (Makridakis et al. 1986), and they will therefore be included in our modelling process.

A reader should note that belief is modelled as a stock, because perception is a state of mind that remains unchanged unless there is a reason for a change – in the case of adaptive expectations this change is the discrepancy between the real and the perceived value.

3.2 Threats modelling

3.2.1 Ordinary users and threats. Regarding threats, the generic model (threats generator) is shown in figure 1. The explanation is as follows. Threats actually appear at a certain rate, and once they have appeared, they remain on the scene for a certain period of time. Changes in technology cause threats to cease sooner or later, but more frequently, improvements of technology (e.g. software patches) or enhanced operating procedures eliminate them. In every case, new threats are added to existing ones and the rate of newly generated threats is decoupled from the

Figure 1. Generic model of threats.
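The adaptive expectations equation of section 3.1, dX*/dt = (X - X*)/D, is easy to illustrate numerically. The following is a minimal Euler-integration sketch; the reported value X, the adjustment time D and the step size are illustrative choices, not parameters taken from the paper's model:

```python
# Adaptive expectations: dX*/dt = (X - X*) / D
# Minimal Euler-integration sketch; x_reported (X), d_adjust (D) and the
# step size dt are illustrative values, not parameters from the paper.

def simulate_perception(x_reported, d_adjust, x_initial=0.0, dt=0.125, steps=400):
    """Return the trajectory of the perceived value X* over time."""
    x_star = x_initial
    trajectory = [x_star]
    for _ in range(steps):
        # First-order exponential smoothing towards the reported value.
        x_star += dt * (x_reported - x_star) / d_adjust
        trajectory.append(x_star)
    return trajectory

traj = simulate_perception(x_reported=10.0, d_adjust=2.0)
# The perceived value approaches the reported value monotonically;
# a larger D (slower adjustment) makes the approach slower.
```

This also makes the "belief as a stock" remark concrete: x_star persists between steps and changes only in proportion to the discrepancy between the reported and the perceived value.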
the appendix.

3.2.2 IT professionals and threats. Although the above model of threats will be used as a direct driving factor for human behaviour in our central model, it should be emphasised that IT professionals have a different view on threats than ordinary users.

This view only starts with threats (see figure 2). Threats are further elaborated from the perspective of their probability of realisation with respect to a particular asset. Further, each asset has to be judged in terms of its value and its vulnerability to a particular threat. This altogether leads to risks that serve as a basis for countermeasures, where some residual risk has to be taken into account. Thus risks actually present the driving force of the central model.

The model can also be seen from the loops perspective, where three balancing loops can be noted:

1. The upper left loop is the loop of perceived risk (PT, discrepancy, CPT).
2. There is a trust loop in the upper right corner (PT, IAF, LNO, RAT, CPT). It represents the trust of humans (employees) in the system according to experienced operational patterns, which influence RAT.
3. The last one is the adjustment loop (PT, IAF, SPL, CPR). It is positioned at the bottom, and it models the adjustment of perceived risk caused by the management through SPL.

time (IAT) and length of normal operation (LNO). The reason is that RAT depends on particular circumstances. Assume there is a longer period without any accidents. Now, when a new accident takes place, expectations are based on past experiences, which means that people perceive this event more or less as an isolated one. It is quite a different situation if one experiences attack after attack for a longer period of time. In this case, one will assume a similar frequency of accidents when the next strike takes place, even if this one belongs to the beginning of a period with a smaller frequency of attacks.

Last but not least, there are two delays to denote the fact that the security policy level, as well as the discrepancy, is always subject to delayed information. Due to these two variables, the effects of delays can be analysed. The whole model is presented in figure 3.

Now assume the situation where humans adapt slowly to changes, which is reflected in high values for RAT (this value is 10 in figure 6). One would expect similar effects to those in figure 5. Indeed, SPL oscillates even more intensively, due to slow adaptation of PT, which, on the other hand, also results in a higher level of successful attacks. The security manager can decide to influence the whole situation by increasing corrective intensity and, consequently, SPL. The simulation confirms these expectations.

The security manager can further investigate the system by posing various questions, e.g. 'Would a change of initially perceived threats result in better performance under the current worst conditions, given in figure 6?' The answer is yes. It pays off to report to security administrators initial values for threats that are slightly higher than the real ones (see figure 7). An interesting side-effect of this measure is that oscillations in SPL practically disappear. The reader should note that the model is valid only within a certain range of given variables – in this case, large initial values would cause employees to underestimate threats if they discovered afterwards that the reported initial values were too high.

The basic model discussed so far can be further enhanced to reflect, for example, fatigue effects due to high SPL, or the fact that SPL is adjusted in a discrete manner. Nevertheless, it can be seen how such models gradually evolve to their mature state, how they provide rich insight into the security of IS, and how they can give valuable feedback for setting up a security policy.

Figure 7. Effects of increased IPT (RAT is divided by 10 to fit on the graph).

5. Current limitations of the model and further research directions

The part of the model that includes RAT, CPR, PT and discrepancy is based on scientifically sound research that was already mentioned above (Makridakis et al. 1984). However, the basic premise of the current model is that breaches should decline if threats are perceived in a timely manner and accurately with regard to their quantity and nature. This assumption is not likely to hold true in the majority of cases. Even if one correctly perceives threats, organisational barriers may delay installation of a patch, not to mention motivation factors. Indeed, the results of an analysis related to patch installation delays (Editorial 2002) support the latter claim (in the case of the Slapper worm, only 16 per cent of administrators installed the patch immediately, while within the first week a total of 23 per cent had installed the patch). Unfortunately, such surveys are still very rare, but they will be needed to deal more accurately with our problem.

The same holds true for the influence of SPL on CPT, and of LNO regarding IAF. Especially with the former, research in other related fields has to be taken into account, which addresses behavioural cybernetics (Haims 1998). Further, in relation to proper risk management, which has been reduced to threats in this paper, a suitable taxonomy of threats and vulnerabilities should be considered (Carayon 2003).

Regarding system dynamics, one should note that this methodology operates on an aggregate level. This brings some limitations, especially for SPL modelling. It is hard to assume that in the near future security policy management will be defined to an extent that will enable uniform treatment in terms of security policy levels. Nevertheless, ISO 17799 presents the first steps in this direction.

6. Conclusions

It has become evident during recent years that technology alone will not and cannot provide adequate security of information systems. The main and most important factor behind ensuring security is humans. In each and every IS there is a complex interplay between technology and the human factor. Therefore, appropriate methods should be available to arm those responsible for security to address these issues more rigorously.

It has been explained in this paper why and how business dynamics can be used to address these issues. The paper provides an approach that is intended to serve as a template model for managing security-related issues in real ISs. It identifies the basic variables and their relationships. Further, certain sub-parts of this model already have a sound basis, while those that need further research are clearly defined. Once the research needed to quantitatively manage the latter sub-parts has been done, the complete validation of the model will follow. But even at this stage, the model may be used as an interactive and visual tool in security awareness programs to roughly demonstrate the complex interplay between technology and organisational issues.

Summing up – the proper handling of the complex issues discussed in this paper is certainly hard and should not be left to the intuition of the participating parties. There are many other important issues that such an approach can bring to light, for example oscillations. In our case,
oscillations are originally caused by the nature of threats. But one should note that even if the original functions are not of an oscillating nature, oscillations can appear due to delays in the system. Oscillations are undesirable effects, as they require additional effort to drive the system in a certain direction, and they are irritating for the employees, since existing decisions are quickly superseded by new decisions of the opposite nature.

Certainly, business dynamics is not a panacea, but it can be a valuable tool for improving IS security, whether it is used as a stand-alone or as a complementary tool to other techniques and approaches.

References

BRITISH STANDARDS INSTITUTE, 1999, Code of practice for information security management, BS 7799, London.

Appendix

The model has been simulated with the Vensim™ package, with the simulation increment set to 0.015625 weeks:

- adjustment weight = 1, dimensionless
- change in perceived threats = (discrepancy + corrective efficiency * security policy level)/real adjustment time, units accident/(week*week)
- corrective efficiency = 1, units accident/task
- corrective intensity = 1, units task/accident
- discrepancy = SMOOTH3(threats, tDelay2) - perceived threats, units accident/week
- initially perceived threats = 0, units accident/week
- internal accident frequency = MAX((threats - perceived threats), 0), units accident/week
- intrinsic adjustment time = 1, units week
- length of normal operation = normal period parameter/(1 + internal accident frequency), units week
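For readers who wish to experiment outside Vensim, the listed appendix equations can be translated into ordinary code. The sketch below integrates them with the stated 0.015625-week step and reproduces SMOOTH3 as three cascaded first-order smooths. The values chosen here for threats, real adjustment time, security policy level, tDelay2 and the normal period parameter are illustrative assumptions only, since they are not given in this excerpt; only the equation forms come from the appendix.

```python
# Sketch of the appendix equations with Euler integration (dt = 0.015625
# weeks, as in the Vensim simulation). threats, real_adjustment_time,
# security_policy_level, t_delay2 and normal_period_parameter are
# illustrative assumptions, not values from the paper.

DT = 0.015625  # simulation increment, weeks

def smooth3(state, value, delay, dt=DT):
    """Vensim-style SMOOTH3: three cascaded first-order smooths."""
    s1, s2, s3 = state
    s1 += dt * (value - s1) / (delay / 3.0)
    s2 += dt * (s1 - s2) / (delay / 3.0)
    s3 += dt * (s2 - s3) / (delay / 3.0)
    return (s1, s2, s3)

def simulate(threats=5.0, weeks=20.0,
             real_adjustment_time=1.0,    # assumed, weeks
             security_policy_level=0.0,   # assumed constant here
             t_delay2=1.0,                # assumed, weeks
             normal_period_parameter=10.0):  # assumed, weeks
    corrective_efficiency = 1.0  # accident/task (appendix)
    perceived = 0.0              # initially perceived threats = 0 (appendix)
    smooth_state = (0.0, 0.0, 0.0)
    for _ in range(int(weeks / DT)):
        smooth_state = smooth3(smooth_state, threats, t_delay2)
        # discrepancy = SMOOTH3(threats, tDelay2) - perceived threats
        discrepancy = smooth_state[2] - perceived
        # change in perceived threats, per the appendix
        change = (discrepancy
                  + corrective_efficiency * security_policy_level) / real_adjustment_time
        perceived += DT * change
        # internal accident frequency = MAX(threats - perceived threats, 0)
        iaf = max(threats - perceived, 0.0)
    # length of normal operation, per the appendix
    lno = normal_period_parameter / (1.0 + iaf)
    return perceived, iaf, lno
```

With a constant threat level, the perceived threats settle on the real value, the internal accident frequency falls towards zero, and the length of normal operation approaches the normal period parameter – the behaviour the text describes for a system without delayed information.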