
This article was downloaded by: [Northeastern University]

On: 03 December 2014, At: 08:09


Publisher: Taylor & Francis
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House,
37-41 Mortimer Street, London W1T 3JH, UK

Behaviour & Information Technology


Publication details, including instructions for authors and subscription information:
http://www.tandfonline.com/loi/tbit20

Information systems security and human behaviour


Denis Trček (a), Roman Trobec (a), Nikola Pavešić (b) & J. F. Tasič (b)

(a) Department of Digital Communications and Networks, Jožef Stefan Institute, Ljubljana, Slovenia
(b) Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia
Published online: 14 May 2007.

To cite this article: Denis Trček , Roman Trobec , Nikola Pavešić & J. F. Tasič (2007) Information systems security and human
behaviour, Behaviour & Information Technology, 26:2, 113-118, DOI: 10.1080/01449290500330299

To link to this article: http://dx.doi.org/10.1080/01449290500330299

Behaviour & Information Technology, Vol. 26, No. 2, March – April 2007, 113 – 118

Information systems security and human behaviour

DENIS TRČEK*†, ROMAN TROBEC†, NIKOLA PAVEŠIĆ‡ and J. F. TASIČ‡

†Department of Digital Communications and Networks, Jožef Stefan Institute, Ljubljana, Slovenia
‡Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia

Until recently, most of the effort for providing security in information systems has been focused on technology. However, it has turned out during recent years that human factors play a central role. Therefore, to ensure appropriate security in contemporary information systems, it is necessary to address not only technology-related issues, but also human behaviour and organisation-related issues that are usually embodied in security policies. This paper presents a template model, which is intended to support risk management for information systems, and which concentrates on human factors. The model is based on business dynamics, which provide the means for qualitative and quantitative treatment of the above-mentioned issues.

Keywords: Information systems; Security policy; Human behaviour; Business dynamics; Modeling and simulation

1. Introduction

For a better understanding of the goals of this paper, a short historical overview of computer-based information systems (ISs) security is needed.

At the start of the 1960s, efforts were concentrated on cryptographic algorithms and secure operating systems (see e.g. Denning 1982). Afterwards, an epoch followed in the mid-1980s that was marked by the penetration of computer communications, which shifted the emphasis towards cryptographic protocols (a typical representative in the literature of this period is Schneier 1996). In parallel with this, human factors were addressed in standards for the first time (US Department of Defense 1983). The third period started in the mid-1990s with the penetration of the Internet into the business sphere, causing transformation of existing ISs into networked, or better, web-centric ISs.

In this last period security issues came to the frontline, because they were essential, due to the tight coupling of business processes with information technology (IT). They shifted the emphasis towards organisational and human behaviour-related issues. It became clear that these issues were probably more important than the classical, technological ones (Schneier 2000, Denning 1999). This was also recognised by standardisation bodies (the basic standard in this area is BS 7799 (BSI 1999), which was recognized by ISO as an international standard in 2000). Nowadays, IS security is not even just a question of technology and organisation, but also of legislation, dependency and other issues (Trček 2003).

Until now the above issues have been mainly dealt with qualitatively. However, this paper presents an approach that paves the road towards quantitative treatment of IS security, addressing technology, human factors and organisation. There is a short overview of business dynamics in the second section, while the third section presents the model. Its simulation and discussion are given in the fourth and fifth sections, and there is a conclusion at the end.

2. From formal tools to business dynamics

For the analysis of security in IT environments, formal methods have played an important role (notable examples are Z (Spivey 1989) and BAN logic (Burrows et al. 1990)). These methods are mostly based on discrete maths and logic. However, they are not sufficient for a holistic treatment of IS security with an emphasis on the human factor, where one needs to model information flows, their perception, the complex interplay between IT and humans, etc.

*Corresponding author. Email: denis.trcek@ijs.si

ISSN 0144-929X print/ISSN 1362-3001 online © 2007 Taylor & Francis
http://www.tandf.co.uk/journals
DOI: 10.1080/01449290500330299

And this is where business dynamics comes in (Sterman 2000). It has its origins in the field of engineering, from where the basic ideas have been tailored to the needs of management sciences. Business dynamics addresses people, processes, material and information flows by emphasising the importance of feedback loops. According to business dynamics, these feedback loops are the major cause of the behaviour of systems. Further, this methodology also provides the means to address non-linear system dynamics, which governs many real-life phenomena.

The system modelling starts with a qualitative approach. Through iteration stages, the model is formed with the use of causal loops. Therefore, these models are also called causal loop diagrams, which consist of variables connected with appropriate links. Links have positive polarity if an increase of the causal variable results in an increase of the consecutive variable. If the output variable is decreased by an increase of the input variable, the polarity is negative. Among variables, there is a distinct set called levels or accumulators. Levels play a special role in the system. First, they are the source of inertia. Second, they constitute a kind of primitive memory within the system – an aggregate of past events. Third, they serve as absorbers, and decouple inflows from outflows (flows or rates are another important kind of variable, which are coupled with levels). Thus when a certain variable has the above-mentioned properties, it has to be addressed explicitly as a level.

Having a causal loop diagram, one gets a holistic view of the system, which enables a better understanding of the basic principles of its functioning. Such a model can be further developed, or better, enhanced into a quantitative one. This means that concrete relationships between variables are established by the introduction of appropriate equations. The equations often have to contain translation parameters or scaling factors to tune the system so that real input data closely match values of the real output data.

3. Derivation of the model

As described above, business dynamics methodology is of general applicability for socio-technical systems. ISs certainly fall into this domain and it makes sense to think about them in a systemic way when designing and implementing such systems (Dutta 2003). Additionally, IS security with an emphasis on human factors is an even more appropriate area for the use of business dynamics, although its use in this particular field is rather new and rare – an example of a pioneering attempt is given in Gonzalez 2003.

In the next section we will present a model for security policy management with an emphasis on human behaviour. The model includes essential variables, such as risks, perception of risks (i.e. adaptation processes), intensity of security-related tasks and security policy levels. Further, the model is simulated in a deterministic way, i.e. no variables of a stochastic nature are used. This is done intentionally to better present the main patterns and behaviour of the system. Of course, stochastic variables can be used and sensitivity analysis performed, which is especially necessary for risk analysis.

3.1 Modelling human perception

The central point is the modelling of human behaviour and real risks, where human perceptions are the most challenging part. They are often modelled as adaptive learning processes with exponential smoothing, which is also common in economic models.

In adaptive expectations, the rate of change in a perceived value X* is given by a first-order, linear differential equation dX*/dt = (X - X*)/D, where X is the reported input variable (current value), while D is a time constant, also called adjustment time. It has been proven that adaptive expectations frequently outperform other methods for forecasting (Makridakis et al. 1984) and they will therefore be included in our modelling process.

A reader should note that belief is modelled as a stock, because perception is a state of mind that remains unchanged unless there is a reason for a change – in the case of adaptive expectations this change is the discrepancy between the real and perceived value.

3.2 Threats modelling

3.2.1 Ordinary users and threats. Regarding threats, the generic model (threats generator) is shown in figure 1. The explanation is as follows. Threats actually appear at a certain rate, and once they have appeared, they remain on the scene for a certain period of time. Changes in technology cause threats to cease sooner or later, but more frequently, improvements of technology (e.g. software patches) or enhanced operating procedures eliminate them. In every case, new threats are added to existing ones and the rate of newly generated threats is decoupled from the

Figure 1. Generic model of threats.
rate of risk extinction. This means that, qualitatively speaking, threats are to be modelled as accumulators.

In our case it will be assumed that a fixed number of risks is generated at the beginning of each week. It is further assumed that the attackers behind those risks (bug finders) have limited capabilities, i.e. that people perform attacks at a constant rate, which means that they successfully attack a certain number of systems per week. This situation lasts for a certain time, until a patch is released, and it is assumed that all organisations install this patch immediately. Risks then cease, and another cycle begins with the set of newly discovered vulnerabilities in the software. Thus the threat generation rate and the threat extinction rate are modelled with a sequence of pulses, where the exact values are determined by the number of attacks. The scaling factor is used for fine adjustment to real data, while the corresponding equations can be found in the appendix.

3.2.2 IT professionals and threats. Although the above model of threats will be used as a direct driving factor for human behaviour in our central model, it should be emphasised that IT professionals have a different view on threats than ordinary users.

This view only starts with threats (see figure 2). Threats are further elaborated from the perspective of their probability of realisation with respect to a particular asset. Further, each asset has to be judged in terms of its value and vulnerability to a particular threat. This altogether leads to risks that serve as a basis for countermeasures, where some residual risk has to be taken into account. Thus risks actually present the driving force of the central model.

Figure 2. From threats to risks.

3.3 Human behaviour-based IS security model and its explanation

To address security policy management one has to start with threats (TR), which are at the heart of the model as discussed above. The second basic variable is perceived threats (PT), where the discrepancy between it and TR drives the rate of adaptation, i.e. change in perceived threats (CPT). Certainly, internal accident frequency (IAF) is influenced by PT – the higher the perception of threats, the more intensive are the efforts of the personnel to block threats. The main lever that resides in the hands of the management is security policy level (SPL), being independently driven by corrective intensity (CI). SPL is another external factor that drives change in perceived threats (CPT). Additionally, CPT is driven by real adjustment time (RAT), which is further divided into intrinsic adjustment time (IAT) and length of normal operation (LNO). The reason is that RAT depends on particular circumstances. Assume there is a longer period without any accidents. Now when a new accident takes place, expectations are based on past experiences, which means that people perceive this event more or less as an isolated one. It is quite a different situation if one experiences attack after attack for a longer period of time. In this case, one will assume a similar frequency of accidents when the next strike takes place, even if it belongs to the beginning of a period with a smaller frequency of attacks.

Last but not least, there are two delays to denote the fact that security policy level, as well as discrepancy, is always subject to delayed information. Due to these two variables, the effects of delays can be analysed. The whole model is presented in figure 3.

Figure 3. Modelling IS security with emphasis on human behaviour.
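The pulse-driven threats generator of section 3.2.1 can be sketched numerically. The following is an illustrative Python port, not the authors' Vensim code; the function and variable names and the Euler integration scheme are our own, while the parameter values (scaling factor 0.1, one attack per week, pulses every second week, increment 0.015625 weeks) are taken from the appendix:

```python
# Illustrative sketch (not the authors' code): the threats stock of figure 1,
# driven by a narrow generation pulse every two weeks and an extinction pulse
# offset by one week (e.g. a patch removing the vulnerability).

def pulse_train(t, start, width, between, end):
    """Vensim-style PULSE TRAIN: 1.0 inside each repeating pulse, else 0.0."""
    if t < start or t > end:
        return 0.0
    return 1.0 if (t - start) % between < width else 0.0

dt = 0.015625                  # simulation increment from the appendix, weeks
attacks, scale = 1.0, 0.1      # number of attacks, scaling factor
threats = 0.0                  # the stock: INTEG(generation - extinction, 0)
history = []
for i in range(int(30 / dt)):  # 30 simulated weeks
    t = i * dt
    generation = (attacks / scale) * pulse_train(t, 0.0, scale, 2.0, 50.0)
    extinction = (attacks / scale) * pulse_train(t, 1.0, scale, 2.0, 50.0)
    threats += dt * (generation - extinction)
    history.append(threats)
# The stock jumps up at even weeks and falls back to zero at odd weeks:
# threats accumulate, persist for a while, and are then eliminated.
```

Because the stock integrates the two rates separately, the generation rate stays decoupled from the extinction rate, which is exactly why threats have to be modelled as an accumulator rather than as a plain flow.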
The model can also be seen from the loops perspective, where three balancing loops can be noted:

1. The upper left loop is the loop of perceived risk (PT, discrepancy, CPT).
2. There is a trust loop in the upper right corner (PT, IAF, LNO, RAT, CPT). It represents the trust of humans (employees) in the system according to experienced operational patterns, which influence RAT.
3. The last one is the adjustment loop (PT, IAF, SPL, CPT). It is positioned at the bottom and it models adjustment of perceived risk caused by the management through SPL.

4. Simulating and analysing IS security

Taking into account the lack of experimental data, general patterns will be presented. In addition, the simulation is, as mentioned, of a deterministic nature. Of course, this does not prevent the reader from doing stochastic-based analysis, if needed. It is certainly feasible to play with stochastic variables within business dynamics, but to better expose fundamental dependencies and patterns for more transparent recognition of decisions on IS security, it is desirable to start driving the model with a deterministic input.

According to the explanations in the previous section, the generator of the whole behaviour is assumed to be threats, which are modelled with a narrow pulse sequence (a rough approximation of a series of Dirac delta functions). The complete period for all simulations that follow is 30 weeks. Attacks appear every second week with a frequency of one per week. In the initial model, IPT (initially perceived threats) is set to zero, while all other variables are set to 1. The simulation reveals that, under the given conditions, there is a transition period of four weeks before the system reaches equilibrium. This is the time needed for the human factor to adjust PT to real threats. One can also note an oscillating security policy, which is not a desired effect, as permanently changing orders lead to resistant behaviour of employees. The situation is shown in figure 4.

Figure 4. System behaviour with initial values (the unit on the horizontal axis is week; on the vertical axis the unit for IAF is accident/week, for PT accident/week, for RAT week, for SPL task/week, and for TR accident/week).

Let us now study the effects of delayed propagation of information about successful breaches that form inputs to discrepancy and SPL. As expected, delays increase the rate of successful breaches and the system reaches equilibrium only after approximately 18 weeks. In the meantime, there are strong oscillations of SPL, which disappear after this period. The explanation is that the system is not over-sensitive in terms of reaction, but it 'waits for a situation to become more clear' before reaction takes place. Put another way, perceived threats become more moderate and discrepancies are not as large as before.

Figure 5. Patterns obtained by increasing the values of information delays (tDelay1 and tDelay2).

Figure 6. Patterns that are a consequence of larger IAT (RAT is divided by 10 to fit on the graph).

Now assume the situation where humans adapt slowly to changes, which is reflected in high values for RAT (this value is 10 in figure 6). One would expect similar effects to those in figure 5. Indeed, SPL oscillates even more
intensively, due to slow adaptation of PT, which, on the other hand, also results in a higher level of successful attacks. The security manager can decide to influence the whole situation by increasing corrective intensity and, consequently, SPL. The simulation proves these expectations.

The security manager can further investigate the system by posing various questions, e.g. 'Would a change of initially perceived threats result in better performance under the current worst conditions, given in figure 6?' The answer is – yes. It pays off to report to security administrators initial values for threats that are slightly higher than the real ones (see figure 7). An interesting side-effect of this measure is that oscillations in SPL practically disappear. The reader should note that the model is valid only within a certain range of given variables – in this case, large initial values would cause employees to underestimate threats, if they discovered afterwards that they were reporting initial values that were too high.

Figure 7. Effects of increased IPT (RAT is divided by 10 to fit on the graph).

The basic model discussed so far can be further enhanced to reflect, for example, fatigue effects due to high SPL or the fact that SPL is adjusted in a discrete manner. Nevertheless, it can be seen how such models gradually evolve to their mature state, how they provide rich insight into the security of ISs and how they can give valuable feedback for setting up a security policy.

5. Current limitations of the model and further research directions

The part of the model that includes RAT, CPT, PT and discrepancy is based on scientifically sound research that was already mentioned above (Makridakis et al. 1984). However, the basic premise of the current model is that breaches should decline if threats are perceived in a timely manner and punctually regarding their quantity and nature. This assumption is not likely to hold true in the majority of cases. Even if one correctly perceives threats, organisational barriers may delay installation of a patch, not to mention motivation factors. Indeed, the results of an analysis related to patch installation delays (Editorial 2002) support the latter claim (in the case of the Slapper worm, only 16 per cent of administrators installed the patch immediately, while in the first week a total of 23 per cent installed the patch). Unfortunately, such surveys are still very rare, but will be needed to deal with our problem more accurately. The same holds true for the influence of SPL on CPT and of LNO regarding IAF. Especially with the former, research in other related fields has to be taken into account, which addresses behavioural cybernetics (Haims 1998). Further, in relation to proper risk management, which has been reduced to threats in this paper, a suitable taxonomy of threats and vulnerabilities should be considered (Carayon 2003).

Regarding system dynamics, one should note that this methodology operates on an aggregate level. This brings some limitations, especially for SPL modelling. It is hard to assume that in the near future security policy management will be defined to an extent that will enable uniform treatment in terms of security policy levels. Nevertheless, ISO 17799 presents the first steps in this direction.

6. Conclusions

It has become evident during recent years that technology alone will not and cannot provide adequate security of information systems. The main and most important factor behind ensuring security is humans. In each and every IS there is a complex interplay between technology and the human factor. Therefore, appropriate methods should be available to arm those responsible for security to address these issues more rigorously.

It has been explained in this paper why and how business dynamics can be used to address these issues. The paper provides an approach that is intended to serve as a template model for managing security-related issues in real ISs. It identifies the basic variables and their relationships. Further, certain sub-parts of this model already have a sound basis, while those that need further research are clearly defined. Once the research needed to quantitatively manage the latter sub-parts has been done, the complete validation of the model will follow. But even at this stage, the model may be used as an interactive and visual tool in security awareness programs to roughly demonstrate the complex interplay between technology and organisational issues.

Summing up – proper handling of the complex issues discussed in this paper is certainly hard and should not be left to the intuition of participating parties. There are many other important issues that such an approach can bring to light, for example oscillations. In our case,
oscillations are originally caused by the nature of threats. But one should note that even if the original functions are not of an oscillating nature, oscillations can appear due to delays in the system. Oscillations are undesirable effects, as they require additional efforts to drive the system in a certain direction, and they are irritating for the employees, since existing decisions are quickly superseded by new decisions of the opposite nature.

Certainly, business dynamics is not a panacea, but it can be a valuable tool for improving IS security, whether it is used as a stand-alone or as a complementary tool to other techniques and approaches.

References

BRITISH STANDARDS INSTITUTE, 1999, Code of practice for information security management, BS 7799, London.
BURROWS, M., ABADI, M. and NEEDHAM, R.M., 1990, A Logic of Authentication. ACM Transactions on Computer Systems, 8, February, pp. 18–36.
CARAYON, P. and KRAEMER, S., 2003, Human Factors in E-security: The business viewpoint (Madison, WI: University of Wisconsin-Madison, Centre for Quality and Productivity Improvement), Technical Report 184, November.
DENNING, D.E., 1982, Cryptography and Data Security (Reading, MA: Addison-Wesley).
DENNING, D.E., 1999, Information Warfare and Security (New York, NY: Addison-Wesley).
DUTTA, A., 2003, Managing IT with systems thinking. IEEE Computer, 36, March, pp. 96–97.
EDITORIAL, 2002, System administrators patch too late. Network Security, 12, p. 3.
GONZALEZ, J.J. (Ed.), 2003, From Modeling to Managing Security, Research series no. 35 (Kristiansand: Hoyskole Forlegat AS).
HAIMS, C.M. and CARAYON, P., 1998, Theory and practice for the implementation of 'in-house' continuous improvement participatory ergonomic programs. Applied Ergonomics, 29, December, pp. 461–472.
MAKRIDAKIS, S., ANDERSEN, A., CARBONE, R., FILDES, R., HIBON, M., LEWANDOWSKI, R., NEWTON, J., PARZEN, E. and WINKLER, R., 1984, The Forecasting Accuracy of Major Time Series Methods (New York: John Wiley & Sons).
SCHNEIER, B., 1996, Applied Cryptography (New York: John Wiley & Sons).
SCHNEIER, B., 2000, Secrets and Lies: Digital Security in a Networked World (New York: John Wiley & Sons).
SPIVEY, J.M., 1989, The Z Notation: A Reference Manual (London: Prentice-Hall).
STERMAN, J.D., 2000, Business Dynamics (Boston, MA: McGraw-Hill).
TRČEK, D., 2003, An integral framework for information systems security management. Computers & Security, 22, June, pp. 337–359.
US DEPARTMENT OF DEFENSE, 1983, Trusted Computer System Evaluation Criteria, DoD standard CSC-STD-001-83.

Appendix

This appendix gives a complete list of the equations used in the simulations described in this paper. The model has been simulated with the Vensim™ package with the simulation increment set to 0.015625 weeks:

- adjustment weight = 1, dimensionless
- change in perceived threats = (discrepancy + corrective efficiency * security policy level) / real adjustment time, units accident/(week*week)
- corrective efficiency = 1, units accident/task
- corrective intensity = 1, units task/accident
- discrepancy = SMOOTH3(threats, tDelay2) - perceived threats, units accident/week
- initially perceived threats = 0, units accident/week
- internal accident frequency = MAX(threats - perceived threats, 0), units accident/week
- intrinsic adjustment time = 1, units week
- length of normal operation = normal period parameter / (1 + internal accident frequency), units week
- normal period parameter = 1, units accident
- number of attacks = 1, units accident/week
- perceived threats = INTEG(change in perceived threats, initially perceived threats), units accident/week
- real adjustment time = intrinsic adjustment time + (length of normal operation / adjustment weight), units week
- scaling factor = 0.1, units week
- security policy level = corrective intensity * (SMOOTH3(threats, tDelay1) + internal accident frequency), units task/week
- tDelay1 = 1, units week
- tDelay2 = 1, units week
- threat extinction rate = (number of attacks / scaling factor) * PULSE TRAIN(1, scaling factor, 2, 50), units accident/(week*week)
- threat generation rate = (number of attacks / scaling factor) * PULSE TRAIN(0, scaling factor, 2, 50), units accident/(week*week)
- threats = INTEG(threat generation rate - threat extinction rate, 0), units accident/week

The SMOOTH3 function returns a third-order smooth of the input (it is intended for information delays), the MAX function returns the larger of two values, and PULSE TRAIN returns 1 starting at time START and lasting for interval WIDTH, repeating this pattern every TBETWEEN time units (0 is returned at all other times). Finally, INTEG means integration.
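For readers who want to experiment outside Vensim, the equation list above can be ported almost mechanically. The sketch below is our own assumption-laden Python rendering, not the authors' model file: SMOOTH3 is rebuilt as a cascade of three first-order exponential smooths (each with one third of the delay time, matching its documented Vensim behaviour), and the stocks are integrated with plain Euler steps at the appendix increment:

```python
# Our own Python port of the appendix equations (a sketch, not the authors'
# Vensim model). Parameter values follow the list above; all names are ours.

DT = 0.015625            # simulation increment, weeks
WEEKS = 30               # complete simulation period

class Smooth3:
    """Third-order information delay: three chained first-order smooths,
    each with one third of the total delay time (Vensim's SMOOTH3)."""
    def __init__(self, initial, delay_time):
        self.stages = [initial] * 3
        self.tau = delay_time / 3.0
    def step(self, value):
        for k in range(3):
            self.stages[k] += DT * (value - self.stages[k]) / self.tau
            value = self.stages[k]
        return value

def pulse_train(t, start, width, between, end):
    """Vensim's PULSE TRAIN: 1.0 inside each repeating pulse, else 0.0."""
    if t < start or t > end:
        return 0.0
    return 1.0 if (t - start) % between < width else 0.0

# Constants from the appendix (all 1 except the scaling factor).
adjustment_weight = corrective_efficiency = corrective_intensity = 1.0
intrinsic_adjustment_time = normal_period_parameter = 1.0
number_of_attacks, scaling_factor = 1.0, 0.1
tDelay1 = tDelay2 = 1.0

threats = 0.0                        # INTEG(generation - extinction, 0)
perceived = 0.0                      # initially perceived threats = 0
spl_smooth = Smooth3(0.0, tDelay1)   # delayed threats input to SPL
disc_smooth = Smooth3(0.0, tDelay2)  # delayed threats input to discrepancy
log = {"threats": [], "perceived": [], "spl": []}

for i in range(int(WEEKS / DT)):
    t = i * DT
    generation = (number_of_attacks / scaling_factor) * pulse_train(t, 0, scaling_factor, 2, 50)
    extinction = (number_of_attacks / scaling_factor) * pulse_train(t, 1, scaling_factor, 2, 50)
    iaf = max(threats - perceived, 0.0)           # internal accident frequency
    lno = normal_period_parameter / (1.0 + iaf)   # length of normal operation
    rat = intrinsic_adjustment_time + lno / adjustment_weight
    spl = corrective_intensity * (spl_smooth.step(threats) + iaf)
    discrepancy = disc_smooth.step(threats) - perceived
    cpt = (discrepancy + corrective_efficiency * spl) / rat
    threats += DT * (generation - extinction)     # Euler step of both stocks
    perceived += DT * cpt
    log["threats"].append(threats)
    log["perceived"].append(perceived)
    log["spl"].append(spl)
```

Plotting log["perceived"] against log["threats"] reproduces the qualitative pattern described for figure 4: a transition period of a few weeks while PT adjusts to the pulsed threats, followed by a repeating, oscillating regime.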