Vive la diversité! High Reliability Organisation (HRO) and Resilience Engineering (RE)

Jean-Christophe Le Coze
Safety Science
journal homepage: www.elsevier.com/locate/ssci
Article info

Article history:
Received 29 May 2015
Received in revised form 14 February 2016
Accepted 8 April 2016
Available online xxxx

Keywords:
High reliability organisation
Resilience engineering
Research traditions
Complementarities
Nuances

Abstract

High Reliability Organisation (HRO) and Resilience Engineering (RE) are two research traditions which have attracted a wide and diverse readership in the past decade. Both have reached the status of central contributions to the field of safety while sharing a similar orientation. This is not without creating tensions or questions, as expressed in the call of this special issue. The contention of this article is that these two schools introduce ways of approaching safety which need to be reflected upon in order to avoid simplifications and hasty judgments about their relative strengths, weaknesses or degree of overlap. HRO has gained strength and legitimacy from (1) studying high-risk systems ethnographically, with an organisational angle, (2) debating the principles producing organisational reliability in the face of high complexity and (3) conceptualising some of these principles into a successful generic model of "collective mindfulness", with both practical and theoretical success. RE has gained strength and legitimacy from (1) harnessing then deconstructing, empirically and theoretically, the notion of 'human error', (2) arguing for a system (and complexity) view of and discourse about safety/accidents, and (3) supporting this view with the help of (graphical) actionable models and methods (i.e. the engineering orientation). In order to show this, one has to go beyond the past 10 years of RE and include a longer time frame going back to the 1980s and the early days of Cognitive Engineering (CE). The approach followed here therefore has a strong historical orientation, as a way to better understand the present situation, profile each school and promote complementarities while maintaining nuances.

© 2016 Published by Elsevier Ltd.
1. Introduction

Safety can be an intriguing topic for researchers, practitioners and outsiders. As with the study of 'science', it is approached by an array of disciplines with very diverse orientations, and across industries (Le Coze et al., 2014). From an academic point of view, if one uses publications and the birth of journals as an indication, safety as a scholarly topic is not a very old endeavour (e.g. Hale, 2014): it is about 30–40 years old. Reflecting on its origins and journey is today arguably justified by the fact that companies, states, civil societies and researchers have in recent years been facing major changes revolutionising our lives: globalisation processes, including accelerated technological developments, coupled with a new ecological awareness of our relationship with nature, which trigger the need for a renewal of our established worldviews (Atlan and Pol Droit, 2014).

The world has indeed been evolving over the past thirty years under processes of globalisation and ecological awareness. This has not been without creating new contexts for high-risk systems. Regulations, civil society, nation states, private and public organisations, labour, the workforce and new technological capabilities have indeed evolved to shape new operating constraints. Keeping the nuances in mind, namely understanding the degree of overlap but also the nature of and reasons for differences anchored in disciplines and research traditions, helps keep track of and maintain a broad sense of the transformations which affect several layers or dimensions of safety-critical operations. Rich and interesting pictures start to take shape when these nuances are kept in mind.

High Reliability Organisation (HRO) and Resilience Engineering (RE) are two of these research traditions, rooted in different histories, disciplines, networks and intellectual styles and promoted by authors from the US and Europe. These two schools have attracted a wide and diverse readership, and both have reached the status of central contributions to the field of safety. But this is not without creating tensions or questions. RE is introduced as a 10-year-old movement and seems to promote a similar orientation to HRO, which is 30 years old; so, what is new here? What are their differences? Why use new vocabulary for tackling similar phenomena?
In this article, I come back over thirty years of development, indicating key authors and key contributions on both sides of HRO and RE.

E-mail address: jean-christophe.lecoze@ineris.fr
http://dx.doi.org/10.1016/j.ssci.2016.04.006
0925-7535/© 2016 Published by Elsevier Ltd.
Please cite this article in press as: Le Coze, J.C. Vive la diversité! High Reliability Organisation (HRO) and Resilience Engineering (RE). Safety Sci. (2016),
http://dx.doi.org/10.1016/j.ssci.2016.04.006
It is interesting to note that what is explored here with this special issue of Safety Science, confronting the two schools, could have taken place earlier. The two schools acknowledged each other very early on, in the 1980s, and then subsequently. It is, however, only in the current context that more explicit explorations seem to be required.

I want to warn immediately that any historical retrospective such as this one, using only a restricted amount of space, entails at least two types of simplification of the past. First, it has to restrict, in the text, to a certain degree the level of depth of the empirical, methodological and conceptual developments in each school. Second, it must also restrict the presentation of the range of intellectual sensitivities among the selected authors within the 'boundaries' of two research traditions such as HRO and RE. These two schools indeed include a range of authors who, despite being grouped under the banner of one camp, HRO or RE, have slightly different interests and theoretical orientations.

La Porte, Roberts, Rochlin and Weick are names associated with the HRO history, each with slightly different approaches and interests because of their backgrounds (e.g. social psychology, political science, organisational psychology, history). Rasmussen, Reason, Woods, Hollnagel, Leveson and Dekker are the main authors associated with the development of RE, and here again, they each have their own stances (e.g. psychology, ergonomics, cognitive engineering, system safety). This article does not delve into this level of detail because of space constraints, although the reader needs to keep it in mind.

Despite this diversity of authors and specificities, I nevertheless want to characterise what binds them within each research tradition, in order to reach the level of generality needed to better delineate two different orientations: one, RE, rooted in the movement of engineering, human factors, ergonomics and cognitive (system) engineering; the other, HRO, in the social sciences, more specifically social psychology, organisational psychology, sociology and political science. Finally, I suggest that it is also appropriate to consider looking into the complementarities of the two schools for the study of high-risk systems.

The guiding idea behind this article is that we should celebrate diversity in styles, methods, concepts and purposes in the field of safety, even if it requires at times that the authors involved help outsiders, whether scholars or practitioners, to see the bigger picture. I do not see this situation as a sign of the poor status of a scientific 'discipline' but instead as a sign of its vitality, hence the Feyerabendesque title (Feyerabend, 1975) of this article: "Vive la diversité!" To end this introduction, I would like to reassert that exploring the relationship between HRO and RE is an opportunity to reflect upon safety as a practical, scientific and academic field more generally. It opens a window of opportunity to investigate the diversity of possible approaches to safety, and helps reveal preconceptions that one needs to be aware of when involved in safety research. Although central, RE and HRO are not the only research traditions in the field.

For this purpose, I proceed as follows. In a first section, I indicate the main authors, arguments, debates, articles and books produced in each school.1 It is in this section that RE will be explicitly linked to the history of human factors (HF) and cognitive (system) engineering (CE), from which it derives directly. The idea behind this historical approach is to show that the two schools developed in parallel to each other, with a certain degree of autonomy shaped by different orientations and disciplinary backgrounds, over the past 30 years. This is clear when proceeding historically but, because the two stories are rarely put together, it is not always obvious to researchers.

1 To reassert it, I can only be brief when it comes to introducing concepts, given space constraints, but this special issue of Safety Science certainly requires a minimum degree of familiarity on the reader's part with at least one of the two schools. If not, readers are advised to select some of the articles indicated below (Table 1), delve into them, then come back to this article.

The reason for this is specialisation. One knows one's own field very well (e.g. cognitive engineering or organisational theory) but not always very much beyond it. Socialisation within the paradigms of disciplines, universities, journals and publications, and the cognitive costs of moving from one field to another, are elements of explanation. The main purpose of this article is therefore to overcome this situation and lay out very clearly the main features of these two histories. This first step provides the background and material for the second section, which consists in analysing and comparing some selected aspects of the two schools as well as their relationships. A final section elaborates further on the conceptual, practical and axiological diversity of the two schools.

2. Histories

2.1. High reliability organisations

2.1.1. Origins, initial empirical and theoretical insights

The story of the HRO school is now fairly well known in the safety literature. La Porte, a political scientist (La Porte, 1975), produced the core research program in a chapter formulating the problem of 'nearly error free' operations (La Porte, 1982), in a book edited by social scientists following TMI (1979). Together with Roberts, an organisational psychologist (Roberts et al., 1978), and Rochlin, a physicist with a historical and social approach to technology (Rochlin, 1974), they spent a few years, from the mid 1980s onwards, carrying out ethnographic work to address the theoretical issue of 'nearly error free' operations. How can personnel of high-risk systems, such as nuclear power plants, learn when mistakes are not permitted? Whereas trial and error is a welcome strategy, desirable for progress in many areas of social life (e.g., education), making errors in these safety-critical artefacts would entail such potential consequences that it has to be, somehow (that is the question), strictly limited.

How does this "nearly error-free" requirement translate concretely into the daily mode of operations? What are the processes which support safe operations? Paul Schulman, a political scientist (Schulman, 1980), Karl Weick, a social psychologist (Weick, 1979), and also, a bit later, Mathilde Bourrier, an organisational sociologist, joined the group's endeavour (Bourrier, 1999, 2001). Within 10–15 years, during the 1980s and 90s, a series of communications, chapters, articles and books were produced to delineate the contours of what has become this research tradition. Produced collectively (e.g. Rochlin et al., 1987; La Porte and Consolini, 1991; Roberts, 1993; La Porte and Rochlin, 1996; Roberts and Rousseau, 1989; Grabowski and Roberts, 1997; Rochlin, 1996; Grabowski and Roberts, 1999; Weick and Roberts, 1993; Weick et al., 1999; Bourrier, 2001; Weick and Sutcliffe, 2003) or individually (e.g. Roberts, 1989, 1990, 1993; Weick, 1987, 1989, 1990, 1993; Schulman, 1993a, 1993b; Rochlin, 1989, 1993, 1999; Bourrier, 1999), these pieces introduced outcomes and defined concepts, but also sources of tensions and debates. The journals where this body of work has been published include primarily the Journal of Contingencies and Crisis Management, Organisation Science, Public Administration, Administrative Science Quarterly and the California Management Review but also, more recently, the Journal of Organisational Behaviour.

One important aspect of this research is that studying empirically the operations of high-risk systems convinced many of these researchers of the theoretical developments needed to apprehend these operations. Aircraft carriers, nuclear power plants and air traffic control stood apart from established organisational
theory (e.g. Scott, 1981) because of their complex modes of operating and unforgiving technological and socio-political contexts. One contention was that what was available in the literature was inadequate to address the complexities observed, hence the need for ad hoc developments.

This is particularly clear in Roberts (1989, p. 161): "For a variety of reasons the organizational literature fails to deal specifically with either hazardous organizations or with the subset in that category which might be defined as engaging in extremely high levels of performance reliability" (Roberts, 1990). La Porte, Roberts, Rochlin and Weick elaborated this rationale by stressing the strong cognitive and social requirements entailed by these operations.

A series of concepts, now well known in the field, was established to frame and theorise observations for the purpose of understanding the production of highly reliable operations: 'self-adaptive features of networks', 'having the bubble', 'heedful interactions', 'attention to failures and learning', 'socialising processes emphasising safety', etc. They capture sociocognitive processes established in daily interactions, sometimes across the hierarchical structures of organisations. One example is the specific dynamic and temporal property observed in several different settings (e.g., air traffic control, aircraft carriers).

Observations indeed show how patterns of interaction depend on contextual features of situations, triggering the reconfiguration of, for instance, formal authorities to satisfy real-time problem-solving capabilities. These micro–meso layers of analysis were also complemented by macro interests in the challenges of both regulatory and civil society demands, raising issues of institutional trust and transparency (La Porte and Rochlin, 1996). The approach was clearly multidimensional, reflecting both the diversity of researchers' backgrounds and reliability/safety as an object of investigation.

One could not introduce HRO without mentioning the central debate about Charles Perrow's thesis (Perrow, 1984) that tight coupling and high complexity could lead to 'normal accidents'. If HRO researchers found little in the organisational literature to interpret their empirical findings, as explained above, one exception was Perrow's contribution.2

2 The fact that Perrow, a leading author in organisational sociology (Perrow, 1970, 1986), had to develop specific analytical lenses for understanding disasters (for an overview see Le Coze, 2015a) only reinforced the idea that these kinds of organizations stood apart from what was already available in the literature.

But it was a retrospective approach, based not on empirical studies conducted by the author himself but on secondary data. It nevertheless provided a background against which to discuss the observations and concepts of the Berkeley group. Despite their reluctance to see themselves as the official opponents of the 'Normal Accident' thesis, both Sagan and Perrow argued otherwise (Sagan, 1993; Perrow, 1994), and this fed an intense debate echoed more widely in the organisational literature (Scott, 2003).

Although this opposition has simplified the many nuances of the HRO tradition (see for instance the proposition of 'reliability seeking organisations' by Rochlin (1993)), it also created the conditions for delineating a new field of investigation more broadly within the social sciences (and particularly organisational theory), with its own empirical, methodological and conceptual issues.

The wave of disasters of the 1980s (Chernobyl, 1986; Challenger, 1986; Bhopal, 1984; Piper Alpha, 1988; etc.) amplified the managerial, social and political relevance of these studies and debates.

2.1.2. Establishing practically and theoretically the centrality of "mindfulness"

In this context, the book by Weick and Sutcliffe (2003), following an earlier article in 1999 (Weick et al., 1999), served as an important platform for these ideas beyond scholars. Operators, managers and regulators in a diversity of high-risk or safety-critical organisations, including fire fighting, healthcare and the petrochemical industry, found interest in the properties of high reliability for managerial and regulatory purposes. By combining five processes in the integrative and conceptualised framework of 'collective mindfulness', the authors indeed established a generic model with a normative flavour that, beyond its empirical domain, appealed to a wide audience. This side of the HRO tradition is not without triggering concerns among researchers who have different axiological postures.

Should researchers also be consultants, or remain as far as possible outsiders? Should they be independent observers who remain very cautious about any recipe, but also about any implications or unintended effects that their work could trigger? Bourrier aptly refers to this tension between the two postures: "The HRO literature has continued to grow, evolving from a research topic to a powerful marketing label (...) This was never the intention of the Berkeley researchers" (Bourrier, 2011, p. 12). This, however, never turned into an open debate between researchers, at least not openly in the literature. Of course, I certainly simplify the situation a bit as, first, HRO researchers were close to the managers of the organisations that they were investigating.

One finds mention of a series of workshops held between the Berkeley team and the different managers of the systems studied (Roberts and Rousseau, 1989), and the link with management science also exists early on with Roberts and then Weick, who were both affiliated to management departments. Second, HRO did not stop at and cannot be reduced to "collective mindfulness". More studies have since been published with first-hand empirical data (e.g. Roe and Schulman, 2008).3 However, the status of "collective mindfulness", with its generic and normative side, appealed very much to practitioners, hence its success.

3 For a review of HRO-related literature between 2001 and 2007, see Roberts (2009).

But this practical success (which successfully brought HRO into health care, Sutcliffe (2011)) should not hide the sustained theoretical agenda pursued by its promoters and followers around the central concept of mindfulness (e.g. Weick and Sutcliffe, 2006; Weick and Putman, 2006; Sutcliffe, 2011; Vogus and Sutcliffe, 2012; Vogus et al., 2014; Weick, 2015). Exploring a view of cognition through a different theoretical option than, among others, the once dominant information-processing metaphor (Weick and Sutcliffe, 2012), mindfulness has become one central feature grasping the processes by which one's mediations with the material and social world are consciously reflected upon.

This constitutes a nexus of intellectual investigations into the properties of high reliability where, first, a closer link to the field of organisational behaviour (OB) is established (e.g., Waller and Roberts, 2003; Ramanujam and Rousseau, 2006; Goodman et al., 2011); second, a dialogue between Western and Eastern sensitivities on mindfulness is explored (e.g., Weick and Sutcliffe, 2006; Weick and Putman, 2006); and, third, the notion of mindful organising is advocated in relation to organisational mindfulness and emotion (e.g. Vogus and Sutcliffe, 2012; Vogus et al., 2014).

2.1.3. Summary of HRO history

HRO is a research tradition established in the US in the 1980s on the basis of empirical studies of high-risk systems, at a time when no one had yet observed or conceptualised these issues, apart from the influential 'normal accident' thesis of Charles Perrow, which laid out, based on a retrospective methodology, the background for subsequent heated debates. A cumulative strategy of articulating insights gathered from this tradition (and beyond) led to the model of 'collective mindfulness', formulated first in an article and then in a book offering a generic and normative interface to a wider audience.
This research tradition is rooted in organisational theory and political science. So, HRO has gained strength and legitimacy from (1) studying high-risk systems ethnographically, with an organisational angle, (2) debating the principles producing organisational reliability in the face of high complexity and (3) integrating some of these principles into a generic model of "collective mindfulness", with both practical and theoretical implications. I now turn to the history of Resilience Engineering.

2.2. Resilience engineering

The story of RE is inseparable from cognitive (system) engineering (CE), as much as from human factors and (cognitive) ergonomics (HFE) and system safety engineering (SSE). Many of these fields (human factors, system safety engineering) are several decades old (mid 20th century), but the field of cognitive engineering is about 30 years old. This story is also fairly well known to readers and researchers in the area of safety. Rasmussen, an engineer, is one of its founders throughout the 1980s and 90s, along threads with Reason, Woods and Hollnagel, all psychologists. The core program of this school is to be understood in relation to engineering (or design), risk assessment purposes and the topic of human error. Rasmussen and Reason are the two early leading authors in this respect, followed by Hollnagel and Woods. In the 1960s and 70s, the use of computers increased demands for recommendations about the design of human–machine interfaces. Rasmussen was a pioneer in studying human error and cognition with the prospect of establishing these recommendations (Rasmussen, 1969, 1976; Rasmussen and Jensen, 1974). Reason was also an early researcher on the understanding of errors (Reason and Mycielska, 1982).

2.2.1. Human error: from the 'Old Look' to the 'New Look'

Post TMI (1979), this topic became the centre of attention of an international community of researchers mixing engineers, psychologists and cognitive scientists (a NATO conference in 1982 launched this dynamic; Senders and Moray, 1991). Two things are worth mentioning in the bulk of articles and books published in the 1980s and 90s. First, one can distinguish two orientations: a taxonomic and a naturalistic option (Le Coze, 2015b). The former was established by Reason, and finalised in his seminal book Human Error (Reason, 1990). The principle consists in allocating 'failure modes' by discriminating cognitive processes in relation to types of errors (i.e. slips, lapses, mistakes, violations). One consequence in terms of design and prevention is to try to eliminate errors likely to lead to undesirable consequences.

The latter option is naturalistic because errors are seen as part of an 'ecology of action', namely an adaptive and exploratory side to cognition. In this view, errors are produced but also very often compensated for by operators in real-life situations.4 They are intrinsic to learning and the ability to adapt within specific work constraints. In terms of design recommendations and preventive strategy, interfaces should be flexible in order to provide ample ability for operators or pilots to deal with this adaptive and exploratory side to cognition. Moreover, errors should not necessarily be the target of decontextualised retrospective judgment (i.e. hindsight bias), and not necessarily eliminated, given their adaptive properties.

4 This approach to cognition for the study of errors confronts scholars with methodological and epistemological problems when assessed against the normative background of experimental psychology, which requires controlled settings.

Errors are indeed highly relative in this latter option: "to optimize performance, to develop smooth and efficient skills, it is very important to have opportunities to perform trial and error experiments, and human errors can in a way be considered as unsuccessful experiments with unacceptable consequences. Typically, they are only classified as human errors because they are performed in an 'unkind' environment. An unkind work environment is then defined by the fact that it is not possible for a man to observe and reverse the effects of inappropriate variations in performance before they lead to unacceptable consequences. When the effect of human variability is observable and reversible, the definition of error is related to a reference or norm in terms of the successful outcome of the activity" (Rasmussen, 1982).

Rasmussen et al. (1987), Rasmussen (1990a), Hollnagel and Woods (1983), Hollnagel (1993), Woods (1988), Cook and Woods (1996), Woods and Cook (1999, 2003), Amalberti (1996) and Woods et al. (1994) pursued and refined this deconstruction of the notion of error in the spirit of the naturalistic thread throughout the 1990s. Their contention is that it is as interesting, and probably more efficient, to concentrate on expertise, namely the ability of individuals to cope with complexity, rather than on 'their' errors (Rasmussen and Lind, 1981; Woods, 1988), something that was also coined the 'reliability of cognition' (Hollnagel, 1993).5 Note that it is a perspective that Reason also rallied to later (Reason, 2008).

5 These authors were also actors in the Naturalistic Decision Making (NDM) studies, which were developed at the end of the 1980s and then in the 90s (Klein et al., 1993; Klein, 1997). One now understands that the principle which consists in studying daily situations, as opposed to errors (or incidents or accidents), is not exclusive to HRO but was also advocated by CE (and NDM) in the same period, the 1980s/90s, at a micro, cognitive level.

Applied and theorised from real case studies in the medical, aviation and nuclear fields, this 'New Look' at error, as it has been described (Woods and Cook, 2003), was then successfully deployed in accident investigation contexts (Dekker, 2002, 2004). It is as a consequence essential to add here that these developments have been produced with the prospect of providing practical solutions to engineers in the field of human reliability assessment (HRA) (Hollnagel, 1998), as much as recommendations to designers of interfaces and to professional investigators dealing with the interpretation of 'human errors' (e.g. the aviation domain and pilot/crew errors, or healthcare), hence the notion of cognitive 'engineering'. This proximity with the worlds and issues tackled by engineers (or investigators) is entirely constitutive of the background of this school: producing methods, tools and concepts understandable by practitioners. The implications of this 'New Look' at errors, proposing instead the notion of variability, have therefore been built, argued and advocated in multiple books, book chapters and articles in journals like Human Factors, International Journal of Man-Machine Interaction, Reliability Engineering and System Safety, Safety Science, Cognition, Technology and Work, Ergonomics and Theoretical Issues in Ergonomics, over the past thirty years.

2.2.2. System safety and complexity

The deconstruction of the notion of human error by this network of authors is therefore the first constitutive aspect of the RE movement. The second goes hand in hand with a system view of safety and accidents complementing this critique. Reason, relying on other authors including Turner (1978) and Perrow (1984), argues for the notion of latent errors, contrasting sharp and blunt ends. Accidents are not produced locally by front-line individuals (the sharp end), but by 'errors' made earlier by higher decision makers (the blunt end). The well-known defence-in-depth model (later called the 'Swiss Cheese' model) graphically and successfully helps convey this idea (Reason, 1990), and was later refined (Reason, 1997).

Rasmussen also developed a system (or, one should say, 'complexity') model of safety and accidents by translating his findings on cognition to organisations, proposing the concept of migration coupled with the principle of the 'defence in depth fallacy' (Rasmussen, 1997), which would later be found applied through the notion of 'practical drift' by Snook (Snook, 2000) and 'resonance' by Hollnagel (2004a,b). Anchored in the adaptive and
Table 1
Selection of books and key articles.

1980s

High Reliability Organizations (HRO):
- On the design and management of nearly error-free organizational control systems (La Porte, 1982) – Book chapter
- The self-designing high-reliability organization: aircraft carrier flight operations at sea (Rochlin et al., 1987) – Article
- Culture of high reliability (Weick, 1987) – Article
- New challenges in organisational research: high reliability organizations (Roberts, 1989) – Article
- Mental models of high reliability systems (Weick, 1989) – Article
- Informal organizational networking as a crisis avoidance strategy: US naval flight operations as a case study (Rochlin, 1989) – Article

Cognitive (system) engineering (CE) & Resilience Engineering (RE):
- Coping with complexity (Rasmussen and Lind, 1981) – Article
- Human errors: a taxonomy for describing human malfunction in industrial installations (Rasmussen, 1982) – Article
- Absent-minded? The psychology of mental lapses and everyday errors (Reason and Mycielska, 1982) – Book
- Cognitive systems engineering: new wine in new bottles (Hollnagel and Woods, 1983) – Article
- New Technology and Human Error (Rasmussen et al., 1987) – Book
- Coping with complexity: the psychology of human behaviour in complex systems (Woods, 1988) – Chapter
- Why do complex organizational systems fail? (Rasmussen and Batstone, 1989) – Report

1990s

HRO:
- Working in practice but not in theory: the challenges of "high reliability organizations" (La Porte and Consolini, 1991) – Article
- New challenges in understanding organizations (Roberts, 1993) – Book
- Defining "high reliability" organizations in practice: a taxonomic prologue (Rochlin, 1993) – Chapter
- The Mann Gulch disaster: the collapse of sensemaking (Weick, 1993) – Article
- Heedful interactions (Weick and Roberts, 1993) – Article
- Collective mindfulness (Weick et al., 1999) – Article
- Risk mitigation in virtual organizations (Grabowski and Roberts, 1999) – Article
- Le nucléaire à l'épreuve de l'organisation (Bourrier, 1999) – Book

CE & RE:
- The role of error in organizing behaviour; Human error and the problem of causality in analysis of accidents (Rasmussen, 1990a, 1990b) – Articles
- Human Error (Reason, 1990) – Book
- Behind Human Error: cognitive systems, computers and hindsight (Woods et al., 1994) – Book
- La conduite des systèmes à risques (Amalberti, 1996) – Book
- Risk management in a dynamic society: a modelling problem (Rasmussen, 1997) – Article
- Human Reliability Analysis: Context and Control; Cognitive Reliability and Error Analysis Method: CREAM (Hollnagel, 1993, 1998) – Books
- Managing the Risks of Organisational Accidents (Reason, 1997) – Book

2000s

HRO:
- Organiser la fiabilité (Bourrier, 2001) – Book
- Managing the Unexpected (Weick and Sutcliffe, 2003) – Book
- Mindfulness and the quality of organizational attention (Weick and Sutcliffe, 2006) – Article
- High Reliability Management (Roe and Schulman, 2008) – Book
- Learning from High Reliability Organisations (Hopkins, 2009) – Book
- Information overload revisited (Weick and Sutcliffe, 2012)
- The affective foundations of high-reliability organizing (Vogus et al., 2014) – Article

CE & RE:
- Risk management in a dynamic society (Rasmussen and Svedung, 2000) – Book
- Barriers and prevention; ETTO; FRAM; Safety-I & Safety-II (Hollnagel, 2004a,b, 2009, 2012, 2014) – Books
- Resilience Engineering (Hollnagel et al., 2006) – Book
- Investigating human error; Ten questions about human error; Just Culture; Drift into failure (Dekker, 2002, 2004, 2007, 2011) – Books
- The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries (Reason, 2008) – Book
- Engineering a Safer World (Leveson, 2012) – Book
self-organised properties of complex systems as conceptualised by cybernetics in the 50s and 60s (Ashby, 1956), Rasmussen's model of migration framed for the years to come the underlying assumptions of RE. His sociotechnical view (Rasmussen, 1997) has also been a strong source of inspiration for Leveson, who used it as a graphical support for system safety engineering and the STAMP method (Leveson, 2004, 2012).

So, RE rests on Rasmussen's view of adaptive entities producing self-organised patterns through trade-offs in a space of resources and constraints (within an envelope), something formalised further with the help of the 'complex adaptive system' (CAS) language by Woods (2015), and by Dekker, who relies more on the metaphorical and epistemological side of complexity in relation to accidents (Dekker, 2004, 2011). Complexity ideas are also explored with strong practical purposes by Hollnagel (2004a,b, 2009, 2012, 2014), notably through the FRAM method.

In this respect, it is worth noting that Hollnagel's cycle of producing concepts coupled with the development of tools and methods has been consistent over 30 years of research (Hollnagel, 1993, 1998, 2004a,b, 2009, 2012, 2014), perfectly illustrating the engineering orientation of this school.6 The RE book of 2006 (Hollnagel et al., 2006) and subsequent books (e.g. Hollnagel et al., 2008) are therefore the products of this legacy of human error deconstruction, system safety (complexity) and engineering orientation.

to conceptualise the problem of human error as well as introduce it in the context of producing recommendations for designers of human–computer interfaces and for engineers performing human reliability assessment. It also proved highly relevant to investigators of accidents in various safety-critical contexts (e.g. healthcare, aviation, nuclear) once the implication of the 'hindsight bias' was made explicit in relation to this deconstruction of human error.

The 'New Look' at error, based on a naturalistic view and extended to a system safety/accident approach initiated by the groundbreaking contributions of Reason (1990, 1997) and Rasmussen (1990a, 1997), found many refinements in the 90s and 2000s by authors who maintained the engineering orientation (e.g. producing practical concepts, tools and methods) for a diversity of purposes (designing, assessing, investigating).

These refinements could rely more and more on ideas coming from the field of complexity, whose popularity increased in the 90s and 2000s (e.g. Waldrop, 1992; Lewin, 1992; Mitchell, 2009). So, RE has gained strength and legitimacy from (1) harnessing then deconstructing, empirically and theoretically, the notion of 'human error', (2) arguing for a system (and complexity) view and discourse about safety/accidents, and (3) supporting this view with the help of (graphical) actionable models or methods (i.e. the engineering orientation).
Please cite this article in press as: Le Coze, J.C. Vive la diversité! High Reliability Organisation (HRO) and Resilience Engineering (RE). Safety Sci. (2016),
http://dx.doi.org/10.1016/j.ssci.2016.04.006
6 J.C. Le Coze / Safety Science xxx (2016) xxx–xxx
3.1. Two parallel histories with a fair degree of independence

Coming back to the history of the two schools had the purpose of providing a background and some material for discussion. First, it is interesting to notice that the two schools produced articles, books and chapters in parallel for the past 30 years. Tables 1 and 2 illustrate this. Key articles, book chapters and books by a diversity of authors from different backgrounds were released in different journals to frame the issue of safety along the lines summarised in the sections above.

Table 2
Disciplines & journals (selection).

HRO
- Disciplines: organisational psychology; management science; social psychology; sociology; political science; system approach.
- Journals: Journal of Contingencies and Crisis Management; Journal of Public Administration Research and Theory; Organization Science; Administrative Science Quarterly; California Management Review; Journal of Organizational Behaviour.

Resilience Engineering
- Disciplines: engineering; cognitive (system) engineering; ergonomics; psychology; cybernetics, system & complexity science.
- Journals: Human Factors; Safety Science; Ergonomics; Cognition, Technology and Work; Reliability Engineering and System Safety; Theoretical Issues in Ergonomics; International Journal of Man Machine Interaction.

Thus, when HRO authors were studying aircraft carriers, air traffic control and nuclear power plants in the 80s, CE authors were conceptualising the problem of error and its implications for interface design in nuclear, aviation7 and medical contexts. When HRO authors debated the outcomes of their investigations in relation to the thesis of Normal Accident in the 90s, CE refined the naturalistic side of cognition and expanded safety towards system and complexity8 views while developing tools and practical guidance for a range of actors (risk assessment engineers, designers of interfaces, and investigators of incidents/accidents).

When the HRO tradition produced the model of "collective mindfulness" in the 2000s, cognitive engineers turned to the notion of Resilience Engineering to further emphasise the naturalistic and positive side of the cognition of the front line actors who are behind the production of safety.9 The next table (Table 3) extracts, from the above short histories, the key topics and debates which have been shaping the content and directions of research in the two schools.

Table 3
Selected key topics and debates.

HRO
- Key topics/themes: 'nearly error-free' operations; interdependence, redundancy and slack; training, socialisation & culture; collective mindfulness & sensemaking; flexible (self-adapting) structure & networks; institutional trust.
- Key debates within the tradition: Normal Accident and HRO; status of high-risk systems in relation to organisational theory; descriptive or normative posture of HRO descriptions (covert debate).

Resilience Engineering
- Key topics/themes: human machine (computer) interface; human error (including 'hindsight bias'), reliability of cognition & resilience; situation awareness & expertise (naturalistic decision making); system safety & accident models; adaptation, self-organisation & complexity.
- Key debates within the tradition: taxonomic/naturalist approach to human error; status of the discipline as science or engineering; studying cognition in real life outside experimental psychology.

On the HRO side, topics include 'nearly error-free' operations; interdependence, redundancy and slack; training, socialisation and culture; collective mindfulness, resilience and sensemaking; flexible (self-adapting) structure and networks; institutional trust. On the RE side, one can consider the following: human machine (computer) interface; human error, reliability of cognition, resilience; situation awareness and expertise (naturalistic decision making); system safety (& accident) models; adaptation, self-organisation & complexity. As appears in this list, there are overlapping areas in both traditions, without it being possible to say which one came 'first' or would now be more 'legitimate' than the other. While they express different angles and styles, they also offer corresponding concepts. I now comment on two of these obvious cases.

Table 4
Research traditions' profiles.

HRO
- Main roots: roots in the social sciences, an ethnographic approach and empirical case studies.
- Principal layer of analysis (empirical and conceptual): empirical studies and conceptualisation based on a mix of micro–meso–macro (interactionist, managerial, social and political) orientations.
- Dominant axiological position: driven by a descriptive ambition, with a rather value-neutral perspective (but this remains a covert debate); normative component as a secondary outcome of a consulting market and industry demand.

RE
- Main roots: roots in engineering and an ecological perspective on psychology & cognition.
- Principal layer of analysis (empirical and conceptual): empirical studies and conceptualisation mainly based on a (micro) cognitive orientation (linked to macro layers through systemic/complexity lenses).
- Dominant axiological position: explicitly oriented towards practitioners (engineers, designers, front line operators, managers or investigators); normative posture as an intrinsic feature of the engineering orientation.

7 Note that CRM (Crew/Cockpit Resource Management) is derived from the human factors background of the 80s (Flin, 1996; Flin et al., 2008).
8 Of course, HRO works are based on the system approach which strongly structured the field of organisational theory from the 60s onward (Scott, 2003), but it has been conceptualised, in particular complexity, differently in the two schools.
9 In the field of safety, resilience had been used earlier, for instance by Wildavsky (1988) and Weick (1993). These authors oppose resilience to anticipation, a definition that is not incompatible with RE, but only partly overlapping (see the overview by Woods, 2015). Resilience in RE shifts the focus from the negative side of error to the variability, adaptation, trade-offs and expertise produced in daily activities.

3.1.1. Situation awareness – having the bubble
One pair that comes to mind is "situation awareness" from the developments of the CE school (e.g. Woods et al., 1994) and "having the bubble" from the HRO tradition (Roberts and Rousseau, 1989). In the field of human factors and cognitive engineering, the notion of "situation awareness" is at the heart of concrete practical methods, tools and concepts to train pilots and assess situations against the risk of errors. In this perspective, errors are understood as the products of problems of situation awareness: it characterises the specific moment when an individual fails to interpret adequately the circumstances of complex dynamic environments, provoking unwanted outcomes.

This idea is mirrored by the principle of 'having the bubble' in the HRO tradition (Roberts and Rousseau, 1989), which is derived from observations of real-life situations. By keeping sight of what's going on through a constant updating of the big picture, experienced individuals manage to supervise and steer complex coordinated activities across functions during real-time daily operations. Both "situation awareness" and "having the bubble" relate to the ability to maintain an appropriate picture of
situations to perform safe operations, whether in the case of an aircraft pilot (or crew), or in the case of an aircraft carrier officer on deck in charge of keeping sight of the coordination of tasks among a diversity of individuals. These two are actually grouped under the category of "sensitivity to operations" in Weick et al. (1999) as one process out of five in the model of "collective mindfulness".

3.1.2. Self-organising – self-designing – self-adapting
Another example is the self-organised or self-designing character of high-risk systems, recognised very early on in both traditions as an important feature to grasp. In CE, it derives from an empirical and conceptual investigation at a micro level (Rasmussen and Jensen, 1974; Rasmussen, 1976) turned into a macro one (Rasmussen, 1990a, 1990b); in HRO, it is inferred from fieldwork describing the dynamics of informal networks (Rochlin et al., 1987; Rochlin, 1989). This recognition of the importance of self-organisation is made explicit in the following quote from Rasmussen, bridging the two traditions back in 1989:

"Rochlin characterises the relationships between technology and organisations with respect to complexity, error, and risk against the background of the influential studies of the Berkeley group of the evolution of the high-reliability organisation of an American aircraft carrier. Even if the context is that of a social science study, the notions used to analyse the organisation in evolutionary and 'self-designing' terms often mirrors concepts of cybernetic theories of self-organisation (e.g. Ashby's requisite variety)" (Rasmussen and Batstone, 1989).

A very good example of this today is Roe and Schulman's HRO empirical description of the self-adaptive cognitive dynamics of what they describe as "reliability professionals" (Roe and Schulman, 2008), which was later combined with RE influences by Patterson and Wears (2014). In the two schools, the notion of 'self', whether for 'organised', 'adaptive' or 'designing' systems, can be both positive and negative (in the "collective mindfulness" model, it was originally covered by the notion of 'underspecification of structures', which became 'deference to expertise', Weick et al., 1999): positive when it promotes flexible responses in daily operations and to critical situations (e.g. resilience), negative when it generates unexpected patterns leading to unwanted events (e.g. defence in depth fallacy, drift, resonance).

3.1.3. Exchanges between the two traditions
Sometimes, concepts are also borrowed from one tradition by the other in order to serve an argument. Weick's cognitive approach through constructivist, sensemaking and interactionist lenses appealed to CE and RE authors with an interest in retrospective accounts of events, such as Dekker (2004), who found in Weick support for an alternative to the information-processing view of cognition. In the context of accident investigation, this proved highly relevant (Dekker, 2002, 2004).

Thus, because of this proximity of interests, explicit references between traditions are often found. In this respect, the cognitive side of HRO (with the concept of mindfulness) and RE indeed create strong links between the two traditions (e.g. the indications above about the model of "collective mindfulness"). However, it remains important to stress nuances, even when two cognitive approaches are advocated. In order to do so, it is worth turning to the differences, because, of course, there are also differences.

3.1.4. Differences between the research traditions
For instance, socialisation, culture or institutional trust are topics which are not explored in CE and RE, and human–machine interfaces with recommendations or specifications for designers are not really studied in HRO. This derives from the level of analysis, the purpose as well as the disciplines of origin. This is not without implications. A psychologist or a cognitive scientist is not a sociologist. Their knowledge differs. This is an obvious statement but one with real implications worth pondering. Let's illustrate.

A cognitive psychologist has been socialised through his or her studies and research to tackle a range of topics such as memory, attention, perception, language, intelligence, problem solving or emotion. In contrast, a trained sociologist has studied socialisation, inequalities, organisation, labour relations, family, gender, social stratification, movements or the media. One does not conceptualise and observe brain processes of perception the way one studies labour relations or social movements. Perception, labour relations and social movements are nevertheless all potentially relevant to safety research, with different angles of analysis.

In general, methodological, empirical, theoretical and epistemological issues and backgrounds are interwoven within the paradigms of scientific disciplines, which frame specific objects. As a result, and from an epistemological (or ontological) point of view, some orientations in the cognitive sciences rely for instance on assumptions about human nature, the architecture of the brain and its relationship with biology, assumptions that the social sciences might not share.

For many social scientists, understanding individuals implies certain assumptions regarding the unique aspects of our symbolic world which question the reductionist (e.g. biological) ambitions that some cognitive scientists might advocate to conceptualise the social (as the prefix 'neuro' translates in some recent trends, e.g. 'neurosociology'). However, a vast majority of sociologists have no need of and/or interest in the biology of the brain.

Conversely, a notion like power is not so much part of the conceptual background of cognitive psychologists, although for the social sciences it would be difficult to consider the understanding of organisation or society without explicit reference to this concept.

Thus, despite some overlapping interests, Table 4 characterises what could be called the 'intellectual orientations' of the two traditions, including their main roots, principal layer of analysis and dominant axiological position.

These distinctions between the schools bring the nuances needed to avoid simplifications. HRO is based on the body of knowledge developed in the social sciences whereas RE has a stronger link to engineering and the cognitive sciences. This difference was reflected in the kind of debates triggered in the two traditions over the years (Table 3). In HRO, one issue was to question the relationship with existing organisational theory, whereas for CE one issue was the possibility of studying cognition in a scientific manner without relying on the principles of experimental psychology. But, in my view, this diversity is welcome.

3.2. Vive la diversité!

As introduced in this article, safety is an intriguing research topic because it can be investigated through a range of strategies, whether mono-, multi- or interdisciplinary, whether descriptive or normative (e.g. engineering), etc. RE and HRO are very good illustrations of this: both are rather interdisciplinary but combine different disciplines, both are involved in high-risk systems but with slightly different purposes, both are empirical and conceptual but with distinct intellectual orientations. In this respect, the move from C(S)E to RE in the mid 2000s (Hollnagel et al., 2006) has created ambiguities because it seemed to promote something new, whereas it can also be seen as the continuation of thirty years of work in cognitive (system) engineering based on a solid body of knowledge established through empirical, theoretical and practical research in the field of safety
(Tables 1–3). And, although there are overlapping domains between HRO and RE (e.g. 'situation awareness'/'having the bubble'; 'self-organised properties'), there are also differences. In this final section, I discuss both the conceptual and axiological dimensions of the two schools to illustrate the importance of keeping these nuances in mind. I conclude by indicating current and future trends of hybridisation and interdisciplinarity.

3.2.1. Conceptual diversity
Stressing nuances between traditions is the reason why the title of this article, "Vive la diversité!", explicitly supports the idea that keeping diversity visible is needed because the world is (epistemologically) complex. Anyone engaged in ethnographic work on the daily operations of high-risk systems has something to gain from knowing about the CE and RE contributions on human machine (computer) interfaces; human error, reliability of cognition, resilience; situation awareness and expertise; system safety (& accident) models; adaptation, self-organisation & complexity; AND from HRO on redundancy and slack; training, socialisation and culture; collective mindfulness, resilience and sensemaking; flexible (self-adapting) structure and networks; institutional trust.

They offer a continuum of interest from micro to macro situations to be articulated in order to grasp the complex nature of sociotechnological systems. They introduce a range of disciplines which need to be considered when conceptualising safety. As a researcher, I have personally valued this diversity of perspectives, even on similar topics. It proves heuristically powerful, theoretically inspiring and practically useful. The message is that complementarities exist, but that the diversity of assumptions and preconceptions embedded in the history and foundations of disciplines needs to be kept in mind to avoid the risk of conflating them.

Let's come back now to the cognitive orientations of the two traditions to illustrate this point. It is probably at this level of description that the principle of maintaining the nuances of each school's profile (Table 4) is the most interesting. The deconstruction of "human error" by cognitive psychology and cognitive engineering has provided invaluable material to understand this phenomenon and to interact with industry. Relying on knowledge about cognition as conceptualised in the 80s and 90s through processes of perception, memory, attention and decision making, associated into models based partly on an analogy with computers, researchers could convincingly argue about the naturalness of errors.

It allowed immense improvements in the way individuals (e.g. aircraft pilots, nurses, doctors) perceived their own and others' contributions to incidents and accidents, but also in the way the prevention of errors could be developed.10 This was done within the disciplinary boundaries and orientation of CE and RE as indicated above (Table 4). For instance, distinguishing categories of errors as resulting from automatic or reflexive processes is key to many operational situations, and constitutes important input to crew resource management (CRM) programs, then towards a better appreciation of the reliability of cognition (Hollnagel, 1993) and individuals' expertise (Klein, 1997).

HRO did not operate this deconstruction of the original common sense idea of "human error" so explicitly, but it is strong in other areas, exploring mindfulness, which derives from roots in social psychology and philosophy (Weick, 1979, 1995) other than mainstream cognitive science, with its computer analogy and different assumptions about the mind or decision making. Compare for instance, although they are not incompatible, the two treatments of information overload by Hollnagel (1992) and Weick and Sutcliffe (2012), which express different intellectual sensitivities.

At the same time, by engaging with both the psycho-cognitive and social dimensions of mindfulness, by introducing affects and emotions and comparing Western and Eastern approaches, the mindfulness idea from HRO opens alternative paths for our grasp of operational situations in safety critical organisations.

3.2.2. Axiological diversity
This example also helps illustrate the nuances of axiological postures between HRO and RE. The engineering orientation of CE/RE is in this respect explicitly and well captured in the following quote: "A model that is cumbersome and costly to use will from the very start be at disadvantage, even if it from an academic point of view provides a better explanation. The trick is therefore to find a model that at the same time is so simple that it can be used without engendering problems or requiring too much specialised knowledge, yet powerful enough to go beneath the often deceptive surface descriptions (...) The consequence is rather that we should acknowledge the simplifications that the model brings, and carefully weigh advantages against disadvantages so that a choice of model is made knowingly" (Hollnagel et al., 2006, 245).

It is also worth indicating that, because of this orientation, many ideas from RE do not in fact always strike readers knowledgeable in the safety literature and beyond as very new. RE sometimes seems to reformulate already existing ideas, borrowed without acknowledgment or without clearly relating them to similar existing ones, for reasons that have nothing to do with 'pure' scientific purposes, e.g. ego, networks of researchers (Hopkins, 2014). But, in my opinion, what should be appreciated is not novelty in the academic sense of the term, but instead the ability to be easily understandable to practitioners in order to help change both mindsets and practices.

"This book is intended for practitioners rather than researchers (...) The reason for this partiality is simple – it is the practitioner who can make changes to practice, not the academic. The intention is that the practically minded reader should be able to read the book without constantly consulting the references and even without caring much about them" (Hollnagel, 2004a, xiii). There is also this sentence translating the axiological difference: "resilience engineering is the action program of high reliability organisation" (Dekker and Woods, 2010). It remains, of course, an open (and difficult) question how successful these attempts are.

As a consequence, for a researcher interested in empirical and analytical insights without the constraint of making them appealing and useful to specific practitioners, this is not necessarily important. What matters is the ability to produce knowledge about phenomena with the help of scientific arguments grounded in the mastery of one discipline (e.g. political science) or several, and debated in scientific journals. For instance, to approach meso and macro situations of high-risk systems, the HRO tradition is very appropriate, and linked to disciplines (e.g. sociology of organisation, political science) which help conceptualise these layers of analysis in a way that a cognitive view does not.11

When writing this, it becomes clear that the respective value of the two traditions depends on the problems, level of analysis and purposes of the researcher and, more generally, the users. Developments needed to support the range of practitioners in their daily operations are different from developments needed to conceptualise and compare high-risk systems through in-depth ethnographic studies, although it seems reasonable to think that the two are needed. In this respect, although complementarities between CE/RE and HRO have been indicated in this article, it is also worth not forgetting the disciplinary roots and profiles of each school.

10 It is in this respect similar to Gardner's psychological theory of multiple intelligences, in the sense that it helped tremendously people see themselves differently, and engaged teachers in new directions when it comes to education programs and understanding children's learning abilities (Gardner, 2011).
11 In RE, the move from micro to macro is theorised through complexity discourse, but it is, as far as I know, not always empirically illustrated (e.g. Dekker, 2011).
3.2.3. Hybridisation, interdisciplinarity ... and nuances
If one considers, as introduced in this article, safety to be the product of now globalised, complex sociotechnological systems, then the strategies of companies resulting from managerial decision-making processes, political contexts including regulation, control authorities and civil society activism, organisational structures and the engineering design of technology including choices of automation and interfaces, the real contexts of operators' activities and tasks, but also the contractual relationships between collaborating companies, etc., need to be included in the picture. But these topics are studied by disciplines as diverse as management science, political science, sociology of organisation and technology, cognitive engineering and ergonomics, but also engineering and law, which calls for strategies of interdisciplinarity (Le Coze, 2016).

In fact, as illustrated above with Patterson and Wears (2014) in health care, researchers have already started to combine the two schools discussed in this paper within their own personal developments, sometimes with other influences too. Examples of interdisciplinarity or hybridisation of this sort, combining both RE and HRO with other scientific domains, can be found in Haavik (2010, 2011, 2014) or Le Coze (2013, 2016), who mobilise the field of social studies of science and technology (STS). Future advances and research certainly lie in these kinds of hybridisations. But, if interdisciplinarity is a highly popular research incentive and practice (Barry and Born, 2013), it should not be performed without the ability to reflect strongly on nuances before aggregating insights into unified big pictures.

4. Conclusion

Both RE (with CE) and HRO are post-TMI products, born thirty years ago. The nuclear accident of 1979 created an incentive for different communities of researchers to explore its dimensions through engineering, cognitive, psychological or social sciences, and the relevance of this endeavour was reinforced by a well known series of disasters in the 80s, among which Challenger, Bhopal, Piper Alpha or Chernobyl. Authors with some early interests in related topics in the 70s, such as La Porte about the control of complex social systems

References

Bourrier, M., 1999. Le nucléaire à l'épreuve de l'organisation. Presses Universitaires de France (Nuclear Industry from an Organisational Point of View).
Bourrier, M. (Ed.), 2001. Organiser la fiabilité. L'Harmattan, Paris (Organising Reliability).
Cook, R.I., Woods, D., 1996. Operating at the sharp end: the complexity of human error. In: Bogner, M.S. (Ed.), Human Error in Medicine. CRC Press, pp. 255–310.
Dekker, S., 2002. The Field Guide to Human Error Investigations. Ashgate, Burlington, VT.
Dekker, S., 2004. Ten Questions About Human Error: A New View of Human Factors and System Safety. CRC Press.
Dekker, S., 2007. Just Culture: Balancing Safety and Accountability. Ashgate.
Dekker, S., 2011. Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. Ashgate.
Dekker, S.W.A., Woods, D.D., 2010. The high reliability organization perspective. In: Salas, E., Maurino, D. (Eds.), Human Factors in Aviation, second ed. Wiley, New York, pp. 123–146.
Feyerabend, P., 1975. Against Method. New Left Books, London.
Flin, R.H., 1996. Sitting in the Hot Seat: Leaders and Teams for Critical Incidents. Wiley, Chichester.
Flin, R., O'Connor, P., Crichton, M., 2008. Safety at the Sharp End: A Guide to Non-Technical Skills. Ashgate, Aldershot.
Gardner, H., 2011. Frames of Mind: The Theory of Multiple Intelligences. Basic Books.
Goodman, P.S., Ramanujam, R., Carroll, J.S., Edmondson, A.C., Hofmann, D.A., Sutcliffe, K.M., 2011. Organizational errors: directions for future research. Res. Organ. Behav. 31, 151–176.
Grabowski, M., Roberts, K.H., 1997. Risk mitigation in large-scale systems: lessons from high reliability organizations. California Manage. Rev. 39 (4), 152–161.
Grabowski, M., Roberts, K.H., 1999. Risk mitigation in virtual organizations. Organ. Sci. 10, 704–721.
Haavik, T., 2010. Making drilling operations visible: the role of articulation work for organisational safety. Cogn. Technol. Work 12 (4), 285–295.
Haavik, T.K., 2011. On components and relations in sociotechnical systems. J. Contingencies Crisis Manage. 19, 99–109.
Haavik, T.K., 2014. On the ontology of safety. Saf. Sci. 67, 37–43.
Hale, A.R., 2014. Foundations of safety science, a postscript. Saf. Sci. 67, 64–69.
Hollnagel, E., 1992. Coping, coupling and control: the modelling of muddling through. In: 2nd Interdisciplinary Workshop on Mental Models, March 23–25, 1992, Robinson College, Cambridge, UK.
Hollnagel, E., 1993. Human Reliability Analysis: Context and Control. Academic Press, London.
Hollnagel, E., 1998. Cognitive Reliability and Error Analysis Method (CREAM). Elsevier Science Ltd, Oxford.
Hollnagel, E., 2004a. Barriers and Prevention. Ashgate, Aldershot, UK.
Hollnagel, E., 2004b. Automation and human work. In: Sandom, C.H. (Ed.), Human Factors for Engineers. The Institution of Electrical Engineers, London.
Hollnagel, E., 2009. The ETTO Principle: Why Things That Go Right Sometimes Go Wrong. Ashgate, Farnham, UK.
Hollnagel, E., 2012. FRAM: The Functional Resonance Analysis Method. Modelling Complex Socio-Technical Systems. Ashgate.
Hollnagel, E., 2014. Safety-I and Safety-II: The Past and Future of Safety
tems and Rasmussen on human factors in relation to interface Management. Ashgate, Farnham, UK.
design, played key roles in shaping respectively HRO and CE, then Hollnagel, E., Woods, D.D., 1983. Cognitive systems engineering: new wine in new
RE programs. bottles. Int. J. Man Mach. Stud. 18, 583–600.
Hollnagel, E., Woods, D., Leveson, N., 2006. Resilience Engineering. Concepts and
Both programs, carried out, translated, developed, by research- Precepts. Ashgate.
ers and teams of researchers which created active networks Hollnagel, E., Nemeth, C.P., Dekker, S., 2008. Resilience Engineering: Remaining
became leading contributions in the field of safety in the following Sensitive to the Possibility of Failure. Ashgate.
Hopkins, A. (Ed.), 2009. Learning from High Reliability Organisations. CCH, Sydney,
decades. By coming back, with a certain degree of simplifications NSW.
needed for this article, on the histories of these two research tradi- Hopkins, A., 2014. Issues in Safety Science. Saf. Sci. 67, 6–14.
tions, the purpose is to make more explicit the differing underlying Klein, G., 1997. Sources of Power. How People Make Decisions. The MIT Press,
Cambridge, MA.
intellectual orientations and contributions. By defining key articles Klein, G.A., Orasanu, J., Calderwood, R., Zsambok, C.E. (Eds.), 1993. Decision Making
and books, central debates and topics, but also disciplines, roots, in Action: Models and Methods. Ablex Publishing, Norwood, NJ.
scope and axiological orientation, a profile for each school is estab- La Porte, T.R. (Ed.), 1975. Organized Social Complexity: Challenge to Politics and
Policy. Princeton University Press.
lished. It is advocated that such diversity should be kept in mind, La Porte, T.R., 1982. On the design and management of nearly error-free
maintained because of our complex world but also that overlap- organizational control systems. In: Sills, D.L., Wolf, C.P., Shelanskl, V.B. (Eds.),
ping concepts and complementarities, ‘‘hybridisations” through Accident at Three Mile Island: The Human Dimensions. Westview Press,
Boulder Co., pp. 185–200.
interdisciplinarity, should be favoured for ethnographic works that
La Porte, T.R., Consolini, P., 1991. Working in practice but not in theory: theoretical
attempt to grasp the multidimensional nature of safety in current challenges of high reliability organizations. J. Publ. Admin. Res. Theory 1 (1),
and ever evolving complex sociotechnological systems. 19–47.
La Porte, T.R., Rochlin, G.I., 1996. A rejoinder to Perrow. J. Contingencies Crisis
Manage. 2 (4), 221–227.
References Le Coze, J.C., 2013. New models for new times. An anti dualist move. Saf. Sci. 59,
200–218.
Le Coze, J.C., 2015a. 1984–2014. Normal Accident. Was Charles Perrow right for the
Amalberti, A., 1996. La conduite des systèmes à risques. Presses universitaires de
wrong reasons ? J. Contingencies Crisis Manage. 23 (4), 275–286.
France, Paris.
Le Coze, J.C., 2015b. Reflecting on Jens Rasmussen’s legacy, a strong program for a
Ashby, W.R., 1956. Introduction to Cybernetics. Chapman & Hall, London.
hard problem. Saf. Sci. 71, 123–141.
Atlan, M., Pol Droit, R., 2014. Humain. Une enquête philosophique sur ces
Le Coze, J.C., 2016. Trente and d’accidents. Le nouveau visage des risques
révolutions qui changent nos vies. Champ Essais. Paris, Flammarion. [Human.
sociotechnologiques. Octarès, Toulouse.
A philosophical investigation about these revolutions which change our lives].
Le Coze, J.C., Pettersen, K., Reiman, T., 2014. The foundations of safety. Saf. Sci. 67,
Barry, A., Born, G. (Eds.), 2013. Interdisciplinarity. Reconfigurations of the Social and
1–6.
Natural Sciences. Routledge.
Please cite this article in press as: Le Coze, J.C. Vive la diversité! High Reliability Organisation (HRO) and Resilience Engineering (RE). Safety Sci. (2016),
http://dx.doi.org/10.1016/j.ssci.2016.04.006
Leveson, N., 2004. A new accident model for engineering safer systems. Saf. Sci., 237–270.
Leveson, N., 2012. Engineering a Safer World: Systems Thinking Applied to Safety. MIT Press.
Lewin, R., 1992. Complexity: Life at the Edge of Chaos. University of Chicago Press.
Mitchell, M., 2009. Complexity. A Guided Tour. Oxford University Press, Oxford.
Patterson, M.D., Wears, R.L., 2014. Resilience and precarious success. Reliab. Eng. Syst. Saf., in press. http://dx.doi.org/10.1016/j.ress.2015.03.014.
Perrow, C., 1970. Organizational Analysis. A Sociological View. Tavistock Publications.
Perrow, C., 1984. Normal Accidents. Living with High-Risk Technologies. Basic Books, New York.
Perrow, C., 1986. Complex Organizations. A Critical Essay, third ed. McGraw-Hill.
Perrow, C., 1994. The limits of safety: the enhancement of a theory of accidents. J. Contingencies Crisis Manage. 2, 212–220.
Ramanujam, R., Rousseau, D.M., 2006. The challenges are organizational, not just clinical. J. Organ. Behav. 27 (7), 811–827.
Rasmussen, J., 1969. Man-machine communication in the light of accident records. In: Presented at the International Symposium on Man-Machine Systems, Cambridge, September 8–12. IEEE Conference Records, 69C58-MMS, vol. 3.
Rasmussen, J., 1976. Outlines of a hybrid model of the process operator. In: Sheridan, T., Johannsen, G. (Eds.), Monitoring Behaviour and Supervisory Control. Plenum Press, New York.
Rasmussen, J., 1982. Human errors: a taxonomy for describing human malfunction in industrial installations. J. Occup. Accid. 3, 311–333.
Rasmussen, J., 1990a. The role of error in organizing behavior. Ergonomics 33, 1185–1199.
Rasmussen, J., 1990b. Human error and the problem of causality in analysis of accidents. Philos. Trans. Roy. Soc. Lond. B 327, 449–462.
Rasmussen, J., 1997. Risk management in a dynamic society: a modelling problem. Saf. Sci. 27 (2/3), 183–213.
Rasmussen, J., Batstone, R., 1989. Why do complex organizational systems fail? The World Bank Policy Planning and Research Staff, Environment Working Paper No. 20.
Rasmussen, J., Jensen, A., 1974. Mental procedures in real life tasks: a case study of electronic trouble shooting. Ergonomics 17 (3), 293–307.
Rasmussen, J., Lind, M., 1981. Coping with Complexity. Risø Report. Risø National Laboratory, Roskilde.
Rasmussen, J., Duncan, K., Leplat, J., 1987. New technology and human error. In: Goodstein, L.P., Andersen, H.B., Olsen, S.E. (Eds.), Tasks, Errors, and Mental Models. Taylor and Francis, London.
Reason, J., 1990. Human Error. Cambridge University Press, New York.
Reason, J., 1997. Managing the Risks of Organizational Accidents. Ashgate.
Reason, J., 1998. Commentary. Broadening the cognitive engineering horizons: more engineering, less cognition and no philosophy of science, please. Ergonomics 41 (2), 150–152.
Reason, J., 2008. The Human Contribution. Accidents, Unsafe Acts and Heroic Recoveries. Ashgate.
Reason, J., Mycielska, K., 1982. Absent-Minded? The Psychology of Mental Lapses and Everyday Errors. Prentice Hall, Englewood Cliffs, NJ.
Roberts, K.H., 1989. New challenges in organisational research: high reliability organizations. Ind. Crisis Q. 3, 111–125.
Roberts, K.H., 1990. Some characteristics of one type of high reliability organization. Organ. Sci. 1 (2), 160–176.
Roberts, K. (Ed.), 1993. New Challenges to Understanding Organizations. Macmillan Publishing Company, New York, NY.
Roberts, K.H., 2009. Book review essay: managing the unexpected: six years of HRO literature reviewed. J. Contingencies Crisis Manage. 17 (1).
Roberts, K.H., Rousseau, D.M., 1989. Research in nearly failure-free, high-reliability organizations: having the bubble. IEEE Trans. Eng. Manage. 36 (2).
Roberts, K.H., Hulin, C.L., Rousseau, D.M., 1978. Developing an Interdisciplinary Science of Organizations. Jossey-Bass, San Francisco.
Rochlin, G.I. (Ed.), 1974. Scientific Technology and Social Change. Readings from 'Scientific American'. W.H. Freeman, San Francisco.
Rochlin, G.I., 1989. Informal organizational networking as a crisis-avoidance strategy: US naval flight operations as a case study. Ind. Crisis Q. 3, 159–176.
Rochlin, G.I., 1993. Defining 'high reliability' organizations in practice: a taxonomic prologue. In: Roberts, K. (Ed.), New Challenges to Understanding Organizations. Macmillan Publishing Company, New York, NY.
Rochlin, G.I., 1996. Reliable organisations: present research and future directions. J. Contingencies Crisis Manage. 4.
Rochlin, G., 1999. Safe operation as a social construct. Ergonomics 42 (11), 1549–1560.
Rochlin, G.I., La Porte, T.R., Roberts, K.H., 1987. The self-designing high-reliability organization: aircraft carrier flight operations at sea. Naval War College Rev. 40 (4), 76–90.
Roe, E., Schulman, P., 2008. High Reliability Management. Stanford University Press.
Sagan, S., 1993. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton University Press, Princeton, NJ.
Schulman, P.R., 1980. Large-Scale Policy-Making. Elsevier North-Holland, New York.
Schulman, P.R., 1993a. The negotiated order of organizational reliability. Admin. Soc. 25 (3), 353–372.
Schulman, P.R., 1993b. The analysis of high reliability organizations. In: Roberts, K. (Ed.), New Challenges to Understanding Organizations. Macmillan Publishing Company, New York, NY.
Scott, W.R., 1981. Organizations: Rational, Natural and Open Systems. Prentice Hall, Upper Saddle River, NJ.
Scott, W.R., 2003. Organizations: Rational, Natural and Open Systems, fifth ed. Prentice Hall, Upper Saddle River, NJ.
Senders, J., Moray, N., 1991. Human Error: Cause, Prediction, and Reduction. Lawrence Erlbaum Associates.
Snook, S.A., 2000. Friendly Fire: The Accidental Shootdown of US Black Hawks over Northern Iraq. Princeton University Press.
Sutcliffe, K.M., 2011. High reliability organizations (HROs). Best Pract. Res. Clin. Anaesthesiol. 25, 133–144.
Turner, B., 1978. Man-Made Disasters. Wykeham Publications, London.
Vogus, T., Sutcliffe, K.M., 2012. Organizational mindfulness and mindful organizing: a reconciliation and path forward. Acad. Manage. Learn. Educ. 11 (4), 722–735.
Vogus, Rothman, Sutcliffe, Weick, 2014. The affective foundations of high-reliability organizing.
Waldrop, M., 1992. Complexity: The Emerging Science at the Edge of Order and Chaos. Simon & Schuster, New York.
Waller, M.J., Roberts, K.H., 2003. High reliability and organizational behavior: finally the twain must meet. J. Organ. Behav. 24, 813–814.
Weick, K., 1979. The Social Psychology of Organizing. Addison-Wesley Publishing Company, Reading, Massachusetts.
Weick, K., 1987. Organizational culture as a source of high reliability. California Manage. Rev. 29 (2), 112–127.
Weick, K.E., 1989. Mental models of high reliability systems. Ind. Crisis Q. 3, 127–142.
Weick, K.E., 1990. The vulnerable system: an analysis of the Tenerife air disaster. J. Manage. 16 (3), 571–593.
Weick, K., 1993. The collapse of sensemaking in organizations. Adm. Sci. Q. 38, 628–652.
Weick, K., 1995. Sensemaking in Organizations. Sage, London.
Weick, K., 2015. Ambiguity as grasp: the reworking of senses. J. Contingencies Crisis Manage. 23 (2), 117–123.
Weick, K.E., Putman, T., 2006. Organizing for mindfulness: eastern wisdom and western knowledge. J. Manage. Inq. 15 (3), 275–287.
Weick, K.E., Roberts, K.H., 1993. Collective mind in organizations: heedful interrelating on flight decks. Admin. Sci. Q. 38 (3), 357–381.
Weick, K., Sutcliffe, K., 2003. Managing the Unexpected. Jossey-Bass, San Francisco.
Weick, K., Sutcliffe, K., 2006. Mindfulness and the quality of organizational attention. Organ. Sci. 17 (4), 514–524.
Weick, K., Sutcliffe, K., 2012. Information overload revisited. In: Hodgkinson, G.P., Starbuck, W.H. (Eds.), The Oxford Handbook of Organizational Decision Making. Oxford University Press, Oxford.
Weick, K., Sutcliffe, K.M., Obstfeld, D., 1999. Organizing for high reliability: processes of collective mindfulness. Res. Organ. Behav. 21, 81–123.
Wildavsky, A., 1988. Searching for Safety. Transaction Books, New Brunswick, NJ.
Woods, D., 1988. Coping with complexity: the psychology of human behavior in complex systems. In: Goodstein, L.P., Andersen, H.B., Olsen, S.E. (Eds.), Mental Models, Tasks and Errors. Taylor & Francis, London, pp. 128–148.
Woods, D., 2015. Four concepts for resilience and the implications for the future of resilience engineering. Reliab. Eng. Syst. Saf., in press. http://dx.doi.org/10.1016/j.ress.2015.03.018.
Woods, D., Cook, R.I., 1999. Perspective on human error: hindsight biases and local rationality. In: Durso, F. (Ed.), Handbook of Applied Cognition. Wiley, pp. 141–171.
Woods, D., Cook, R.I., 2003. Mistaking error. In: Hatlie, M.J., Youngberg (Eds.), Patient Safety Handbook. Jones and Bartlett.
Woods, D., Johannesen, L., Cook, R.I., Sarter, N., 1994. Behind Human Error: Cognitive Systems, Computers and Hindsight. Crew System Ergonomics, Wright-Patterson AFB, OH.

Further reading

La Porte, T.R., 1996. High reliability organizations: unlikely, demanding, and at risk. J. Contingencies Crisis Manage. 2 (4), 60–71.