
COGNITIVE SYSTEMS ENGINEERING LABORATORY (CSE LAB)

Time to think and time to do?
I can fail, and so can you!

Erik Hollnagel

Dept. of Computer and Information Science
University of Linköping
SE-581 83 Linköping, Sweden
E-mail: eriho@ida.liu.se

Dedale
15, Place de la Nation
F-75011 Paris, France
E-mail: ehollnagel@dedale.net

© Erik Hollnagel, 2003

What is this thing called "error"?

"To err is human, to forgive divine."
(Alexander Pope, 1688-1744, An Essay on Criticism, 1711)

"It is one thing to show people they are in an error, and another to put them in possession of truth."
(John Locke, 1632-1704, An Essay Concerning Human Understanding, Bk. IV, Ch. 7)

When things go wrong …

A rush for explanations:
- Human "causes"
- Technical failure
- Organisational failure
- "Act of god"

Why do we look for "errors"?

Assumption: the source of error is the human factor.
- Analyse to find where a person is involved.
- Stop analysis when one is found.
- "Safe bet" - all systems involve humans somewhere.

Biases that sustain this:
- Fundamental attribution error: actions are the result of dispositions.
- Illusion of free will: people are free to choose their actions.
- Magnitude bias: causes and consequences are of similar size.
"Human error" – or what?

Attributed cause vs. alternative description:
- Human error in setting pumps → Error in setting pumps induced by design – no feedback.
- Accidental tubing disconnections → Spontaneous tubing disconnections.
- Confusion between central and peripheral lines → No means to visually differentiate between central and peripheral lines.

4% attributed errors due to device use.
(Infusion Pump and Parenteral Delivery Problems; Harvard Adverse Drug Event Study, Leape et al., 1995)

Alignment of factors → Error provoking condition → Incident

"… error is the result of an alignment of conditions and occurrences each of which is necessary, but none alone sufficient to cause the error."
(Bogner, 1998)

From reasoning to actions

"In all demonstrative sciences the rules are certain and infallible; but when we apply them, our fallible and uncertain faculties are very apt to depart from them, and fall into error."
(David Hume, 1711-1776, A Treatise of Human Nature, Part IV, Section I)

Domino Theory of Accidents (Heinrich, 1931):
- Industrial injuries result only from accidents.
- Accidents are caused directly only by (a) the unsafe acts of persons or (b) exposure to unsafe mechanical conditions.
- Unsafe actions and conditions are caused only by faults of persons.
- Faults of persons are created by environment or acquired by inheritance.
The Domino theory - outside view

[Diagram: five dominoes in sequence - Ancestry & social environment → Fault of person → Unsafe act / mechanical & physical hazard → Accident → Injury]

The Domino theory - inside view

[Diagram only]
When things go right!

[Chart: major and serious accidents per million hours flown, 1983-2001]

Aviation: major accidents per million hours flown.
"Medical error": deaths 44,000; admissions 36,500,000.
Road traffic: accident rate 0.80 per million vehicle-miles travelled per year (freeways); 2.9 per million vehicle-miles travelled per year (two-lane highways).

Hypotheses about accidents and actions

Hypothesis #1: Actions leading to failure produce accidents; actions leading to success produce normal actions. The study of failures cannot benefit from the study of successes.

Hypothesis #2: Actions in general produce both accidents and normal actions. The study of failures must be based on the study of successes.

"Knowledge and error flow from the same mental sources, only success can tell one from the other."
(Ernst Mach, 1838-1916)
What should we be looking for?

But performance variations can be positive as well as negative!

[Diagram: performance variability over time. Positive outcomes: Shortcut, Smart move, Improvement, Invention. Negative outcomes: Unsafe act, Near miss, Incident, Accident. The "Why?" questions are asked only about the negative outcomes.]

Human factors has tended to look for negative aspects of performance - deviations or "errors".

Why things work!

Optimistic assumption: Things go right because
- systems are well designed and scrupulously maintained,
- procedures are complete and correct,
- people behave as they are expected to,
- designers can foresee and anticipate every contingency.
Systems can be improved by restraining human variability.

Realistic assumption: Things go right because people
- learn to overcome design flaws and functional glitches,
- interpret and apply procedures to match conditions,
- adapt their performance to meet demands,
- can detect and correct when things go wrong.
Systems can be improved by accommodating human variability.

TTT - things take time

Time needed = time to think (time to evaluate event + time to select action) + time to do.
The time needed must fit within the time available.

- Need to do something! (Intention)
- Time when it can be done: earliest starting time, earliest finishing time.
- Time when it must be done: latest starting time, latest finishing time.
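The timing window on the "things take time" slide can be sketched as simple arithmetic. The code below is an illustration only, not from the original slides; the function names, variable names, and the example numbers are all my own assumptions.

```python
# Sketch of the "things take time" window (illustrative only; the
# numbers and names are assumptions, not from Hollnagel's slides).

def time_needed(time_to_evaluate, time_to_select, time_to_do):
    """Time needed = time to think (evaluate + select) + time to do."""
    return time_to_evaluate + time_to_select + time_to_do

def latest_starting_time(latest_finishing_time, needed):
    """The action must start early enough to finish by the deadline."""
    return latest_finishing_time - needed

# Example: an intention arises at t=0 and the action must be
# finished by t=10 (time when it *must* be done).
earliest_start = 0.0
latest_finish = 10.0
needed = time_needed(time_to_evaluate=3.0, time_to_select=2.0, time_to_do=4.0)

# Feasible only if the latest allowed start has not already passed.
feasible = latest_starting_time(latest_finish, needed) >= earliest_start
print(needed, feasible)  # 9.0 True: 9 units of work fit in a 10-unit window
```

Shrinking the window (or growing any of the three components) flips `feasible` to `False`, which is the slide's point: thinking and doing both consume the same limited time.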

Everything happens in time

TE = time to evaluate event (context dependent); TS = time to select action; TA = available time; TD = feedback time delay; TP = estimated performance time.

If (TE + TS) exceeds TA, then the operator will lag behind the event / process and may gradually lose control.

If (TE + TS) is less than TA, then the operator will be able to e.g. refine the current understanding.

Level of control will vary depending on performance conditions.
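The control condition on this slide (whether TE + TS exceeds TA) can be expressed as a tiny simulation. This is a sketch under assumed numbers: TE, TS, and TA are the slide's quantities, but the functions, the lag-accumulation rule, and the example events are my own illustration.

```python
# Sketch of the control condition: if (TE + TS) > TA the operator
# falls behind the process; if (TE + TS) < TA there is slack to
# refine the current understanding. All numbers are illustrative.

def control_margin(t_evaluate, t_select, t_available):
    """Positive margin: time left to think ahead. Negative: falling behind."""
    return t_available - (t_evaluate + t_select)

def lag_after_events(events):
    """Accumulate lag over a sequence of (TE, TS, TA) tuples.

    Simplification: lag only grows; slack in one event does not
    undo understanding already postponed in earlier events.
    """
    lag = 0.0
    for t_e, t_s, t_a in events:
        margin = control_margin(t_e, t_s, t_a)
        if margin < 0:
            lag += -margin  # the operator falls further behind
    return lag

# Three events; only the second demands more thinking time than is available.
events = [(1.0, 0.5, 2.0), (2.0, 1.5, 3.0), (1.0, 1.0, 2.5)]
print(lag_after_events(events))  # 0.5: only the second event exceeds TA
```

Under sustained negative margins the lag grows without bound, which is the slide's "gradually lose control"; a single tight event merely leaves less room to refine the current understanding.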

Working to rule - design assumptions

- Demands and resources are compatible.
- System input is regular and predictable.
- Other people behave as prescribed.
- Working conditions fall within normal limits.
- Output (actions) will comply with norms.

… no need to make adjustments.

… but in reality

- Demands vary and resources may be inadequate.
- System input may be irregular and unpredictable.
- Other people behave egocentrically.
- Working conditions may at times be sub-optimal.
- Output (actions) will vary considerably.

… necessary to make local adjustments: the Efficiency-Thoroughness Trade-Off (ETTO).

ETTO: Efficiency-Thoroughness Trade-Off

Thoroughness pulls one way (rules, good practice, experience; mandatory checklists, procedures); efficiency pulls the other (conflicting demands, incomplete information, time pressure).

People invariably make performance adjustments, which are seen as effective and "intelligent". Deviations are normally detected and recovered in time. Successful adjustments are used even when they should not be. In hindsight, this is called "error".

Some ETTO heuristics

Individual (cognitive):
- Judgment under uncertainty
- Cognitive primitives (SM – FG)

Individual (work related):
- Looks fine
- Not really important
- Normally OK, no need to check
- Will be checked by someone else
- Has been checked by someone else
- Can't remember how to do it
- No time - no resources - do it later
- Worked last time

Organisational:
- Negative reporting
- Reduce redundancy
- Double-bind

Herald of Free Enterprise

Modern Ro-Ro passenger/vehicle ferry with two main vehicle decks. At Dover and Calais double-deck ramps connected to the ferry. Zeebrugge only had a single-level access ramp which could not quite reach the upper vehicle deck. Water ballast was therefore pumped into the bow tanks to facilitate loading.

- When leaving Zeebrugge on March 6, 1987, not all water had been pumped out of the ballast tanks, causing her to be some 3 feet down at the bow.
- The assistant bosun, who was directly responsible for closing the doors, was asleep in his cabin, having just been relieved from maintenance and cleaning duties.
- The bosun did not see door closing as part of his duties.
- The captain apparently assumed that the doors were safely closed unless told otherwise.
- The chief officer, responsible for ensuring door closure, testified he thought he saw the assistant bosun going to close the door.
- The Herald had clamshell doors which opened and closed horizontally. This made it impossible for the ship's master to see from the bridge if the doors were closed.

The Herald backed out of the berth stern first. As the ship rapidly accelerated to 22 knots service speed, a bow wave began to build up under her prow. At 15 knots, with the bow down 3 feet lower than normal, water began to break over the main car deck through the open doors.
Why do actions sometimes fail?

- Inefficient or deficient organisation
- Inadequate training and experience
- Inefficient crew collaboration
- Lack of adequate procedures / plans
- Shortage of resources (both human and technological)
- Lack of adjustment to time of day (circadian rhythm)
- Incompatible working conditions
- Inappropriate HMI and operational support
- Inefficient communication
- Too many simultaneous goals and too little available time

Blunt end? Sharp end?
Efficiency-Thoroughness Trade-Off

[Diagram: pressures at every level - government, authority, company, management, workplace, activity - combine with conflicting demands, incomplete information, time pressure, and morals / social norms.]

Typical shortcuts:
- No time - no resources - will do it later
- Has been checked by someone else
- Will be checked by someone else
- Normally OK, no need to check
- Can't remember how to do it
- Not really important
- Worked last time
- Looks fine

Sources of success

- On the level of individual human performance, local optimisation (shortcuts, heuristics, and expectation-driven actions) is the norm rather than the exception.
- Normal performance is not what is prescribed by rules and regulations but rather what takes place as a result of the adjustments (the equilibrium that reflects the regularity of the work environment).
- It is therefore a mistake to look for the cause of failures in the normal actions since they, by definition, are not wrong.
- Normal actions are successful because people adjust to the local conditions and correctly anticipate the developments.
- Failures occur when this adjustment goes awry, but both the actions and the principles of adjustment are correct.
Four postulates

1. Both normal performance and failure are emergent phenomena, and neither can be attributed to or explained by specific components or parts.
2. When the outcome of actions differs from what was intended / required, it is due to variability of context and conditions rather than failures of actions.
3. The adaptability and flexibility of human work is the reason for its efficiency. At the same time it is also the reason for the failures that occur, although it is rarely the cause of the failures.
4. People are expected to be both efficient and thorough at the same time – or rather to be thorough when, with hindsight, it was wrong to be efficient.

The Road to Wisdom

"The road to wisdom?
Well, it's plain and simple to express:
Err
and err
and err again
but less
and less
and less."
(Piet Hein, Grooks, 1966)
Counterfactual reasoning

[Diagram: a sequence of branch points leading to the actual outcome. At each missed branch the investigator asks "Why didn't they do A?", "Why didn't they do B?", pointing at possible outcomes 1 and 2.]

Going back through a sequence, investigators often wonder why opportunities to avoid the bad outcome were missed. This, however, does not explain the failure.

The Devil's Dictionary

LOGIC, n. The art of thinking and reasoning in strict accordance with the limitations and incapacities of the human misunderstanding.

PROOF, n. Evidence having a shade more of plausibility than of unlikelihood. The testimony of two credible witnesses as opposed to that of only one.

RATIONAL, adj. Devoid of all delusions save those of observation, experience and reflection.

REASON, v.i. To weigh probabilities in the scale of desire.

Freudian slip

I was watching Bush's address to the teachers of the nation on NBC with some friends, and this error was GLARING. He said, and I quote, "First I'd like to spank all the teachers..." There was a short pause with a definite facial expression change on his part as he realized his mistake, and my friends and I all glanced at each other, laughing. Then he continued his speech.

We can build reliable systems

An airplane as a closed technical system, even including the pilot, is highly reliable as long as the boundaries are well-defined and the disturbances are not unexpected.

When it becomes part of the environment (transportation system), reliability goes down. There are too many constraints, influences, and disturbances: weather, ATC, schedules, regulations, maintenance, routes, gate capacity.
Needed: A model of normal performance

We should understand how things go right (successes) as well as how they go wrong (failures).

[Diagram: performance variability in CONTEXT over time. Positive outcomes: Shortcut, Smart move, Improvement, Invention. Negative outcomes: Unsafe act, Near miss, Incident, Accident.]

Risk analysis must be based on a model of normal performance, and not just on a model of "error".

Context is the main determinant of normal performance, and therefore also of action failures.

Actions and "errors"

1. The action is chosen based on the event history and the current situation.
2. If the action leads to the expected outcome, then it is considered a correct action.
3. If the action leads to an unexpected outcome, then it is classified as an "error".
4. In hindsight, the alternative "correct" action is identified.

The difference between correct and incorrect outcomes may be vague rather than crisp.
What is a "human error"?

- Correctly performed actions: actual outcomes = intended outcomes.
- Failures (actual outcomes ≠ intended outcomes):
  - Failure detected and recovered
  - Failure detected but tolerated
  - Failure detected but not recovered → immediate effects
  - Failure not detected → latent effects

"Error" rate, detection, and performance
(Wioland & Amalberti, 1994)

[Chart: "error" rate per hour (left axis, 2-14) and "error" recovery rate (right axis, 40%-100%) across performance conditions: Relaxed (inattentive), Standard performance, Maximum performance, Loss of control.]