Early DFIC discussions about role of human factors in dam failures and
incidents
Broad research into role of human factors in failure and safety, drawing on
diverse fields
Aviation, health care, nuclear power, motorsports
Management, social sciences, philosophy
Work by others in the dam safety community
Contributors to Failure
Contributors to Safety
Conclusions
These steps form timelines which may precede failure by years or even decades
Linear sequential narrative timelines are easier, but interactions between factors
may be complex
Nonlinear relationships
Feedback loops
Causes having multiple effects
Effects having multiple causes
Root causes or dominant contributing factors may not be readily identifiable
So humans are both the problem (error) and solution (success), two
sides of the same coin
Functional goals
Water supply, irrigation, flood control, hydropower, recreation
Safety is more a constraint than a goal
Schedule pressure
Personal agendas
Political pressure
Competition
Category / Examples
Bounded rationality: Finite cognitive processing capacity; satisficing
Misperception: Not seeing soil particles in seepage, misclassification of soil or rock
IGNORANCE/UNCERTAINTY (incomplete or inaccurate information & knowledge): Inadequate subsurface investigation; immature engineering state of the art; insufficient relevant experience
Misapplied heuristics: Use of an engineering rule of thumb outside its traditional context
Unreliable intuition: Atypical, unprecedented, or complex design situation
Inaccurate memory: Misremembering or forgetting to document an inspection observation
Fatigue effects: Long work shifts, compressed schedules
Emotional effects: Apathy, indifference, frustration, pride
INACCURATE MODELS: Significant 3D behavior missed by using a 2D model
COGNITIVE BIASES: Dunning-Kruger effect (very unskilled individuals greatly overestimate their ability, highly skilled individuals somewhat underestimate their ability)
Category / Description
Unknown knowns: Known, but we are not aware that we know it (e.g., intuition and tacit knowledge)
Denied knowns: Known to some degree, but we repress it for psychological or social reasons
Taboo unknowns: Potentially knowable, but not explored because of psychological or social reasons
Known unknowns: Partly unknown, but the uncertainty can be modeled (e.g., probability distributions; see the sketch below)
Unknown unknowns: Unknown and entirely missing from our models (e.g., an unknown dam defect)
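To make the "known unknowns" row concrete, the following is a minimal Python sketch (not from the source) of representing one uncertain quantity as a probability distribution and propagating it through a simple model; the choice of hydraulic conductivity, the lognormal parameters, and the fixed gradient are hypothetical assumptions for illustration only.

    # Illustrative sketch: a "known unknown" modeled as a probability distribution.
    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Hypothetical known unknown: hydraulic conductivity k (m/s), uncertain but modelable.
    k_samples = rng.lognormal(mean=np.log(1e-7), sigma=1.0, size=100_000)

    # Crude illustrative model: seepage velocity under an assumed hydraulic gradient.
    hydraulic_gradient = 0.5
    seepage_velocity = k_samples * hydraulic_gradient

    # Report the propagated uncertainty rather than a single deterministic number.
    print("median seepage velocity (m/s):", np.median(seepage_velocity))
    print("95th percentile (m/s):", np.percentile(seepage_velocity, 95))

By contrast, an unknown unknown (such as an undetected defect) is absent from the model entirely, so no amount of sampling will surface it.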
INACCURATE MODELS
Necessity of Models
Our interactions with the world are always mediated by models
Models may be subconscious, conceptual, mathematical, computational, or physical
Difficulty in Validating Models
Computational models have a black box aspect, which inhibits intuition and checking
Safety factors can prevent models from truly being tested
Subjectivity and Biases in Modeling
Models are developed by people, for various goals, and thus have a subjective aspect
Modeling is subject to various cognitive biases
Uncertainty of Models
All models are incomplete and inaccurate, to varying and usually unknown degrees
Model inaccuracy may be both qualitative and quantitative
Biases result from subconscious cognitive processes which systematically distort thinking
relative to reality
Biases often have detrimental effects on decision-making, but may be beneficial in some
contexts
Recency: Giving greater weight to recent events over prior events, without justification. Example: Using a model because it was used recently, despite overall experience suggesting a different model.
Source: "Dams as Systems: A Holistic Approach to Dam Safety" by Pat Regan (2010)
HUMAN ERROR
Humans must adaptively cope with diverse, complex, and uncertain
situations with conflicting goals and time pressure
Error is usually judged based on outcomes, after the fact
The same action may have a good or bad outcome, depending on circumstances beyond someone's control
Good actions may have bad outcomes, and bad actions may have good outcomes
We arguably have a degree of free will but it can never be entirely free, due
to external and subconscious influences beyond our control and possibly
beyond our awareness
So it's tricky to (a) determine what behavior is reasonable versus negligent, and (b) assign blame and liability; pragmatically, focus on desired outcomes
Fundamental attribution bias: a bad outcome of others is due to them, whereas your bad outcome is due to the situation
Action as planned, but inadvertent error in thinking:
Rule-based mistake: Misapplied a good rule, or applied a bad rule
Knowledge-based mistake: Inaccurate knowledge or judgment
Caution: Successful track records may foster ignorance, complacency, and overconfidence!
SAFETY CULTURE
Safety culture typically leads to implementing best practices, and is typical in dam
engineering
Maxims
Customization to project sites, including scenario planning during design and testing/adaptation during construction (observational method)
Progressive and controllable failure modes which produce warning signs
Accurate hazard classification and good emergency action planning
Open and effective information sharing, including allowing dissent and documenting thoroughly, to connect the dots among dispersed and fragmentary information
SAFETY-ORIENTED PERSONNEL SELECTION
DIVERSE TEAMS, but with leadership, continuity, and avoiding diffusion of responsibility
Appropriate failure modes, including operational failure modes and failure modes in the proximity of dam sites
Professional, ethical, and legal/regulatory standards
Learning from failures and incidents (www.damfailures.org, Decade Dam Failures)
WARNING SIGNS: VIGILANT MONITORING, THOROUGH INVESTIGATION, AND EFFECTIVE RESPONSE
DIVERSE TEAMS
Desired type of diversity is cognitive diversity, which brings in diversity of
perspectives, education, training, experience, information, knowledge, models,
skills, problem-solving methods, heuristics, biases, etc.
Diversity trumps ability: for difficult problems, a team of diverse people with relevant abilities will often outperform a homogeneous team of the best people, since a diverse team covers more bases (additivity), provides checks and balances, and may also have synergy (superadditivity)
Checklists are not fun to use, but they foster discipline and vigilance
PFMA and risk analysis focus on what may happen in the future, whereas
failure investigation focuses on what likely happened in the past
Aside from the time dimension, they have much in common: gathering and
weighing information, performing mechanistic analysis, formulating hypotheses
for failure scenarios, evaluating the hypotheses, and dealing with uncertainty by subjectively estimating the likelihood of the hypotheses (a brief illustrative sketch follows below)
PFMA/risk analysis can be applied as a best practice during dam design to help ensure that all failure modes have been adequately addressed, leaving fewer unknown unknowns
Specialists in PFMA/risk analysis and failure investigation may consider working in
both fields
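As a purely illustrative sketch (not from the source) of the subjective likelihood estimation mentioned above, the following Python snippet assigns judgment-based prior weights to hypothetical failure-scenario hypotheses and revises them in light of a single observation using a simple Bayes-style renormalization; every hypothesis name, weight, and likelihood value is an assumption invented for illustration.

    # Hypothetical failure-scenario hypotheses with subjective prior weights (sum to 1).
    prior = {
        "internal erosion along conduit": 0.40,
        "piping through embankment core": 0.35,
        "foundation seepage": 0.25,
    }

    # Subjective judgment of how likely a new observation (e.g., turbid seepage)
    # would be under each hypothesis.
    likelihood = {
        "internal erosion along conduit": 0.70,
        "piping through embankment core": 0.50,
        "foundation seepage": 0.10,
    }

    # Bayes-style update: weight each prior by the likelihood, then renormalize.
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    posterior = {h: w / total for h, w in unnormalized.items()}

    for hypothesis, weight in posterior.items():
        print(f"{hypothesis}: {weight:.2f}")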
Challenge: How do we explicitly incorporate human factors into PFMA/risk analysis?
Failure investigations have traditionally focused on physical factors, but human factors
are often an important part of the story of failure
Searching for one or a few root causes may be oversimplified; we may need to tell a complex story
Hindsight bias and fundamental attribution bias distort our understanding of why people did what they did; put yourself in their shoes, and try the substitution test
It's just as important to ask what people didn't do (neglecting best practices) as what they did do (errors)
Definitive conclusions may not be reached, so the case may remain open
Core/cutoff wall: silty sand mixed with bentonite clay
No filter or anti-seep collars for conduit
Consequences
Over 100 structures impacted
No fatalities (EAP activated)
$1.1 million legal settlement
Mid to late 1980s: Design with lack of filters/drains and inadequate depth of cutoff
1990 and 1991: Construction using erodible soils for embankment, and permeable soils for core and cutoff
1993: Normal pool reached and wet spots appeared on downstream face; responded with remedial installation of drains at downstream face and toe
1999: Seepage around conduit outlet and silt in outlet basin; responded with remedial excavation/backfilling around outlet, but no flow in drains after 2 months (likely due to clogging of filter fabric)
2004: Piping failure 13 years after construction; sinkhole found in upstream face
Mid to late 1980s: Design was apparently led by a young Engineer with little or no prior dam design experience, apparently with little or no peer review
Geotechnical modeling was apparently not performed for seepage and piping
The design had unconservative and non-redundant seepage/piping controls,
including lack of filters, drains, and sufficient cutoff depth, and lacked monitoring
systems to detect piping as found in similar dams
The plans were of poor quality and had no PE seal, thus not meeting professional
standards
1990 and 1991: Construction using erodible and permeable soils (test results did indicate excessive permeability, but this warning sign was missed)
Apparently the first major project of the contractor, raising questions about
expertise
Construction inspection was inadequate, as evidenced by missed warning signs
such as conduit defects
2002: Same Engineer was authorized by owner to inspect annually and study seepage, and the Maintenance Person was directed to inspect weekly
Seepage analysis apparently was not performed
Maintenance Person lacked expertise
Interaction of human and physical factors was fairly intense from design until
failure, and the Engineer is a lead character in the story
Inspections by Engineer and others showed warning signs of piping, but they weren't interpreted as warning signs (denial due to confirmation bias?)
For human factors, safety demand was high, and safety capacity was low
Primary drivers of failure were substantial
Pressures from non-safety goals included social pressure related to relationship
between owner and engineer, and possibly also personal agenda of the engineer
Human fallibility and limitations were evident with respect to misperception,
ignorance, unreliable intuition, inaccuracy of models, and cognitive biases
There was substantial physical complexity related to seepage and piping processes
February/March 2006: 42 days of heavy rain, which was the 2nd or 3rd wettest such period over the past 50 years
March 14, 2006, 5:00 am: 24 days into the period of heavy rain, the dam breached, apparently due to about 2 feet of maximum overtopping near the former spillway (no spillway was found after the breach), with flood depth of 10 to 30 feet
1998: Owner was cautioned by a local real estate agent (by fax) that the spillway had been filled, which would result in overtopping, and recommended restoring the spillway, but there was apparently no response from Owner and no remedial action
1999 to 2001: DLNR sent three letters to Owner to schedule dam inspection, and a letter recommending review or development of an EAP; there were no responses from Owner, no inspections, and no EAP developed (Ka Loko still had a low-hazard classification, but regulations required inspection every 5 years)
1999 to 2006: DLNR lost funding in 1999 for consultant inspections, lost more funding in the following years, and its supervisor retired in 2005 (leaving 1.5 FTE for dam safety versus about 6.5 FTE desirable), so no inspections were performed in 2005 or early 2006
February/March 2006: 42 days of heavy rain, which was the 2nd or 3rd wettest such period over the past 50 years
Late February 2006: Small bridge was destroyed by flooding near the reservoir, so several people (none from DLNR) inspected the dam, but the lack of a spillway was not noted
March 14, 2006, 5:00 am: 24 days into the period of heavy rain, the dam breached, apparently due to about 2 feet of maximum overtopping near the former spillway (no spillway was found after the breach)
Flood depth of 10 to 30 feet
7 fatalities (including a pregnant woman) about 16 minutes after the breach
Prison time for Owner due to reckless endangerment, and a civil settlement of
many millions of dollars
Owner had grading done, despite lacking dam expertise and permits (possible
overconfidence bias, and lack of deference to expertise, peer review,
diverse team, information documentation and sharing, and
professional/ethical/legal standards)
Grading was reportedly done to increase property value and create a scenic
location for a home for the Owner (profit pressure and personal agenda)
Owner and many others appeared to not understand the need for a spillway (lack
of expertise and unreliable intuition), which greatly reduced the design safety
margin and redundancy, and contributed to rapid failure (compromised general
design)
The two people who did understand the risk of filling the spillway expressed their concern only to Owner (personal relationship), but Owner didn't act on their warnings (missed warning sign and possible denial bias)
DLNR had funding cuts and was very understaffed, hence no inspection of Ka Loko Dam despite the required 5-year interval (cost and schedule pressure and falling short of the legal standard), and such an inspection would very likely have identified the lack of a spillway (missed warning sign)
Government agencies (other than DLNR) inspected grading violations, but focused
on environmental damage rather than dam safety (missed warning sign)
Human factors and physical factors interacted, but human factors dominate
the story, and the Owner is the lead character in the story
DLNR likely would have inspected the dam and addressed the lack of a spillway, even with the low-hazard classification, if funding cuts hadn't created major cost and schedule pressures
Those warnings may not have been conveyed to DLNR because of social
pressures, and political pressure also contributed to missing warning signs
Risk management was compromised in all three ways: ignorance, complacency, and
overconfidence
Owner's approach didn't reflect a safety culture, and DLNR lacked funding for a safety culture
Nearly all best practices were neglected, including safety margins, accurate hazard
classification, information sharing, diverse teams, deference to expertise, peer review,
professional/legal standards, and addressing warning signs
Example of a successful project despite high risks: best practices enable success even in highly difficult circumstances
Scope for Alvi Associates included inspection, forensic investigation, design, and
construction management (15-year project)
Stability analysis was performed for gatehouse, with parametric sensitivity study
to address uncertainties, and revealed many scenarios with factor of safety < 1.0
Due to failure consequences on the order of $100 million, risk was judged high
enough to warrant spending $6 million on rehabilitation
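As a rough, back-of-envelope illustration (not the project's actual risk analysis) of how a high consequence can justify the rehabilitation expenditure, the following Python calculation uses the roughly $100 million consequence and $6 million rehabilitation cost from the text, while the annual failure probability and time horizon are invented purely for illustration.

    # Illustrative risk comparison; probability and horizon are assumptions, not project values.
    consequence = 100_000_000      # approximate failure consequence ($), from the text
    rehab_cost = 6_000_000         # rehabilitation cost ($), from the text
    annual_failure_prob = 0.005    # assumed for illustration only
    horizon_years = 30             # assumed service period for comparison

    expected_loss = annual_failure_prob * consequence * horizon_years
    print(f"Illustrative expected loss over {horizon_years} years: ${expected_loss:,.0f}")
    print(f"Rehabilitation cost: ${rehab_cost:,.0f}")
    # Under these assumed numbers, the $15,000,000 expected loss exceeds the
    # $6,000,000 rehabilitation cost, illustrating the kind of judgment described above.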
38 post-tensioned steel threadbar anchors
Anchor lengths from 48 to 70 feet
Anchor slopes of 6 and 15 degrees
Core-drilling for anchors
Underwater construction at water depths reaching over 100 feet
Many features to avoid
Irfan A. Alvi, PE
ialvi@alviassociates.com