If you are interested in the topic please register to the ALIAS network:
http://network.aliasnetwork.eu/
to download other materials and get information about the ALIAS project (www.aliasnetwork.eu).
Air disasters as organisational errors: the case of Linate by M. Catino
1. ALIAS Conference 14-15 June 2012, EUI - Florence (Italy)
Air disasters as organizational errors: the case of Linate
Prof. Maurizio Catino
University of Milan - Bicocca (Italy)
maurizio.catino@unimib.it
10. Why? Who is to blame?
• The Cessna pilots' mistake
• The ground controller's error
• The inadequate condition of the signals
• The absence of a ground radar
• The airport management's negligence
• A tragic twist of fate
• …
11. The Error of Human Error…
“... ‘human error’ is not a well defined category of
human performance. Attributing error to the actions
of some person, team, or organisation is
fundamentally a social and psychological process
and not an objective, technical one.”
(Woods et al., 1994)
The "human error" reasoning cycle:
1. Assume that the source of failure is "human error"
2. Analyse events to find where a person is involved
3. Stop the analysis when one is found
12. A multilevel model for the analysis of accidents

Inter-organizational level
- Integration
- Coordination
- …

Organizational level
- Defences
- Managerial decisions
- Error-inducing conditions
- …

Individual level
- Errors, violations, mistakes, decisions

Failures at all three levels pass through the system's defences and combine into the Accident.

(Catino 2010)
13. 1. Individual Level
• The Cessna and its two pilots were not qualified and certified to operate in low-visibility conditions (landing and take-off) such as those of that day (violation)
• The Cessna crew took the wrong taxiway (error) and entered the runway without specific clearance (violation)
• There were communication failures between the tower and the Cessna pilots: the ground controller did not realize that the Cessna was on taxiway R6 (error), and he issued a clearance to taxi towards the main apron although he could not make sense of the reported position S4
14. 2. Organizational Failures

Failed defences
• No Surface Movement Radar (out of service since November 1999)
• Installed equipment for the prevention of runway incursions at the R6 intersection deactivated (TWY lights, stop bars)

Error-inducing conditions
• The ground markings were not clearly visible (RWY Holding Position Markings)
• Signs, signals and lights were inadequate and misleading (outside ICAO standards)
• Official documentation failed to report the presence of unpublished markings (S4, S5, etc.)

Latent failures
• No learning from near misses
• Best practices not applied
• No functional Safety Management System
16. Individual Failures vs. Organizational and Inter-organizational Failures

Individual failure: The Cessna crew took the wrong taxiway (error) and entered the runway without specific clearance (violation)
Related organizational failures: Markings and signs not in accordance with ICAO standards; red bars and TWY lights not controllable by ATC; deficiencies in the implementation and maintenance of standard airport signage; official documentation failed to report the presence of unpublished markings (S4); no equipment to prevent runway incursions

Individual failure: Communication failures between the tower and the Cessna pilots
Related organizational failures: No surface movement radar; installed equipment for the prevention of runway incursions at the R6 intersection deactivated; markings and signs not in accordance with ICAO standards; deficiencies in the implementation and maintenance of standard airport signage; non-compliance with international standards on markings, lights and signs; high traffic volume; lack of visual aids

Individual failure: The Cessna and its two pilots were not qualified and certified to operate in low-visibility conditions (landing and take-off) such as those of that day (violation)
Related inter-organizational failures: Lack of coordination among the airport authorities; weaknesses in the control system
17. Failure Levels

Inter-organizational level
• Cost/safety trade-offs
• Failures of integration and coordination
• Bureaucratic safety culture
• No Safety Management System
• …

Organizational level
• No ground radar
• No international safety standards
• Weak defenses
• Lack of visual aids
• No learning from near misses
• …

Individual level
• Errors
• Violations
• Communication misunderstandings
18. Active versus Latent Failures

Inter-organizational factors (latent conditions):
• Coordination neglect
• Inadequate safety policies

Organizational factors (latent conditions):
• No ground radar; no international standards
• No learning from near misses; …

Preconditions for unsafe acts (latent conditions):
• Poor visibility of R5/R6 signs; mental fatigue
• S4 marking unknown to the controller; …

Unsafe acts (active conditions):
• The Cessna crew took the wrong taxiway and entered the runway
• Communication failures

Failed or absent defenses → Accident & injury

(Adapted from Reason, 1997)
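The layered structure above can be illustrated numerically. As a minimal sketch (my own illustration, not part of the slides or of Reason's text): an accident trajectory must find a hole in every defensive layer, so with independent layers the chance of an accident is the product of each layer's failure probability. The layer names and all numbers below are assumptions chosen to echo the Linate findings.

```python
def p_accident(layer_hole_probs):
    """Probability that a hazard penetrates every defensive layer,
    assuming the layers fail independently (a strong simplification)."""
    p = 1.0
    for hole_p in layer_hole_probs:
        p *= hole_p
    return p

# Hypothetical layers echoing the Linate case (illustrative numbers only):
layers = {
    "surface movement radar": 1.0,   # out of service: no defense at all
    "stop bars / TWY lights": 1.0,   # deactivated: no defense at all
    "standard markings and signs": 0.5,  # misleading, only partially effective
    "radio communication": 0.2,      # last barrier left to catch the error
}

print(p_accident(layers.values()))
```

The point of the sketch is that when latent failures disable whole layers (probability 1.0 of letting the hazard through), safety rests entirely on the few barriers left, which is exactly the "weak defenses" condition described in the previous slides.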
19. Conclusions
• If we focus too closely on the unsafe acts at the sharp end, we risk missing the fact that the accident was the result of an organizational error
• It is important to take a system perspective
• Communication and organization problems of many kinds were crucial factors in this and other disasters
20. Two ways of looking at accidents

Errors and accidents can be viewed through two lenses:
• Individual Blame Logic
• Organizational Function Logic
21. Vicious Circle

Individual Blame Logic (the search for the guilty) sustains a vicious circle:
• Blame culture
• Defensive behavior
• Hidden errors
• Organizational inertia
22. Defensive Medicine?
• Defensive medicine takes place when healthcare
personnel prescribe unnecessary treatments, or avoid
high-risk procedures, with the goal of reducing their
exposure to malpractice litigation
• Doctors in particular may:
• prescribe unnecessary tests, procedures or
specialist visits (positive defensive medicine),
• or, alternatively, avoid high-risk patients or
procedures (negative defensive medicine).
23. Defensive Medicine

Study                          Year   Country   Defensive behaviours
Tancredi                       1978   US        70%
Studdert et al.                1995   US        93%
Summerton                      2000   UK        90%
Hiyama et al.                  2006   Japan     98%
Jackson Healthcare             2008   US        72%
Massachusetts Medical Society  2009   US        83%
26. The side effects of defensive medicine
• The threat of legal investigation does not make the
medical system more careful and attentive toward the
patient
• Individual blame logic does not improve patient safety
• Organizations should instead develop the capacity to learn from errors and system failures, becoming more resilient and reliable
• To achieve this, a profound cultural and juridical transformation is required
• A different culture must be promoted, to reduce defensive medicine and to foster a process of learning from error
27. Virtuous Circle

Organizational Function Logic (the search for organizational criticalities) sustains a virtuous circle:
• Just culture
• Reporting of close calls and errors
• Removal of latent factors
• Organizational learning
28. Getting the balance right

Person model: proximal factors, individual responsibility
System model: remote factors, collective responsibility

Both extremes have their pitfalls
(Reason, 1997)
29. Blame-free, Just and Punitive Cultures

• Blame-free culture: all errors are attributed to system failure; no individual is held accountable
• Just culture: the middle ground between the two extremes
• Punitive culture: individuals are blamed for all mistakes
31. Establishing a Just Culture

• Human error (inadvertent action: slips, lapses, mistakes) → Reassure
• At-risk behavior (a choice: the risk is not recognized, or is believed justified) → Coach
• Reckless behavior (conscious disregard of an unreasonable risk) → Punish
• Malicious behavior (violations, gross negligence, criminal offences) → Punish

The spectrum runs from unintentional (no blame) to deliberate (culpable).
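The decision rule on this slide can be sketched as a simple lookup. This is my own hedged illustration, not ITAF's or any organization's actual procedure; the category names and response labels are taken from the slide, everything else is an assumption.

```python
def just_culture_response(behavior: str) -> str:
    """Map a behavior category to the organizational response suggested
    by the just-culture model: the response depends on intent and risk
    awareness, not on the severity of the outcome."""
    responses = {
        "human_error": "reassure",  # inadvertent slips, lapses, mistakes
        "at_risk": "coach",         # a choice; risk not recognized or believed justified
        "reckless": "punish",       # conscious disregard of an unreasonable risk
        "malicious": "punish",      # violations, gross negligence, criminal offences
    }
    return responses[behavior]

# A pilot who misreads an ambiguous sign made an inadvertent error:
print(just_culture_response("human_error"))  # reassure, not punish
```

Note that the same outcome (say, a runway incursion) maps to different responses depending on the behavior behind it, which is the core of the "getting the balance right" argument above.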
32. The Case of the Italian Air Force

• 20 flight divisions; 1,000 pilots
• 1990: the "Casalecchio di Reno" accident: 12 people died
• A new organization, a new culture
33. New risk and safety policy
• The promotion of a new vision of risk management and
safety
• The promotion of methods for the identification,
analysis and prevention of risks (critical latent factors)
• Database for incident reporting (voluntary and
anonymous for the centre)
• Ongoing training and education about safety and
perception of errors in order to learn from them
• The implementation of a just culture
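The voluntary, anonymous reporting database mentioned above can be made anonymous by construction. The sketch below is an assumption of mine (the record name `IncidentReport` and all fields are hypothetical, not the ITAF schema): the schema simply carries no field identifying the reporter, so reports can feed learning rather than blame.

```python
from dataclasses import asdict, dataclass, field


@dataclass
class IncidentReport:
    """A hypothetical voluntary incident report: structurally anonymous,
    because no name, rank, or unit field exists in the schema."""
    date: str                  # when the event occurred
    event_type: str            # e.g. "near miss", "runway incursion"
    description: str           # free-text narrative of what happened
    latent_factors: list[str] = field(default_factory=list)  # analyst-tagged


report = IncidentReport(
    date="2001-10-08",
    event_type="near miss",
    description="Taxiway confusion in low visibility",
    latent_factors=["misleading signage", "no ground radar"],
)
print(sorted(asdict(report)))  # no reporter-identifying keys present
```

Making anonymity a property of the data model, rather than a policy promise, is one way to support the trust that voluntary reporting requires.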
34. Two different strategies: compliance vs. deterrence

A deterrence strategy (blame culture):
• is backward-looking
• is implemented after the accident happens
• is punitive: sanctions are directed at the individuals or organizations held responsible for an error or accident

A compliance strategy (ITAF just culture):
• is forward-looking and preventive
• aims at the early identification of errors and latent factors
35. Just culture at ITAF
(extracts from interviews)
• For each event we look for the reason why it happens.
We do not talk about blame and responsibility. We do
not want to know who the guilty person was but why the
event happened and what we can do to avoid it in the
future.
• Error is a mechanism for learning (… there are some errors that, if analyzed, can help prevent future errors).
• The more people I inform about my error, the less they risk repeating it.
• The organization does not put pressure on people
committing an error. Nobody is afraid of being punished.
The debriefings are a training activity to talk and improve
our work. The exchange among experts and newcomers
is a good occasion for both people as it helps to see
things from different points of view.
39. Conclusion

Either organizations manage human errors, by learning from them,
or human errors will manage organizations.

To achieve the first, it is fundamental to develop a just culture.