The most common analysis methods are:

 Students will be able to:
 List items in an accident investigation (AI) plan
 List items to include in an AI kit
 Explain why human error could be a cause or a symptom of a system problem
 Distinguish between active and latent errors
 Convince management of the need for identifying symptom errors
 Begin to identify personal biases
 Avoid blame as an outcome of AI based on this theory

The accident investigation process involves the following steps:
 Report the accident occurrence to a designated person within the organization
 Provide first aid and medical care to injured person(s) and prevent further injuries or damage
 Investigate the accident
 Identify the causes
 Report the findings
 Develop a plan for corrective action
 Implement the plan
 Evaluate the effectiveness of the corrective action
 Make changes for continuous improvement
 The accident investigation plan should address:
investigator training - Where?
investigation kits - What?
the investigation priorities - Which ones?
gathering of evidence - Who?
preservation of evidence - Who & How?
Human Error and Accident Management offers ways to recognize and prevent these behaviors (errors).
 Provides a means to control and recover from these behaviors when they do occur, and to contain and escape from their adverse consequences.
 New approach (last 10 years)
Error theory is not new (focused on moral decisions)
First focused on just unsafe acts (cause v. system)
 Origin in major disasters: Three Mile Island, aviation accidents, the Challenger disaster
 Caution: avoid the blame game

 Human error is the cause of accidents
To explain a failure, you look for a failure
You must find the person's inaccurate assessments, wrong decisions, and bad judgments
 Human error is a symptom of trouble deeper inside a system
To explain failure, do not try to find where people went wrong
Instead, find how the person's assessments and actions made sense at the time, given the circumstances that surrounded them
 Concept Check: Are we all on the “same page?”
 What is your concept of Human Error?
 Give examples of Human Error
 Forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness (J. Reason, Western Journal of Medicine, June 2000)

Four categories according to James Reason:
 slips
 lapses
 violations
 mistakes
 Random versus Systemic Errors
What's the difference?
Is one type easier to control than the other?
 Based on aviation accidents (pilot error)
 Active - Human Error
Cognitive error
Distraction
Inattention
 Latent - Systematic
Inadequate supervision
 Active errors become very visible in the evolution of an event.
 The active errors are also the most obvious occurrences and the most rapidly identified human contributors in an accident.
 The higher in the organization these latent errors are made, the more serious the consequences at the front-line operation.
 Latent errors of a strategic nature, such as defining company policies, affect safety attitudes and the safety culture in the organization.
 These are the most serious and dangerous errors to be tackled.
 Also see the terms “covert failure” and “operationally invisible” in the literature
 Technique for Human Error Rate Prediction (THERP) - a quantitative method; a simplified numerical sketch of its arithmetic follows
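
In broad terms, THERP decomposes a task into steps, assigns each step a nominal human error probability (HEP) from lookup tables, adjusts it with performance shaping factors (PSFs), and combines the steps in an event tree. The sketch below illustrates only that basic arithmetic, assuming independent steps and using made-up HEP and PSF values rather than the published tables.

    # Minimal THERP-style arithmetic (illustrative values only, not the
    # published lookup tables; task steps are assumed to be independent).

    def adjusted_hep(nominal_hep, psfs):
        # Scale a nominal human error probability by its performance
        # shaping factors, capping the result at 1.0.
        hep = nominal_hep
        for factor in psfs:
            hep *= factor
        return min(hep, 1.0)

    def task_failure_probability(steps):
        # Probability that at least one step in the task fails.
        p_all_steps_succeed = 1.0
        for nominal, psfs in steps:
            p_all_steps_succeed *= 1.0 - adjusted_hep(nominal, psfs)
        return 1.0 - p_all_steps_succeed

    # Hypothetical two-step task: read a gauge, then line up the correct valve.
    steps = [
        (0.003, [2.0]),  # read gauge; PSF of 2 for poor lighting (made up)
        (0.001, [5.0]),  # select valve; PSF of 5 for time stress (made up)
    ]
    print(task_failure_probability(steps))  # roughly 0.011

A full THERP analysis would also model recovery factors and dependence between steps; the point here is only how nominal probabilities and shaping factors combine into a task-level error rate.
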
 Accident Investigation Process
What are some ways you, as an investigator, can identify human errors as they contribute to the accident sequence?
Are human errors the root causes of accidents? Why or why not?
What role does your knowledge about human error play in your investigation process?

WAS THE POSSIBILITY OF THE ERROR KNOWN? *
WERE THE POTENTIAL CONSEQUENCES OF THE ERROR KNOWN? *
WHAT ABOUT THE ACTIVITY MADE IT PRONE TO THE OCCURRENCE OF THE ERROR?
WHAT ABOUT THE SITUATION CONTRIBUTED TO THE CREATION OF THE ERROR?
WAS THERE AN OPPORTUNITY TO PREVENT THE ERROR PRIOR TO ITS OCCURRENCE? *
ONCE THE ERROR WAS COMMITTED, WAS THERE ANY WAY TO RECOVER FROM IT? *
WHAT ABOUT THE SYSTEM SUSTAINED THE ERROR INSTEAD OF TERMINATING IT?
WHAT FED THE ERROR, AND DROVE IT TO BECOME A BIGGER PROBLEM?
WHAT MADE THE CONSEQUENCES AS BAD AS THEY WERE?
WHAT (IF ANYTHING) KEPT THE CONSEQUENCES FROM BEING WORSE?

* IF YES, WHY DID THE EVENT PROCEED BEYOND THIS POINT? IF NO, WHY NOT?
Based on tonight's discussion you should be able to:
 List items in an AI plan
 List items to include in an AI kit
 Explain why human error could be a cause or a symptom of a system problem
 Distinguish between active and latent errors
 Convince management of the need for identifying symptom errors
 Begin to identify personal biases
 Avoid blame as an outcome of AI based on this theory
 http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1070929/
 http://www.eurocontrol.int/eec/gallery/content/public/document/eec/report/2006/017_Swiss_Cheese_Model.pdf