Design, Human Factors and Safety Thomas B. Sheridan, ScD Massachusetts Institute of Technology.
Definitions
DESIGN
- is the synthesis of a means to serve a human need
- is an art that makes use of science and technology
ERROR
- is an unwanted, unwonted exchange of energy
RISK of an event
- is some function of the undesired consequences that might occur and the
probability of their occurrence. Two common definitions of risk are:
o (consequences x probability) = expected value, and
o (worst possible outcome)
but risk could be defined in more complex ways
SAFETY
- is acceptable risk
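The two common risk definitions above can be sketched in a few lines of code. This is an illustration with made-up numbers, not an example from the lecture; the function names are my own.

```python
def expected_value_risk(outcomes):
    """Risk as expected value: sum of consequence * probability over outcomes."""
    return sum(consequence * probability for consequence, probability in outcomes)

def worst_case_risk(outcomes):
    """Risk as the worst possible outcome, ignoring probability."""
    return max(consequence for consequence, _ in outcomes)

# Illustrative (made-up) outcomes: (consequence severity, probability)
outcomes = [(100.0, 0.01), (10.0, 0.10), (1.0, 0.50)]

print(expected_value_risk(outcomes))  # 100*0.01 + 10*0.10 + 1*0.50 = 2.5
print(worst_case_risk(outcomes))      # 100.0
```

Note how the two definitions rank hazards differently: a rare catastrophe dominates the worst-case measure but may contribute little to the expected value.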
The Usual Steps in System Design
1. Problem identification, based on
   errors
   inefficiencies
   complaints
2. Task analysis
   observation
   analysis of mental workload
   interviews and focus groups
   activity recording
   analysis of information flows and situation awareness
   simulations
3. Mathematical modeling
   statistical models
   dynamic models
   decision-theoretic models
   event trees
   logic trees
   cause-consequence models
4. Detailed design/redesign, with help of all actors involved
5. Controlled experiments and simulations to refine and validate
6. Pilot testing in situ
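One of the modeling tools listed in step 3, the event tree, can be sketched briefly: an initiating event is followed by a sequence of safety barriers, and each outcome's probability is the product of the branch probabilities along its path. The numbers and barrier names below are illustrative assumptions, not from the lecture.

```python
from itertools import product

def event_tree(p_initiator, barriers):
    """Enumerate every branch sequence of a simple event tree.

    barriers: list of (name, probability_of_success) pairs, in order.
    Returns a list of (outcome_description, probability)."""
    sequences = []
    for outcome in product([True, False], repeat=len(barriers)):
        p = p_initiator
        labels = []
        for (name, p_success), ok in zip(barriers, outcome):
            # multiply in the branch probability for success or failure
            p *= p_success if ok else (1.0 - p_success)
            labels.append(f"{name} {'succeeds' if ok else 'fails'}")
        sequences.append((", ".join(labels), p))
    return sequences

# Hypothetical: initiating event at p=0.1, then an alarm and an operator response
for label, p in event_tree(0.1, [("alarm", 0.99), ("operator response", 0.9)]):
    print(f"{label}: {p:.6f}")
```

The sequence probabilities necessarily sum to the initiating-event probability; the all-barriers-fail sequence (0.1 x 0.01 x 0.1 = 0.0001) is the one a risk analysis would flag.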
What is Human Factors Engineering?
Psychology and systems engineering disciplines applied to
human tasks and human-system interaction, in order to:
understand error causation and inefficiency
redesign the physical environment and technology
redesign tasks and administrative procedures
improve training
(The subset of HFE called ergonomics, which is biomechanics
and physiology applied to spatial arrangements and physical
work, is of diminishing importance as automation takes over
physical work and human tasks become more cognitive.)
SHEL MODEL OF HUMAN INTERFACES
[Diagram: the SHEL components arrayed around the human:]
H - HARDWARE: the human-machine interface
S - SOFTWARE: language
L - LIVEWARE: inter-personal interfaces
L - LIVEWARE: the body
E - ENVIRONMENT: stressors
The Procrustean bed:
forcing the human to fit the technology
DISPLAY-CONTROL COMPATIBILITY
(E.G., THE STOVE BURNER CONTROL PROBLEM)
DESIGN OF SCALES AND NUMBERING
Combining two related variables
into one integrated display
[Figure: pressure-temperature plot with a phase-change line dividing the gas and liquid states]
Ecological display in process control
[Figure: tank schematic showing inflow rates A and B, maximum level, increasing level, outflow rate, and desired outflow range]
Temporal Analysis of Nurse Tasks (in Surgical Procedure)
Concurrence of Exits, Handoffs, Counting Activities with Procedure Benchmarks
[Figure: timeline (12:00 noon to about 7:12 PM) of nurse exits, handoffs, and counting activities against procedure benchmarks: intubation, incision, porta hepatis dissection, liver resection, fascial closure]
Table 3: Safety-Compromising Events and Contributing and Compensatory Factors
[Flattened table: for each safety-compromising event (wound dehiscence; intra-operative tissue injury requiring surgical revision #1 and #2; medication administration error #1 and #2; wound contamination #1 and #2; hypothermia; inadequate preoperative preparation; near-injury to an inexperienced assistant; adverse drug reaction), the original lists contributing factors* and compensatory factors* coded A through Q, and the mode of event detection (self, other, or ND)]
Eve n t
Predictor display (for train)
[Figure: moving-window display of speed (0-340) versus kilometer posts (7-16), showing (1) speed limits, (2) optimum trajectory, (3) prediction, (4) service braking and (5) emergency braking curves, with track curvature plotted below]
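The braking curves such a predictor display draws follow from constant-deceleration kinematics: stopping distance is v^2 / (2a). The deceleration rates below are assumed placeholder values, not figures from the lecture.

```python
def stopping_distance_m(speed_kmh, decel_mps2):
    """Stopping distance in meters at a given speed and constant deceleration."""
    v = speed_kmh / 3.6           # convert km/h to m/s
    return v * v / (2.0 * decel_mps2)

SERVICE_DECEL = 0.5     # m/s^2, assumed service-braking rate
EMERGENCY_DECEL = 1.2   # m/s^2, assumed emergency-braking rate

# At 300 km/h, service braking needs roughly 6.9 km; emergency roughly 2.9 km.
print(round(stopping_distance_m(300, SERVICE_DECEL)))
print(round(stopping_distance_m(300, EMERGENCY_DECEL)))
```

The gap between the two curves is what gives the operator a usable decision window: the display shows both so the driver can act on service braking long before the emergency curve is the only option left.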
Supervisory control
[Diagram: the human operator commands a computer, which drives the controlled process; feedback of the system state returns to the human operator, along with feedback of the computer's understanding of the commands]
Levels of automation
Table 1. A Scale of Degrees of Automation
1. The computer offers no assistance; the human must do it all.
2. The computer suggests alternative ways to do the task.
3. The computer selects one way to do the task and
4.    executes that suggestion if the human approves, or
5.    allows the human a restricted time to veto before automatic execution, or
6.    executes the suggestion automatically, then necessarily informs the human, or
7.    executes the suggestion automatically, then informs the human only if asked.
8. The computer selects the method, executes the task, and ignores the human.
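The scale above can be read as a policy for when software executes and when it informs the human. The sketch below is my own encoding of that reading, not Sheridan's formalization; the function name and return convention are assumptions.

```python
def decide(level, human_approves=None, human_vetoed=False):
    """Return (execute, inform_human) for levels 1-8 of the scale."""
    if level == 1:
        return (False, False)          # human does it all
    if level == 2:
        return (False, True)           # computer only suggests alternatives
    if level in (3, 4):
        return (bool(human_approves), True)   # execute only on approval
    if level == 5:
        return (not human_vetoed, True)       # execute unless vetoed in time
    if level == 6:
        return (True, True)            # execute, then necessarily inform
    if level == 7:
        return (True, False)           # execute, inform only if asked
    if level == 8:
        return (True, False)           # execute, ignore the human
    raise ValueError("level must be 1-8")

print(decide(5, human_vetoed=True))   # (False, True)
print(decide(6))                      # (True, True)
```

Levels 7 and 8 produce the same tuple here, which exposes what the code cannot capture: at level 7 the human can still ask, while at level 8 the loop is closed without them.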
Telepresence (e.g., in materials handling)
[Diagram: the human operator, via a display and control interface, acts through a teleoperator or computer (hardware or software) on real or virtual environmental objects; the operator's mental model includes a sense of being present with those objects]
Reason’s model of an accident:
penetration of multiple barriers
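Reason's barrier model is often given a simple quantitative reading (a common simplification, not stated on the slide): if the barriers fail independently, an accident requires every "hole" to line up, so its probability is the product of the individual penetration probabilities.

```python
def accident_probability(penetration_probs):
    """Probability that a hazard penetrates every barrier,
    assuming the barriers fail independently."""
    p = 1.0
    for p_i in penetration_probs:
        p *= p_i
    return p

# Three illustrative barriers, each penetrated about 1 time in 100:
print(accident_probability([0.01, 0.01, 0.01]))  # on the order of 1e-6
```

The independence assumption is exactly what organizational failures violate: a common cause (budget cuts, fatigue, a flawed procedure) correlates the holes, and the true accident probability can be far higher than the product suggests.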
SOME CAUSES OF HUMAN ERROR
Lack of feedback
Capture
Invalid mental models
Wrong track of hypothesis verification
Stress and perceptual narrowing
Risk homeostasis
State of the nervous system
Shift work: fitness for duty
CIRCADIAN EFFECTS
[Figure: performance and error rate over the 24-hour day (09:00 through noon, midnight, and back to 09:00), showing circadian variation]
ERROR THERAPIES
• Design for ease of use
• Education and training
• Prevention or inhibition of exposure
• Computer-based decision aids
• Alarms
• Posted warnings
Metaphor of Organizational Resilience
to unpredictable incidents and anomalous events