
Tutorial 4:
Case Study
“Set Phasers on Stun”
SY DE 142 – June 7, 2004
Introduction to Human Systems Engineering
Waterloo, Ontario, Canada
Outline
 Case Study: “Set Phasers On Stun”
 Discussion on “Set Phasers on Stun”
 Midterm Overview
Set Phasers on Stun
Overview
 Time: 1986
 Place: East Texas Cancer Center, Tyler
 Synopsis: A computer glitch turns a miracle machine into a monster for one cancer patient. A mode error combined with a lack of feedback delivers a blast of 25,000 rads down onto the patient.
Set Phasers on Stun
Interface Design
 Draw out the general Human machine system
model and redraw it for this case.
 What feedback was available to Mary Beth
and what was missing?
Set Phasers on Stun
“Information Displays”
Human = Mary Beth
Interface = Therac control panel
Machine = Therac
World = Patient
a: Mary Beth's command
b: control signal
c: rays
d: patient state
e: feedback on Therac state and actions
f: feedback on interface state and actions (control signal sent)
g: interface information
Set Phasers on Stun
Feedback
 Mary Beth needed to know:
  that the control signal was sent
  the Therac mode
  that the Therac had sent out rays
  the patient state
 14 marks: 5 for drawing the model, 5 for redrawing it, and 4 for the feedback



Midterm Overview
SY DE 142 Midterm:
 Date: June 14, 2004
 Time: 1:30 - 3:30 pm
 Room: DC 1350
 Aids Allowed:
  Textbooks: Wickens and Set Phasers on Stun
  Calculator
 Solutions must be written in pen, not in pencil.
Case Studies
Business in Bhopal
Silent Warning
In Search of the Lost Cord
An Act of God
The Wizards of Wall Street
Set Phasers on Stun
Films
Death on the Job
Bhopal, a Lingering Tragedy
Why Planes Crash
Broken Bus
Course Material Outline
 Accident Analysis and Fault Trees
 Mappings and Affordances
 Gulfs of Execution and Evaluation
 Human Action Cycle
 Information Processing
 Human Decision Making
 Human Error - Mistakes
 Human Error - Slips
 Human-Machine Model
 Displays
 Control
 Human-Computer Interaction
 Usability Testing
 Automation
 More details on the slides and in the book.
Accident Analysis and
Fault Trees
 Linear vs. nonlinear interactions
 Common mode interactions
 Tight vs. loose coupling
 FMECA
 Fault Tree Analysis:
  Chronological; shows causality
  Events: action and time (time is often implicit)
  AND/OR gates (see the worked gate example below)
  Last event at the top
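A quick reminder of how gate probabilities combine for independent basic events (the 0.1 values below are made up purely for illustration, not from the course):

$P_{\mathrm{AND}} = P(A)\,P(B)$
$P_{\mathrm{OR}} = 1 - \bigl(1 - P(A)\bigr)\bigl(1 - P(B)\bigr)$

For example, if P(A) = P(B) = 0.1, an AND gate gives 0.1 × 0.1 = 0.01 while an OR gate gives 1 - 0.9 × 0.9 = 0.19, so an OR gate near the top of a tree makes the top event far more likely than an AND gate over the same basic events.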
Mappings and Affordances
 Mapping: the relation between an action and its result in the world
  Helps automatic processing when the mapping between the world and the required action is extremely strong
  Two kinds: natural (steering wheel), social/cultural (light switch)
 Affordance: the perceived and actual properties of things that help direct users' actions; should be applied as a design principle
  "Affordances become visible by establishing mappings (what it does, how it works)"
Gulfs of Execution and
Evaluation (and HAC)
 Gulfs:
  Execution: you have an intention but can't figure out the action (the gap between the sequence-of-actions and action stages in the Human Action Cycle)
  Evaluation: you can't figure out whether the goal has been achieved
Human Action Cycle (diagram): Goal → Intention → Sequence of Actions (what should be done) → Act on the World (gulf of execution!); then Perception (how is the state of the world perceived? use the senses) → Interpret → Evaluate against the goal (gulf of evaluation!).
Information Processing
“How we Think”
 Memory
  Short-term, long-term, how to improve it, knowledge in the head vs. knowledge in the world
 Perception
  Feature analysis (bottom-up processing), unitization, top-down processing, and their design implications
 Attention
  Selective, divided, and their design implications
  Resource model, multiple resource model
More Information Processing
“How we Think”
 Situation awareness (SA): being aware of the meaning of dynamic changes in the environment
  3 stages: perceive, understand, predict
  Measuring SA: the Situation Awareness Global Assessment Technique (SAGAT)
 Decision making
  Normative models (methods: multi-attribute utility theory, expected value theory, SEUT); see the worked example after this list
  Descriptive models (satisficing rather than optimizing; heuristics and biases as easier ways of thinking)
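A quick worked example of the normative calculation (the probabilities and dollar amounts are invented purely for illustration); SEUT simply swaps in subjective probabilities s(p) and utilities u(v):

$\mathrm{EV} = \sum_i p_i\, v_i \qquad \mathrm{SEU} = \sum_i s(p_i)\, u(v_i)$

Option A offers a 0.8 chance of $50, so EV_A = 0.8 × 50 = 40; option B offers a 0.2 chance of $300, so EV_B = 0.2 × 300 = 60. The normative model says take B; the descriptive models explain why real decision makers often satisfice and take the "safer" A instead.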
Human Decision Making
 Heuristics and biases in human decision making (look at the updated lecture notes)
  They can occur in any of the following stages:
   1. Getting information input (input or cue biases)
   2. Generating and selecting hypotheses (6 biases)
   3. Plan generation and action choice (4 biases)
 SRK framework
  Skill-based decisions (automated)
  Rule-based decisions (procedural)
  Knowledge-based decisions
Human Error -- mistake
 Mistake: the goal and intention are wrong, but the action carries them out as intended
  Why does it happen?
  Types of mistakes:
   mistaken similarity
   misjudged probability
   rationalizing small events
   social pressures, cultural factors, and money ($)
 Forcing functions
Human Error -- slips
 Slip: right goal and intention but wrong action
  Mostly occurs with skilled behavior (why?)
 Mode error: right action in the wrong mode (therefore the action becomes wrong)
Information Displays
Human-Machine Model
 Human-machine system model elements: user, interface, machine, world
 The cycle begins with action:
  The operator acts on the interface.
  The interface sends a control signal to the machine.
  The machine acts on the world.
 Feedback (4 feedbacks):
  State of the world to the interface
  Action of the machine to the interface
  Indication of the control signal (machine to interface)
  Information from the interface to the operator
 Any missing item may cause an accident.
Display contents
 Should permit evaluation and execution
 Display principles:
  Perceptual (legible, give a reference, redundancy, design for distinctive features)
  Mental model (pictorial realism, moving part, ecological)
  Attention (multi-resource, proximity compatibility, information access cost)
  Memory (predictive aids, knowledge in the world, consistency)
Display forms
 Digital vs. Analog (precision vs. change)
 Configural displays


Rankine cycle
Polar star display
 Heads-up
 Ecological displays
Control
 Control vs. display: a control is the same as a display until the user interacts with the system through it
 Very important in design; the same guidelines as for displays apply
 Laws and principles (see the worked equations below):
  Hick-Hyman law for reaction time
  Fitts' law for movement time
 Control types: zero order (mouse), first order (steering wheel), and second order (thrust of a shuttle)
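For reference, the two laws in their usual forms; the constants used below (a = 0.2 s, b = 0.1 s/bit) are illustrative values, not numbers from the course:

$RT = a + b \log_2 N$ (Hick-Hyman law, N equally likely alternatives)
$MT = a + b \log_2\!\left(\frac{2A}{W}\right)$ (Fitts' law, movement amplitude A, target width W)

With the assumed constants, going from N = 4 to N = 8 alternatives adds one bit, so RT rises from 0.2 + 0.1 × 2 = 0.4 s to 0.2 + 0.1 × 3 = 0.5 s; likewise, halving the target width W adds one bit to Fitts' index of difficulty and the same b seconds to MT.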
Human-Computer Interaction
 What your focus is as a designer:
  User group: who is using your system (novice, infrequent, frequent expert) and what you should know about these users
  Interaction styles: how the user (based on expertise) will interact with the system (e.g. menu, form, Q&A, command language, function keys, direct manipulation, natural language, ...)
Usability and user testing

Usability Approaches (4)
 Cognitive walkthrough
 Heuristic evaluation (Neilson’s usability
principals)
 Performance measurement
 Field study
 Tasks
 Usability measures (satisfaction, learnability,
errors)
Automation
 When and why use automation
 Classes of automation:
  Information acquisition (warnings, filters)
  Information integration (pattern recognition, expert systems)
  Action selection (TCAS)
  Action execution and control (autopilots, cruise control)
Automation
 Levels of automation
 Reliability issues:
  complacency (over-trust)
  mistrust
  the "dumb and dutiful" effect
 The best form is human-centered automation.
Good luck