Chapter 12:
Introducing Evaluation
The aims
• To illustrate how observation, interviews and
questionnaires that you encountered in
Chapters 7 and 8 are used in evaluation.
• To explain the key concepts and terms used in
evaluation.
• To introduce three main evaluation
approaches and key evaluation methods within
the context of real evaluation studies.
Six evaluation case studies
• Evaluating early design ideas for a mobile
device for rural nurses in India.
• Evaluating cell phones for different markets.
• Evaluating affective issues: challenge and
engagement in a collaborative immersive game.
• Improving a design: the HutchWorld patient
support system.
• Multiple methods help ensure good usability:
the Olympic Message System (OMS).
• Evaluating a new kind of interaction: an
ambient system.
Why, what, where and when
to evaluate
Iterative design & evaluation is a continuous
process that examines:
• Why: to check that users can use the product and
that they like it.
• What: a conceptual model, early prototypes of a
new system and later, more complete prototypes.
• Where: in natural and laboratory settings.
• When: throughout design; finished products can be
evaluated to collect information to inform new
products.
Designers need to check that they understand
users’ requirements.
Bruce Tognazzini tells you why
you need to evaluate
“Iterative design, with its repeating cycle of
design and testing, is the only validated
methodology in existence that will consistently
produce successful results. If you don’t have
user-testing as an integral part of your design
process you are going to throw buckets of
money down the drain.”
See AskTog.com for topical discussions about
design and evaluation.
The language of evaluation
• Analytical evaluation
• Controlled experiment
• Field study
• Formative evaluation
• Heuristic evaluation
• Predictive evaluation
• Summative evaluation
• Usability laboratory
• User studies
• Usability studies
• Usability testing
• User testing
Evaluation approaches
• Usability testing
• Field studies
• Analytical evaluation
• Combining approaches
• Opportunistic evaluations
Characteristics of approaches
           Usability testing    Field studies    Analytical
Users      do task              natural          not involved
Location   controlled           natural          anywhere
When       prototype            early            prototype
Data       quantitative         qualitative      problems
Feedback   measures & errors    descriptions     problems
Type       applied              naturalistic     expert
Evaluation approaches and
methods
Method           Usability testing    Field studies    Analytical
Observing        x                    x
Asking users     x                    x
Asking experts   x                                     x
Testing          x
Modeling                                               x
Evaluation to design a mobile
record system for Indian AMWs
• A field study using observations and
interviews to refine the requirements.
• It would replace a paper system.
• It had to be easy to use in rural
environments.
• Basic information would be recorded:
identify each household, head of house,
number of members, age and medical history
of members, etc.
Could these icons be used
with other cultures?
For more interesting examples of mobile
designs for the developing world
see Gary Marsden’s home page:
http://people.cs.uct.ac.za/~gaz/research.html
Evaluating cell phones for
different world markets
• An already existing product was used as
a prototype for a new market.
• Observation and interviews.
• Many practical problems needed to be
overcome: Can you name some?
• Go to www.nokia.com
and select a phone or
imagine evaluating
this one in a country
that Nokia serves.
Challenge & engagement in a
collaborative immersive game
• Physiological measures
were used.
• Players were more
engaged when playing
against another
person than when
playing against a
computer.
• What were the
precautionary
measures that the
evaluators had to
take?
What does this data tell you?
(High standard deviations indicate more variation.)

               Playing against computer    Playing against friend
               Mean      St. Dev.          Mean      St. Dev.
Boring         2.3       0.949             1.7       0.949
Challenging    3.6       1.08              3.9       0.994
Easy           2.7       0.823             2.5       0.850
Engaging       3.8       0.422             4.3       0.675
Exciting       3.5       0.527             4.1       0.568
Frustrating    2.8       1.14              2.5       0.850
Fun            3.9       0.738             4.6       0.699

Source: Mandryk and Inkpen (2004).
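Summary tables like the one above are computed from raw participant ratings. A minimal sketch of that calculation, assuming 5-point Likert-style ratings; the ratings below are invented for illustration and are not the study's data.

```python
# Compute the mean and sample standard deviation of ratings for one
# adjective (e.g. "Fun") in each play condition, as in tables like
# Mandryk and Inkpen's. All numbers here are hypothetical.
from statistics import mean, stdev

# Hypothetical 5-point ratings from ten players per condition.
fun_vs_computer = [4, 3, 4, 5, 4, 3, 4, 5, 4, 3]
fun_vs_friend = [5, 4, 5, 5, 4, 5, 5, 4, 5, 4]

for label, ratings in [("computer", fun_vs_computer),
                       ("friend", fun_vs_friend)]:
    # stdev() is the sample standard deviation (n - 1 denominator).
    print(f"vs {label}: mean={mean(ratings):.1f}, st.dev.={stdev(ratings):.3f}")
```

A higher mean suggests the adjective fit that condition better; a higher standard deviation means players disagreed more.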
The HutchWorld patient
support system
• This virtual world supports
communication among
cancer patients.
• Privacy, logistics, patients’
feelings, etc. had to be
taken into account.
• Designers and patients
speak different languages.
• Participants in this world
can design their own avatar.
Look at the “My
appearance” slide that
follows. How would you
evaluate it?
My Appearance
Multiple methods to evaluate
the 1984 OMS
• Early tests of printed scenarios & user guides.
• Early simulations of the telephone keypad.
• An Olympian joined the team to provide feedback.
• Interviews & demos with Olympians outside the US.
• Overseas interface tests with friends and family.
• Free coffee and donut tests.
• Usability tests with 100 participants.
• A ‘try to destroy it’ test.
• A pre-Olympic field test at an international event.
• Reliability tests of the system under heavy traffic.
Something to think about
• Why was the design of the OMS a
landmark in interaction design?
• Today cell phones replace the need for
the OMS. What are some of the benefits
and losses of cell phones in this
context? How might you compensate for
the losses that you thought of?
Evaluating an ambient
system
• The Hello Wall is a
new kind of system
that is designed to
explore how people
react to its presence.
• What are the
challenges of
evaluating systems
like this?
Key points
• Evaluation & design are closely integrated in user-centered
design.
• Some of the same techniques are used in evaluation as for
establishing requirements, but they are used differently
(e.g. observation, interviews & questionnaires).
• Three main evaluation approaches are:
usability testing, field studies, and analytical evaluation.
• The main methods are: observing, asking users, asking
experts, user testing, inspection, and modeling users’ task
performance.
• Different evaluation approaches and methods are often
combined in one study.
• Triangulation involves using a combination of techniques to
gain different perspectives, or analyzing data using different
techniques.
• Dealing with constraints is an important skill for evaluators
to develop.
A project for you …
• “The Butterfly Ballot: Anatomy of a
Disaster” is an interesting account written
by Bruce Tognazzini that you can find by
going to AskTog.com and looking through
the 2001 columns.
• Alternatively go directly to:
http://www.asktog.com/columns/042ButterflyBallot.html
A project for you …
continued
• Read Tog’s account and look at the picture
of the ballot card.
• Make a similar ballot card for a class
election and ask 10 of your friends to vote
using the card. After each person has voted
ask who they intended to vote for and
whether the card was confusing. Note down
their comments.
• Redesign the card and perform the same
test with 10 different people.
• Report your findings.
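When reporting your findings, a simple error rate per ballot design makes the comparison concrete. A minimal sketch of the tally; the voter data and candidate names below are hypothetical, not part of the exercise.

```python
# Tally a ballot-card test: compare each voter's intended choice with
# the choice the card actually recorded, per design. Data are made up.
def error_rate(results):
    """results: list of (intended, recorded) pairs from one ballot design."""
    errors = sum(1 for intended, recorded in results if intended != recorded)
    return errors / len(results)

# Hypothetical results from ten voters per design.
original_card = [("Ann", "Ann"), ("Bo", "Cy"), ("Ann", "Ann"), ("Cy", "Cy"),
                 ("Bo", "Ann"), ("Ann", "Ann"), ("Cy", "Cy"), ("Bo", "Bo"),
                 ("Ann", "Cy"), ("Bo", "Bo")]
redesigned_card = [("Ann", "Ann"), ("Bo", "Bo"), ("Cy", "Cy"), ("Ann", "Ann"),
                   ("Bo", "Bo"), ("Cy", "Cy"), ("Ann", "Ann"), ("Bo", "Bo"),
                   ("Cy", "Cy"), ("Ann", "Ann")]

print(f"original:  {error_rate(original_card):.0%} mis-recorded")
print(f"redesign:  {error_rate(redesigned_card):.0%} mis-recorded")
```

Pair the error rates with the voters' comments: the numbers show whether the redesign helped, and the comments suggest why.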