Slides for Karin's talk
Toolkit Support for Usability
Evaluation
05-830 Spring 2013 – Karin Tsai
Overview
• Motivation
• Definitions
• Background from Literature
• Examples of Modern Tools
Motivation
• Improving or validating usability
• Comparing products, A/B tests, etc.
• Measuring progress
• Verifying adherence to guidelines or standards
• Discovering features of human cognition
Usability Attributes
• Learnability – easy to learn
• Efficiency – efficient to use
• Memorability – easy to remember how to use
• Errors – low error rate; easy to recover
• Satisfaction – pleasant to use and likable
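Several of the attributes above lend themselves to simple quantitative measures. A minimal sketch, assuming a hypothetical per-session log format (the fields and numbers are invented for illustration, not part of any standard):

```python
# Quantifying two of the usability attributes above, efficiency and
# errors, from hypothetical per-session logs. The log format here is an
# illustrative assumption.

def efficiency(task_times):
    """Mean task completion time in seconds (lower = more efficient)."""
    return sum(task_times) / len(task_times)

def error_rate(sessions):
    """Fraction of user actions that were errors, across all sessions."""
    errors = sum(s["errors"] for s in sessions)
    actions = sum(s["actions"] for s in sessions)
    return errors / actions

sessions = [
    {"time_s": 42.0, "actions": 30, "errors": 2},
    {"time_s": 35.5, "actions": 25, "errors": 1},
    {"time_s": 51.0, "actions": 40, "errors": 5},
]
print(efficiency([s["time_s"] for s in sessions]))  # mean completion time
print(error_rate(sessions))                         # 8 errors / 95 actions
```

Learnability and memorability can be measured the same way by comparing these numbers across a user's first, later, and post-break sessions.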
Evaluation Categories
• Predictive
– psychological modeling techniques
– design reviews
• Observational
– observations of users interacting with the system
• Participative
– questionnaires
– interviews
– “think aloud” user-testing
Challenges and Tradeoffs
• Quality vs. Quantity
– “Quality” defined as abstraction, interpretability, etc.
– User testing – high quality; low quantity
– Counting mouse clicks – low quality; high quantity
• Observing Context
• Abstraction
– Event reporting in applications places burden on developers
– Complicates software evolution
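The developer burden above comes from scattering event-reporting calls (and event-name strings) across a codebase. One common mitigation is a thin logging façade, sketched below; the class and event names are invented for illustration:

```python
# Sketch of abstracting event reporting: call sites go through one thin
# facade, so when the product evolves only this module (or a constants
# file) changes, not every caller. The backend is a stand-in list.

class EventLog:
    def __init__(self):
        self._events = []  # stand-in for a real analytics backend

    def record(self, name, **props):
        """Report one named event with arbitrary properties."""
        self._events.append((name, props))

    def count(self, name):
        """How many times a given event was reported."""
        return sum(1 for n, _ in self._events if n == name)

log = EventLog()
log.record("checkout_started", cart_size=3)
log.record("checkout_started", cart_size=1)
log.record("checkout_abandoned")
print(log.count("checkout_started"))  # 2
```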
CogTool
• Evaluation Type: Predictive
• Description: Uses a predictive human performance model ("cognitive crash dummy") to evaluate designs.
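CogTool's predictions are rooted in cognitive modeling in the spirit of the Keystroke-Level Model (KLM). A back-of-the-envelope sketch of that idea, using the classic KLM operator times from Card, Moran & Newell; CogTool's actual ACT-R-based model is far more detailed, and the task sequence below is hypothetical:

```python
# Keystroke-Level Model sketch: predict a skilled user's task time by
# summing per-operator time constants (seconds).

KLM = {
    "K": 0.28,  # press a key (average typist)
    "P": 1.10,  # point with a mouse
    "B": 0.10,  # press/release a mouse button
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_seconds(ops):
    """Predicted skilled-user time for a sequence of KLM operators."""
    return sum(KLM[op] for op in ops)

# Hypothetical task: think, point at a field, click, home to keyboard,
# type a 5-character word.
task = ["M", "P", "B", "H"] + ["K"] * 5
print(round(predict_seconds(task), 2))  # 4.35
```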
CogTool
Pros:
• Free
• Good for getting a baseline evaluation of prototypes
• Instantly accessible (not limited by participant availability or completion of the system's functionality)
• Neat concept and insight into human cognition

Cons:
• Limited in "realisticness"
• Quite confusing at first (extremely high learning curve)
• Documentation is "daunting"
• Limited usefulness

Overall Score: 6.5/10
Mixpanel
• Evaluation Type: Observational
• Description: Aggregates developer-defined event data in useful ways.
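The core of the developer-defined-events model can be sketched in a few lines. This is a self-contained toy aggregator, not Mixpanel's actual API; the event names and funnel steps are invented for the example:

```python
# Toy event aggregation: call sites emit named events, the service
# aggregates them into totals and funnels (how many distinct users
# reached each successive step).

from collections import Counter

events = [
    {"user": "a", "event": "signup"},
    {"user": "a", "event": "create_project"},
    {"user": "b", "event": "signup"},
    {"user": "c", "event": "signup"},
    {"user": "c", "event": "create_project"},
    {"user": "c", "event": "invite_teammate"},
]

# Per-event totals: the most basic aggregation.
totals = Counter(e["event"] for e in events)

def funnel(steps, events):
    """Distinct users who reached each successive step, in order."""
    users = {e["user"] for e in events if e["event"] == steps[0]}
    reached = [len(users)]
    for step in steps[1:]:
        users &= {e["user"] for e in events if e["event"] == step}
        reached.append(len(users))
    return reached

print(totals["signup"])                                                 # 3
print(funnel(["signup", "create_project", "invite_teammate"], events))  # [3, 2, 1]
```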
Mixpanel
Pros:
• Very powerful built-in analysis tools
• Good API for automated scripting
• Scalable
• Flexible to fit the needs of developers

Cons:
• High learning curve
• Expensive
• Application events = developer burden / maintainability issues
• Rate-limited (one request at a time)

Overall Score: 9.5/10
Chartbeat
• Evaluation Type: Observational
• Description: Real-time data visualization.
Chartbeat
Pros:
• Data is real-time
• Captures data that is hard to obtain via events (reading, writing, idling, active time, referrals, social integration, etc.)
• Great for site monitoring
• Really awesome visualization
• Easy to use

Cons:
• Does not scale well (financially) with huge sites
• Limited in the data it captures (have to "hack" it if you want event-like data)
• Only records "page-level" interactions
• Limited historical data access
• Not built for usability evaluation

Overall Score: 7/10
User Testing
• Evaluation Type: Participative
• Description: Watch a user complete a task on your system while thinking aloud.
User Testing
Pros:
• Probably the best method for catching usability issues
• Most thorough recording of user interaction with the system
• "Think aloud" allows data insights not otherwise attainable from user interactions alone
• Can observe certain demographics without requesting personal information in the system itself

Cons:
• Small sample size (hit or miss)
• Not easily scalable (expensive)
• Limited user availability
• Sometimes, it's painful to watch…

Overall Score: 8.5/10
Questions?