
Evaluating User Interfaces: Walkthrough Analysis

Joseph A. Konstan [email protected]

CSci 5115

October 10

- Introduction to Evaluation
- Cognitive Walkthrough
- Other Evaluation Methods


Interface Development Methodology

- Prototype and Iterate
  - keep iterating until it is good enough
  - evaluate along the way to assess
- What is Good? What is Good Enough?
  - set usability goals
    - should relate to tasks


Casual Iteration

- Find major usability problems
  - missing features
  - user confusion
  - poor interaction
- Try interface with specific tasks
  - first use designers, then move towards users
  - observe overall usage


Casual Iteration

- Remember the goal
  - don’t defend the interface
  - don’t bias the tests towards the interface
- If possible, allow user exploration
  - may even lead to capturing new tasks
- Consider alternative ways to fix a problem


Limits of Casual Iteration

- Does not indicate when to stop
- Financial trade-offs
- Justification of delay


Usability Goals and Measures

- Concrete, quantitative measures of usability (see the sketch below)
  - learning time
  - use time for specific tasks and users
  - error rates
  - measures of user satisfaction
- Comparative usability goals
  - compare with prior versions or competitors
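A usability goal is easiest to test when it is written down as a measurable target. The sketch below is one minimal way of doing that in Python, covering task time and error rate; the task name, target values, and session data are hypothetical examples, not figures from the course.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class UsabilityGoal:
    """A concrete, quantitative usability target for one task (hypothetical fields)."""
    task: str              # task the goal applies to
    max_time_s: float      # target mean completion time, in seconds
    max_error_rate: float  # target mean number of errors per attempt

@dataclass
class SessionResult:
    """One user's attempt at the task during a usability test."""
    task: str
    time_s: float
    errors: int

def goal_met(goal: UsabilityGoal, results: list[SessionResult]) -> bool:
    """True if the observed sessions satisfy the stated targets."""
    relevant = [r for r in results if r.task == goal.task]
    if not relevant:
        return False  # no data yet, so the goal cannot be claimed as met
    mean_time = mean(r.time_s for r in relevant)
    error_rate = sum(r.errors for r in relevant) / len(relevant)
    return mean_time <= goal.max_time_s and error_rate <= goal.max_error_rate

# Hypothetical goal: a first-time user books a room in under 90 seconds,
# with no more than 0.5 errors per attempt on average.
goal = UsabilityGoal(task="book a room", max_time_s=90, max_error_rate=0.5)
observed = [
    SessionResult("book a room", 72, 0),
    SessionResult("book a room", 105, 1),
    SessionResult("book a room", 80, 0),
]
print(goal_met(goal, observed))  # True: mean time ~85.7 s, ~0.33 errors per attempt
```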


Things to Watch

- Goals should be realistic
  - 100% is never realistic
- Many goals go beyond the application UI
  - training, manuals
- Testing goals should help improve the UI
  - detail, not just good/bad


Exercise: Setting Usability Goals

- In project groups, come up with 2 usability goals for your project
- Discuss the feasibility of testing these goals
  - what is needed for the test?
  - when in the process can they be tested?
  - how much effort, user preparation/training, etc.?
  - what would you learn from the test?


Interface Evaluation

- Goals of interface evaluation
  - find problems
  - find opportunities for improvement
  - determine whether the interface is “good enough”


With or Without Users

- Users are expensive and inconsistent
  - usability studies require several users
  - some users provide great information, others little
- Users are users
  - cannot be simulated perfectly
- Best choice: both


Evaluation Without Users

- Quantitative Methods
  - GOMS/keystroke analysis
  - back-of-the-envelope action analysis
- Qualitative Methods
  - expert evaluation
  - cognitive walkthrough
  - heuristic evaluation


Walkthrough Analysis

- Economical interface evaluation
  - low-fidelity prototype
  - development team
  - users optional
- Effective if
  - goal is improvement, not defense
  - some team members skilled
  - proper motivation


Cognitive Walkthrough

- Goals
  - imagine the user’s experience
  - evaluate choice-points in the interface
  - detect confusing labels or options
  - detect likely user navigation errors
- Start with a complete TCUID scenario
  - never try to “wing it” on a walkthrough


Tell a Believable Story

- How does the user accomplish the task?
- Action-by-action
- Based on user knowledge and system interface


Best Approach

- Work as a group
  - don’t partition the task
- Be highly skeptical
  - remember the goal!
- Every gap is an interface problem


Who Should Do the Walkthrough

- Designers, as an early check
- Team of designers & users
  - remember: the goal is to find problems
  - avoid making it a show
- Skilled UI people may be valuable team members

How Far Along

- Basic requirements
  - description or prototype of the interface
  - know who the users are (and their experience)
  - a task description
  - a list of actions to complete the task (scenario)
  - DO NOT try to create the action list on the fly!
- Viable once the scenario and interface sketch are completed

How to Proceed

- For each action in the sequence
  - tell the story of why the user will do it
  - ask the critical questions (a note-taking sketch follows):
    - will the user be trying to produce the effect?
    - will the user see the correct control?
    - will the user see that the control produces the desired effect?
    - will the user select a different control instead?
    - will the user understand the feedback to proceed correctly?
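One low-overhead way to keep these answers honest is to write them down for every action in the scenario. The sketch below is a possible note-taking format in Python, not a prescribed part of the cognitive walkthrough method; the action, the notes, and the failure-probability field (using the 25% steps from the exercise later) are hypothetical.

```python
from dataclasses import dataclass, field

# The critical questions from the slide, asked for every action in the scenario.
QUESTIONS = (
    "will the user be trying to produce the effect?",
    "will the user see the correct control?",
    "will the user see that the control produces the desired effect?",
    "will the user select a different control instead?",
    "will the user understand the feedback to proceed correctly?",
)

@dataclass
class ActionRecord:
    """Walkthrough notes for one action in the scenario."""
    action: str                    # e.g. "click the New Message button" (hypothetical)
    story: str                     # why we believe the user will take this action
    ok: dict[str, bool] = field(default_factory=dict)  # question -> does the story survive it?
    notes: str = ""                # anything the team wants to remember
    est_failure_prob: float = 0.0  # rough estimate, rounded to 0.25 steps

def problems(records: list[ActionRecord]) -> list[str]:
    """Every (action, question) pair where the story breaks down is an interface problem."""
    return [f"{r.action}: {q}"
            for r in records
            for q in QUESTIONS
            if not r.ok.get(q, False)]

# Hypothetical single step of a scenario.
step = ActionRecord(
    action="click the New Message button",
    story="The user wants to send a message and scans the toolbar for an entry point.",
    ok={q: True for q in QUESTIONS},
    notes="'Reply' sits right next to 'New Message'; some users may pick it by mistake.",
    est_failure_prob=0.25,
)
step.ok[QUESTIONS[3]] = False  # the nearby 'Reply' button could be selected instead
print(problems([step]))
# ['click the New Message button: will the user select a different control instead?']
```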


Walkthroughs are not Perfect

- They won’t find every problem
  - limited by nature
    - assumes new users who know what task they need to accomplish
    - biased towards the correct action sequence
  - limited in implementation
    - hard for evaluators to shed their expertise
- A useful tool in conjunction with others


Exercise: Cognitive Walkthrough Analysis

- In non-project groups of 3-5
- Users and task to be announced
- Scenario developed jointly
- Perform the walkthrough
  - identify problems
  - estimate error probabilities (25% intervals)
- Remember who your users are!


GOMS/Keystroke Analysis

- Formal action analysis
  - accurately predict task completion time for skilled users
- Break the task into tiny steps
  - keystroke, mouse movement, refocus gaze
  - retrieve an item from long-term memory
- Look up average step times (see the sketch below)
  - tables from large experiments
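As a concrete illustration, a keystroke-level estimate is just the sum of per-operator times looked up from published tables. The operator values below are commonly cited approximations for the keystroke-level model (keying, pointing, homing, mental preparation); both the values and the example action sequence are illustrative assumptions, not numbers from the slides.

```python
# Approximate keystroke-level operator times, in seconds. These are commonly
# cited averages; a real analysis should cite the table it actually uses.
OPERATOR_TIME_S = {
    "K": 0.28,  # press a key or button (average typist)
    "P": 1.10,  # point with the mouse at a target on the screen
    "H": 0.40,  # home the hands between keyboard and mouse
    "M": 1.35,  # mental preparation before a chunk of actions
}

def klm_estimate(sequence: str) -> float:
    """Sum operator times for an encoded sequence such as 'MHPKHMKKKK'."""
    return sum(OPERATOR_TIME_S[op] for op in sequence)

# Hypothetical task: open a dialog with the mouse, then type a 4-letter code.
#   M  think about the step          H  move hand to the mouse
#   P  point at the menu item        K  click it
#   H  move hand back to keyboard    M  recall the code, then K x 4 keystrokes
sequence = "MHPK" + "HM" + "KKKK"
print(f"Estimated skilled-user time: {klm_estimate(sequence):.2f} s")  # about 6 s
```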


GOMS/Keystroke Analysis

- Primary utility: repetitive tasks
  - e.g., telephone operators
  - benefit: can be very accurate (within 20%)
  - may identify bottlenecks
- Difficulties
  - challenging to decompose tasks accurately
  - long, laborious process
  - not useful with non-experts


Back-of-the-Envelope Action Analysis

- Coarse-grain
  - list basic actions (select menu item)
  - each action is at least 2-3 seconds (see the quick tally below)
  - what must be learned/remembered?
  - what can be done easily?
  - documentation/training?
- Goal is to find major problems
  - example: a 1950s 35mm camera
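At the back-of-the-envelope level no tables are needed: count the basic actions and allow the 2-3 seconds per action mentioned above. A tiny sketch, with a hypothetical action list:

```python
# Hypothetical action list for a task; 2-3 seconds per action is the coarse
# rule of thumb from the slide, not a measured value.
actions = [
    "open the File menu",
    "select Export...",
    "choose the PDF format",
    "confirm the file name",
    "click Save",
]
low, high = 2 * len(actions), 3 * len(actions)
print(f"{len(actions)} actions -> roughly {low}-{high} seconds for a practiced user")
# 5 actions -> roughly 10-15 seconds
```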


Expert Evaluation

- Usability specialists are very valuable
  - double-specialists (expert in both usability and your application domain) are even better
- An inexpensive way to get a lot of feedback
- Be sure the expert is qualified in your area


Looking Ahead

- Next week: Heuristic Evaluation
- Walkthroughs Due
  - “raw” notes
    - notes from each step of the walkthrough
    - copy of the prototype used, with markups
    - copy of the scenarios used (note changes or fixes)
  - processed results
    - 1-2 pages of issues identified; solutions not needed