SIG-600: Usability Test of YFG-PBI


HCI460: Week 4 Lecture
September 30, 2009
Outline
 Study logistics
 Facility and equipment
 Creating a moderator’s guide
 Guest lecture on global UX research
– Bob Schumacher from User Centric, Inc.
 Project 1b recap
 Assignment for next week (Project 2b)
Study Logistics
Study Logistics
Team Member Selection
 Not always an N=1 effort
 Skills assessment
 Project nuances
– Some projects fit some team members better than others
– Need to think on one's feet
– High-profile projects
 Estimate time commitment
– Prep for moderators
– Note takers
– Facility reservation (prep + data collection)
• Lab technician time
– Could even have interpreters, etc.
– Prototype technicians
Study Logistics
Schedule
 Sessions per day
– Often, day 1 schedule is different from subsequent test days
 Breaks
– Moderator (not really)
– Review (definitely)
– Participant delays
 Lunch
– Timing for moderator
 Attendees can plan

Sample day schedule:

Start      Finish     Activity
9:00 AM    10:00 AM   Session 1
10:00 AM   10:15 AM   Review
10:15 AM   11:15 AM   Session 2
11:15 AM   12:15 PM   Session 3
12:15 PM   1:00 PM    Lunch
1:00 PM    2:00 PM    Session 4
2:00 PM    3:00 PM    Session 5
3:00 PM    3:15 PM    Review
3:15 PM    4:15 PM    Session 6
4:15 PM    5:15 PM    Session 7
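To make the scheduling arithmetic concrete, here is a minimal Python sketch that lays out a day like the sample above. The function name, parameters, and default values are illustrative assumptions, not part of the lecture.

```python
from datetime import datetime, timedelta

def build_schedule(start="9:00 AM", sessions=7, session_min=60,
                   review_after=(1, 5), review_min=15,
                   lunch_after=3, lunch_min=45):
    """Lay out one test day: sessions, review breaks, and lunch."""
    t = datetime.strptime(start, "%I:%M %p")
    rows = []

    def add(activity, minutes):
        nonlocal t
        end = t + timedelta(minutes=minutes)
        rows.append((t.strftime("%I:%M %p"), end.strftime("%I:%M %p"), activity))
        t = end

    for i in range(1, sessions + 1):
        add(f"Session {i}", session_min)
        if i in review_after:          # short review breaks after selected sessions
            add("Review", review_min)
        if i == lunch_after:           # one lunch block mid-day
            add("Lunch", lunch_min)
    return rows

for start, finish, activity in build_schedule():
    print(f"{start} - {finish}  {activity}")
```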
Study Logistics
A Moment on Attendees
 For high-profile projects, senior executives may attend sessions
 What are the considerations?
– Expect delays and fire drills
– Stakeholders will be nervous
– Maintain poise and control
– Many like to attend on Day 1, Participant 1
• Be smart and proactive with schedule
 What are the risks?
– Single sessions
– Remedies:
• Floaters
• Simultaneous sessions
Facility and Equipment
Facility and Equipment
Test Room Configuration
 Assuming one-on-one sessions, test rooms can be configured:
– Side-by-side
– Separated (moderator and participant are separated by a wall)
– Remote moderated sessions
 Pros/Cons?
 Does the artifact tested matter?
– Consider devices or cockpit simulators
Facility and Equipment
Facility
 One-way mirror between test and observation room
– Is it really necessary?
 Screen capture
– Morae or equivalent
– Other products
 Physical devices
Facility and Equipment
UC Lab Cameras, 2005–2009
[Photos: the UC lab camera in 2005, 2006, 2007, 2008, and 2009]
Facility and Equipment
Using Internal Facilities
 Key factors to consider?
– Security
– Escort
– Reception
– Waiting area
– Compensation
– Technical failures
Facility and Equipment
Mentally Prepare for Problems
 What you can control is ~50%
– Technology
– Participants
• No shows
• Bad participants
– Client changes of direction
Creating a Moderator’s Guide
Creating a Moderator’s Guide
Iterative Process
 Moderator’s guides should be developed through an iterative process
– Benefits?
 First draft based on test plan
– Check guide against objectives
– Assess time
 Review with stakeholders
 Iterate guide
 Pilot test
 Inform stakeholders
 Finalize
Creating a Moderator’s Guide
Moderator's Guide: Elements
 Consent forms
 Introduction
 Warm-up questions
 Tasks
– Besides core use cases, some projects can use a first-impressions task
• Out-of-the-Box
• Devices
– Number of tasks varies
 Wrap-up questions
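As a rough way to apply the "assess time" step from the previous slide to these elements, here is a minimal Python sketch. The section list follows the elements above, but the minute estimates and the 60-minute session length are assumptions.

```python
# Hypothetical per-section time estimates for a 60-minute session;
# the sections mirror the list above, the minutes are assumptions.
guide = [
    ("Consent form", 5),
    ("Introduction", 5),
    ("Warm-up questions", 5),
    ("First-impressions task", 5),
    ("Core tasks (4 x 8 min)", 32),
    ("Wrap-up questions", 8),
]

session_min = 60
planned = sum(minutes for _, minutes in guide)
print(f"Planned {planned} min of a {session_min}-min session")
if planned > session_min:
    print("Over budget: prioritize tasks or move self-report questions to the waiting area")
```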
Creating a Moderator’s Guide
Consent Forms
 What elements should be in a consent form?
– Brief (1-2 sentence) explanation of overall study
– Right to leave at any time and still receive compensation
– Inform about videotaping (if applicable)
– Non-disclosure agreement coupled with exact compensation
– Information will remain anonymous
– Signature, date
 Any other special considerations/situations?
– Include in form
 Under 18 years
– Parental consent
Creating a Moderator’s Guide
Introduction
 Rubin stresses reading the script verbatim rather than memorizing it
– Does not mean monotone dictation
– Eye contact
 Key elements:
– Introduce yourself
– Explain why the participant is here
– Describe the testing environment (e.g., video recording)
– Describe artifact to be tested
– Remind participant that he/she is not being tested
 Other elements?
– Put participant at ease
• Involvement in design / Colleague in other room
– Artifact nuances (e.g., prototype testing)
Creating a Moderator’s Guide
Warm-up Questions
 Confirms the participant screener (can also be partially filled out in waiting area)
 Clarifies some key qualifiers
 Establishes context that can be used in session
 What do you do when participant does not fit?
Creating a Moderator’s Guide
Tasks
 State objective (initial drafts)
– Why are we even asking the question?
 Make tasks as realistic as possible
 Give the participant motivation and context for performing the task
 Choose task order wisely
 Match tasks to participant experience
 Avoid jargon or cues
 Provide a sufficient amount of “work” to be done
– Even if the task will be stopped due to prototype limitations or moderator intervention
Creating a Moderator’s Guide
First Impressions Task
 Benefits
– Initial perception
• Usable / not usable
• Like / dislike
– Assess visual affordances
 Cons
– Users spend more time than “normal” inspecting the interface
Creating a Moderator’s Guide
Who Should Read the Tasks?
 Moderator reads task aloud?
 Participant reads task aloud?
Creating a Moderator’s Guide
When Should “Help” Be Offered?
 Establish criteria for when help is administered
– State the type of help (consistency matters even more here than in the introduction)
– Count the number of assists
– Difficulties of this practice?
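One way to make such criteria explicit is to script them. The sketch below is a minimal illustration in Python; the stuck-time threshold, the three assist levels, and the prompt wording are assumptions, not criteria given in the lecture.

```python
# A hypothetical assist policy; thresholds and scripted prompts are
# assumptions for illustration, not criteria from the lecture.
MAX_STUCK_MINUTES = 3  # time with no progress before the first prompt

ASSIST_LEVELS = [
    "Level 1: 'What are you trying to do right now?'",
    "Level 2: point toward the relevant area of the interface",
    "Level 3: state the step needed to continue",
]

def next_assist(assists_given, minutes_stuck, asked_for_help):
    """Return the scripted assist to give now, or None if no help is due."""
    if assists_given >= len(ASSIST_LEVELS):
        return None  # stop the task rather than keep coaching
    if asked_for_help or minutes_stuck >= MAX_STUCK_MINUTES:
        return ASSIST_LEVELS[assists_given]
    return None

print(next_assist(assists_given=0, minutes_stuck=3.5, asked_for_help=False))
```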
Creating a Moderator’s Guide
Training / Learning
 How much training should be provided?
 Objectives?
 Written material?
– Actual guides?
– Test WIP
– Competitive studies require a leveling of the playing field
 Verbal teaching or demonstration?
Creating a Moderator’s Guide
Tradeoffs
 What do you do when the guide has too much “stuff” or takes too long?
– Inform stakeholders
– Prioritize tasks
– Assess pre- and post-task questions/metrics
– Recognize that, as the moderator, you will get faster after the pilot
– Consider moving some self-report questions
• Away from warm-up questions and into waiting area
• Away from wrap-up in session and into waiting area
• But, be aware of data loss
Creating a Moderator’s Guide
Data Collection
 Should data collection be planned in conjunction with creating the moderator’s guide?
 What are techniques to collect data?
– Write on moderator’s guide
– Blank paper
– Checkboxes to questions vs. free form
– Path
 How does one collect quotes?
 Interaction with note-taker, if applicable
 Is there a “right” answer?
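Here is a minimal Python sketch of one possible data-collection structure that combines checkbox-style fields, the navigation path, and quotes for each task. The record layout and field names are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field

@dataclass
class TaskRecord:
    """One participant's data for one task; field names are illustrative."""
    participant: str
    task: str
    completed: bool = False
    assists: int = 0                             # number of moderator "helps" given
    path: list = field(default_factory=list)     # screens/pages visited, in order
    quotes: list = field(default_factory=list)   # verbatim participant quotes
    notes: str = ""                              # free-form observations

record = TaskRecord(participant="P01", task="Task 1")
record.path.append("Home > Search")
record.assists += 1
record.quotes.append("I expected this to be under Settings.")
print(record)
```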
Project 1b Recap
Project 1b Recap
Grading Sheet
 EXECUTIVE SUMMARY AND INTRODUCTION (__ out of 4 points)
Criteria (each scored Yes / No, with comments):
– Executive summary summarizes the contents of the report
– Introduction appropriately describes the following:
• Product evaluated
• Objectives
• Target users
• Context of use
• Evaluation method
Project 1b Recap
Grading Sheet
 FINDINGS (__ out of 5 points)
Criteria (each scored Yes / No, with comments):
– Correct evaluation method (i.e., expert evaluation) was used to generate findings
– Report contains a sufficient number of findings
– Findings are organized in a way that makes sense instead of being listed in a random order
– Positive findings are included as well as usability issues
– It is clear to which part of the interface each finding corresponds
– Severity ratings are easy to understand
– Appropriate severity ratings accompany each usability issue
– Descriptions of the findings are appropriate, precise, and concise
– Each finding is justified (i.e., it is obvious to a non-expert why the issue may cause users problems)
Project 1b Recap
Grading Sheet
 RECOMMENDATIONS (__ out of 4 points)
Criteria (each scored Yes / No, with comments):
– Recommendations accompany the findings
– Recommendations appropriately address the findings
– Recommendations are specific and actionable
Project 1b Recap
Grading Sheet
 QUALITY OF PRESENTATION (__ out of 2 points)
Criteria (each scored Yes / No, with comments):
– Report is well structured, well laid out, visually pleasing, and easy to read
– Language used throughout the report is professional
– Report is free of grammatical and spelling errors
Project 1b Recap
Points (out of 15)
[Bar chart: Project 1b scores by group, out of 15 points]
Project 1b Recap
Groups
 Group 1: Dietz, Dulski, Dzienisowicz, Remington
 Group 2: Birdseye, Dash, Gamboa, Welense
 Group 3: Elliott, Petlick
 Group 4: Berberian, Devlin, Shah, Sienkiewicz
 Group 5: Albarracin, Ginez, Schwarz
 Group 6: Epps, Jones, Lund
 Group 7: Barbera, Schulte, Schwantes, Young
 Group 8: Canady, Doshi, Roberts
 Group 9: Garcia, Goldberg, Haines, Wickenkamp
 Group 10: Diemer, Komosa, Ranguelov, Young
 Group 11: Cheng, Freeman, Taylor
Assignment for Next Week
Assignment for Next Week
Next Week…
 Project 2b
– Develop the screener.
– Develop the moderator’s guide.
• Assume 20-minute usability testing sessions.
 Read Rubin, Chapter 9: Conduct the Test Sessions.