Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests, Second Edition

Contents

Acknowledgments xi

Foreword xxix

Preface to the Second Edition xxxiii

Part One Usability Testing: An Overview
Chapter 1 What Makes Something Usable? 3
What Do We Mean by "Usable"? 4
What Makes Something Less Usable? 6
Five Reasons Why Products Are Hard to Use 6
Reason 1: Development Focuses on the Machine or System 7
Reason 2: Target Audiences Expand and Adapt 8
Reason 3: Designing Usable Products Is Difficult 9
Reason 4: Team Specialists Don't Always Work in Integrated Ways 9
Reason 5: Design and Implementation Don't Always Match 11
What Makes Products More Usable? 12
An Early Focus on Users and Tasks 13
Evaluation and Measurement of Product Usage 13
Iterative Design and Testing 14
Attributes of Organizations That Practice UCD 14
Phases That Include User Input 14
A Multidisciplinary Team Approach 14
Concerned, Enlightened Management 15
A "Learn as You Go" Perspective 15
Defined Usability Goals and Objectives 16
What Are Techniques for Building in Usability? 16
Ethnographic Research 16
Participatory Design 17
Focus Group Research 17
Surveys 17
Walk-Throughs 18
Open and Closed Card Sorting 18
Paper Prototyping 18
Expert or Heuristic Evaluations 19
Usability Testing 19
Follow-Up Studies 20

Chapter 2 What Is Usability Testing? 21
Why Test? Goals of Testing 21
Informing Design 22
Eliminating Design Problems and Frustration 22
Improving Profitability 22
Basics of the Methodology 23
Basic Elements of Usability Testing 25
Limitations of Testing 25

Chapter 3 When Should You Test? 27
Our Types of Tests: An Overview 27
Exploratory or Formative Study 29
When 29
Objective 29
Overview of the Methodology 30
Example of Exploratory Study 32
Assessment or Summative Test 34
When 34
Objective 34
Overview of the Methodology 35
Validation or Verification Test 35
When 35
Objective 35
Overview of the Methodology 36
Comparison Test 37
When 37
Objective 37
Overview of the Methodology 38
Iterative Testing: Test Types through the Lifecycle 39
Test 1: Exploratory/Comparison Test 39
The Situation 39
Main Research Questions 40
Brief Summary of Outcome 41
Test 2: Assessment Test 41
The Situation 41
Main Test Objectives 41
Brief Summary of Test Outcome 42
Test 3: Verification Test 42
The Situation 42
Test Objectives 43
Brief Summary of Test Outcome 43

Chapter 4 Skills for Test Moderators 45
Who Should Moderate? 45
Human Factors Specialist 46
Marketing Specialist 46
Technical Communicator 47
Rotating Team Members 47
External Consultant 47
Characteristics of a Good Test Moderator 48
Grounding in the Basics of User-Centered Design 48
Quick Learner 48
Instant Rapport with Participants 49
Excellent Memory 49
Good Listener 49
Comfortable with Ambiguity 50
Flexibility 50
Long Attention Span 51
Empathic "People Person" 51
"Big Picture" Thinker 51
Good Communicator 52
Good Organizer and Coordinator 52
Getting the Most out of Your Participants 52
Choose the Right Format 53
Sit-By Sessions versus Observing from Elsewhere 53
"Think-Aloud" Advantages and Disadvantages 54
Retrospective Review 54
Give Participants Time to Work through Hindrances 55
Offer Appropriate Encouragement 55
Troubleshooting Typical Moderating Problems 56
Leading Rather than Enabling 57
Too Involved with the Act of Data Collection 57
Acting Too Knowledgeable 57
Too Rigid with the Test Plan 58
Not Relating Well to Each Participant 58
Jumping to Conclusions 58
How to Improve Your Session-Moderating Skills 58
Learn the Basic Principles of Human Factors/Ergonomics 59
Learn from Watching Others 59
Watch Yourself on Tape 59
Work with a Mentor 59
Practice Moderating Sessions 60
Learn to Meditate 60
Practice "Bare Attention" 61

Part Two The Process for Conducting a Test
Chapter 5 Develop the Test Plan 65
Why Create a Test Plan? 65
It Serves as a Blueprint for the Test 66
It Serves as the Main Communication Vehicle 66
It Defines or Implies Required Resources 66
It Provides a Focal Point for the Test and a Milestone 66
The Parts of a Test Plan 67
Review the Purpose and Goals of the Test 67
When Not to Test 68
Good Reasons to Test 69
Communicate Research Questions 69
Summarize Participant Characteristics 72
Describe the Method 73
Independent Groups Design or Between-Subjects Design 75
Within-Subjects Design 75
Testing Multiple Product Versions 76
Testing Multiple User Groups 77
List the Tasks 79
Parts of a Task for the Test Plan 79
Tips for Developing the Task List 82
Example Task: Navigation Tab on a Web Site 83
Ways to Prioritize Tasks 85
Describe the Test Environment, Equipment, and Logistics 87
Explain What the Moderator Will Do 87
List the Data You Will Collect 88
Sample Performance Measures 88
Qualitative Data 90
Sample Preference Measures 90
Describe How the Results Will Be Reported 90
Sample Test Plan 91

Chapter 6 Set Up a Testing Environment 93
Decide on a Location and Space 94
In a Lab or at the User's Site? 94
Test in Multiple Geographic Locations? 96
Arranging Sessions at a User's Site 98
Minimalist Portable Test Lab 100
Setting up a Permanent or Fixed Test Lab 101
Simple Single-Room Setup 101
Modified Single-Room Setup 103
Large Single-Room Setup 105
Electronic Observation Room Setup 107
Classic Testing Laboratory Setup 108
Recommended Testing Environment: Minimalist Portable Lab 110
Gather and Check Equipment, Artifacts, and Tools 111
Basic Equipment, Tools, and Props 111
Gathering Biometric Data 112
Identify Co-Researchers, Assistants, and Observers 112
Data Gatherer/Note Taker 112
Timekeeper 113
Product/Technical Expert(s) 113
Additional Testing Roles 113
Test Observers 113

Chapter 7 Find and Select Participants 115
Characterize Users 115
Visualize the Test Participant 116
Differentiate between Purchaser and End User 116
Look for Information about Users 117
Requirements and Specification Documents 117
Structured Analyses or Marketing Studies 118
Product Manager (R&D) 118
Product Manager (Marketing) 118
Competitive Benchmarking and Analysis Group 119
Define the Criteria for Each User Group 119
Define Expertise 119
Specify Requirements and Classifiers for Selection 121
Document the User Profile 122
Divide the User Profile into Distinct Categories 124
Consider a Matrix Test Design 125
Determine the Number of Participants to Test 125
Write the Screening Questionnaire 126
Review the Profile to Understand Users' Backgrounds 127
Identify Specific Selection Criteria 127
Formulate Screening Questions 128
Organize the Questions in a Specific Order 129
Develop a Format for Easy Flow through the Questionnaire 130
Test the Questionnaire on Colleagues and Revise It 131
Consider Creating an "Answer Sheet" 131
Find Sources of Participants 131
Internal Participants 132
Qualified Friends and Family 134
Web Site Sign-Up 134
Existing Customers from In-House Lists 135
Existing Customers through Sales Representatives 136
User Groups or Clubs, Churches, or Other Community Groups 136
Societies and Associations 137
Referrals from Personal Networks, Coworkers, and Other Participants 137
Craigslist 138
College Campuses 139
Market Research Firms or Recruiting Specialists 140
Employment Agencies 141
Newspaper Advertisements 142
Screen and Select Participants 143
Screening Considerations 143
Use the Questionnaire or Open-Ended Interview Questions? 143
Complete the Screener Always, or Only When Fully Qualified? 144
Conduct Screening Interviews 145
Inform the Potential Participant Who You Are 145
Explain Why You Are Calling and How You Got the Contact Information 145
Go through the Questions in the Questionnaire 145
As You Eliminate or Accept People, Mark Them Off on Your List 146
Include a Few Least Competent Users in Every Testing Sample 146
Beware of Inadvertently Testing Only the "Best" People 147
Expect to Make Tradeoffs 148
Schedule and Confirm Participants 148
Compensate Participants 150
Protect Participants' Privacy and Personal Information 151

Chapter 8 Prepare Test Materials 153
Guidelines for Observers 154
Orientation Script 155
Keep the Tone of the Script Professional, but Friendly 156
Keep the Speech Short 156
Plan to Read the Script to Each Participant Verbatim 157
Write the Orientation Script Out 158
Make Introductions 159
Offer Refreshments 159
Explain Why the Participant Is Here 159
Describe the Testing Setup 160
Explain What Is Expected of the Participant 160
Assure the Participant That He or She Is Not Being Tested 161
Explain Any Unusual Requirements 161
Mention That It Is Okay to Ask Questions at Any Time 161
Ask for Any Questions 161
Refer to Any Forms That Need to Be Completed and Pass Them Out 161
Background Questionnaire 162
Focus on Characteristics That May Influence Performance 163
Make the Questionnaire Easy to Fill Out and Compile 163
Test the Questionnaire 163
Decide How to Administer the Questionnaire 163
Data Collection Tools 165
Review the Research Question(s) Outlined in Your Test Plan 167
Decide What Type of Information to Collect 167
Select a Data Collection Method 168
Fully Automated Data Loggers 168
Online Data Collection 169
User-Generated Data Collection 169
Manual Data Collection 170
Other Data Collection Methods 170
Nondisclosures, Consent Forms, and Recording Waivers 173
Pre-Test Questionnaires and Interviews 174
Discover Attitudes and First Impressions 175
Learn about Whether Participants Value the Product 177
Qualify Participants for Inclusion into One Test Group or Another 179
Establish the Participant's Prerequisite Knowledge Prior to Using the Product 181
Prototypes or Products to Test 181
Task Scenarios 182
Provide Realistic Scenarios, Complete with Motivations to Perform 183
Sequence the Task Scenarios in Order 183
Match the Task Scenarios to the Experience of the Participants 184
Avoid Using Jargon and Cues 184
Try to Provide a Substantial Amount of Work in Each Scenario 184
Give Participants the Tasks to Do 185
Reading Task Scenarios to the Participants 185
Letting the Participants Read Task Scenarios Themselves 186
Optional Training Materials 187
Ensure Minimum Expertise 187
Get a View of the User after Experiencing the Product 188
You Want to Test Features for Advanced Users 189
What Are the Benefits of Prerequisite Training? 190
You Can Conduct a More Comprehensive, Challenging Usability Test 190
You Can Test Functionality That Might Otherwise Get Overlooked During a Test 190
Developing the Training Forces You to Understand How Someone Learns to Use Your Product 191
Some Common Questions about Prerequisite Training 191
Post-Test Questionnaire 192
Use the Research Question(s) from the Test Plan as the Basis for Your Content 193
Develop Questionnaires That Will Be Distributed Either during or after a Session 193
Ask Questions Related to That Which You Cannot Directly Observe 193
Develop the Basic Areas and Topics You Want to Cover 195
Design the Questions and Responses for Simplicity and Brevity 196
Use the Pilot Test to Refine the Questionnaire 196
Common Question Formats 197
Likert Scales 197
Semantic Differentials 197
Fill-In Questions 198
Checkbox Questions 198
Branching Questions 198
Debriefing Guide 199

Chapter 9 Conduct the Test Sessions 201
Guidelines for Moderating Test Sessions 202
Moderate the Session Impartially 202
Be Aware of the Effects of Your Voice and Body Language 203
Treat Each New Participant as an Individual 203
If Appropriate, Use the "Thinking Aloud" Technique 204
Advantages of the "Thinking Aloud" Technique 204
Disadvantages of the "Thinking Aloud" Technique 205
How to Enhance the "Thinking Aloud" Technique 205
Probe and Interact with the Participant as Appropriate 206
Stay Objective, But Keep the Tone Relaxed 209
Don't "Rescue" Participants When They Struggle 209
If You Make a Mistake, Continue On 210
Ensure That Participants Are Finished Before Going On 210
Assist the Participants Only as a Last Resort 211
When to Assist 211
How to Assist 212
Checklists for Getting Ready 213
Checklist 1: A Week or So Before the Test 214
Take the Test Yourself 214
Conduct a Pilot Test 215
Revise the Product 215
Check Out All the Equipment and the Testing Environment 216
Request a Temporary "Freeze" on Development 216
Checklist 2: One Day Before the Test 216
Check that the Video Equipment is Set Up and Ready 216
Check that the Product, if Software or Hardware, is Working 217
Assemble All Written Test Materials 217
Check on the Status of Your Participants 217
Double-Check the Test Environment and Equipment 217
Checklist 3: The Day of the Test 217
Prepare Yourself Mentally 218
Greet the Participant 219
Have the Participant Fill Out and Sign Any Preliminary Documents 220
Read the Orientation Script and Set the Stage 220
Have the Participant Fill Out Any Pretest Questionnaires 220
Move to the Testing Area and Prepare to Test 220
Start Recordings 221
Set Decorum for Observers in the Room 221
Provide Any Prerequisite Training if Your Test Plan Includes It 223
Either Distribute or Read the Written Task Scenario(s) to the Participant 224
Record Start Time, Observe the Participant, and Collect All Critical Data 224
Have the Participant Complete All Posttest Questionnaires 224
Debrief the Participant 224
Close the Session 224
Organize Data Collection and Observation Sheets 225
Debrief with Observers 225
Provide Adequate Time Between Test Sessions 225
Prepare for the Next Participant 225
When to Intervene 225
When to Deviate from the Test Plan 226
What Not to Say to Participants 227

Chapter 10 Debrief the Participant and Observers 229
Why Review with Participants and Observers? 229
Techniques for Reviewing with Participants 230
Where to Hold the Participant Debriefing Session 231
Basic Debriefing Guidelines 231
Advanced Debriefing Guidelines and Techniques 235
"Replay the Test" Technique 235
The Manual Method 235
The Video Method 236
Audio Record the Debriefing Session 236
Reviewing Alternate Designs 236
"What Did You Remember?" Technique 236
"Devil's Advocate" Technique 238
How to Implement the "Devil's Advocate" Technique 238
Example of the "Devil's Advocate" Technique 239
Reviewing and Reaching Consensus with Observers 241
Why Review with Observers? 241
Between Sessions 241
At the End of the Study 243

Chapter 11 Analyze Data and Observations 245
Compile Data 246
Begin Compiling Data as You Test 247
Organize Raw Data 248
Summarize Data 249
Summarize Performance Data 249
Task Accuracy 249
Task Timings 250
Summarize Preference Data 254
Compile and Summarize Other Measures 256
Summarize Scores by Group or Version 256
Analyze Data 258
Identify Tasks That Did Not Meet the Success Criterion 258
Identify User Errors and Difficulties 260
Conduct a Source of Error Analysis 260
Prioritize Problems 261
Analyze Differences between Groups or Product Versions 264
Using Inferential Statistics 265

Chapter 12 Report Findings and Recommendations 269
What Is a Finding? 269
Shape the Findings 269
Draft the Report 271
Why Write a Report? 273
Organize the Report 273
Executive Summary 274
Method 274
Results 275
Findings and Recommendations (Discussion) 275
Develop Recommendations 277
Focus on Solutions That Will Have the Widest Impact 278
Ignore Political Considerations for the First Draft 280
Provide Both Short-Term and Long-Term Recommendations 280
Indicate Areas Where Further Research Is Required 281
Be Thorough 281
Make Supporting Material Available to Reviewers 282
Refine the Report Format 283
Create a Highlights Video or Presentation 283
Cautions about Highlights 284
Steps for Producing a Highlights Video 285
Consider the Points You Want to Make 286
Set up a Spreadsheet to Plan and Document the Video 286
Pick the Clips 286
Review Timing and Organization 287
Draft Titles and Captions 288
Review and Wrap 288

Part Three Advanced Techniques

Chapter 13 Variations on the Basic Method 293
Who? Testing with Special Populations 293
People Who Have Disabilities 293
Scheduling and Reminding 295
During the Session 295
Older Adults 295
Scheduling and Reminding 296
During the Session 297
Children 298
Scheduling and Reminding 298
During the Session 299
What: Prototypes versus Real Products 299
Paper and Other Low-Fi Prototypes 300
Clickable or Usable Prototypes 301
How? Techniques for Monitored Tests 302
Flexible Scripting 303
What You Get 303
How to Use It 303
Gradual Disclosure or Graduated Prompting 304
What You Get 304
How to Use It 305
Co-Discovery (Two Participants at a Time) 306
What You Get 306
How to Use It 307
Alpha or Beta Testing with Favored Clients 307
What You Get 307
How to Use It 308
Play Tests 308
What You Get 309
How to Use It 309
Where? Testing Outside a Lab 309
Remote Testing 310
What You Get 310
How to Use It 310
Automated Testing 311
What You Get 311
How to Use It 311
Testing In-Home or On-Site 312
What You Get 312
How to Use It 312
Self-Reporting (Surveys, Diary Studies) 313
What You Get 313
How to Use It 313

Chapter 14 Expanding from Usability Testing to Designing the User Experience 315
Stealth Mode: Establish Value 316
Choose the First Project Carefully 317
Begin Your Education 317
Start Slowly and Conservatively, Get Buy-In 320
Volunteer Your Services 321
Create a Strategy and Business Case 321
Build on Successes 322
Set Up Long-Term Relationships 322
Sell Yourself and What You Are Doing 323
Strategize: Choose Your Battles Carefully 323
Formalize Processes and Practices 323
Establish a Central Residency for User-Centered Design 324
Add Usability-Related Activities to the Product Life Cycle 325
Educate Others within Your Organization 325
Identify and Cultivate Champions 327
Publicize the Usability Success Stories 327
Link Usability to Economic Benefits 327
Expand UCD throughout the Organization 328
Pursue More Formal Educational Opportunities 329
Standardize Participant Recruitment Policies and Procedures 329
Align Closely with Market Research and Industrial Design 330
Evaluate Product Usability in the Field after Product Release 330
Evaluate the Value of Your Usability Engineering Efforts 330
Develop Design Standards 331
Focus Your Efforts Early in the Product Life Cycle 331
Create User Profiles, Personas, and Scenarios 331
Afterword 333
Index 335