Measuring Usability
Quantitative Usability, Statistics & Six Sigma by Jeff Sauro

10 Things That Can Go Wrong in a Usability Test

Jeff Sauro • February 18, 2014

A lot of planning goes into a usability test.

Part of good planning means being prepared for the many things that can go wrong.

Here are the ten most common problems we encounter in usability testing and some ideas for how to avoid or manage them when they inevitably occur.
  1. Users don't show up: No-shows are a fact of life in usability testing. We email, confirm by phone, and send reminder emails and follow-up calls, and we still have users who don't show up. On average we anticipate about a 10% no-show rate, regardless of the honorarium. For that reason we usually recruit 10% more participants than we need and keep the extras on a stand-by list.
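The 10% overage rule above can be sketched as a quick calculation. This is a minimal illustration, not a tool from the article; `recruits_needed` is a hypothetical helper name.

```python
import math

def recruits_needed(sessions, no_show_rate=0.10):
    """Pad the recruit count to cover expected no-shows.

    Follows the rule of thumb above: take the number of sessions
    you need, add enough extra participants (rounded up) to absorb
    the expected no-show rate, and keep the extras on a stand-by list.
    """
    overage = math.ceil(sessions * no_show_rate)
    return sessions + overage

# For 10 planned sessions at a 10% no-show rate, recruit 11.
print(recruits_needed(10))        # 11
print(recruits_needed(20, 0.15))  # 23
```

For larger studies you might instead recruit `ceil(sessions / (1 - rate))` so the *expected* number of shows matches your target, but the simple percentage pad matches the practice described above.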

  2. Facilitator gets sick: We're all human; even amazing usability facilitators can get sick or just lose their voice. Have a backup facilitator (e.g., the note taker takes over) or build a little wiggle room into your testing schedule so you can reschedule when everyone is healthy.

  3. Internet goes down: It's difficult to run a usability test without the internet, whether it's to access a website, host a remote meeting, or record data in web-based software during the session. It's amazing how many freak outages and slow connections we've seen while testing. One of the only ways to minimize the impact of outages is redundant internet access, ideally from two separate providers, although that can be expensive and isn't feasible at every location.

  4. Awkward moments: A good facilitator has patience and empathy, and a test session can often feel a lot like a session with a therapist. Sometimes people just need a sympathetic ear, and we've heard quite a few stories about ex-wives, ex-husbands, ex-boyfriends, and so on while testing. The best thing to do is keep it professional and steer the participant back on track without being rude.

  5. Distractions: Remote moderated tests are great for recruiting a geographically representative sample and letting users work in the comfort of a familiar environment. But as anyone who works from home knows, there's often a lot going on. During remote moderated tests we'll often hear dogs barking, kids crying, doorbells ringing, or spouses wondering who people are talking to. These distractions are both a blessing and a curse: they create more realistic conditions than a secure lab setting, but too many of them can lead to longer sessions and misleading results.

  6. Users are quiet: For the think-aloud protocol to be effective, you need users to, well, think aloud. It's hard to know ahead of time who will be able to articulate their thoughts. On occasion you'll get users who say very little and offer little insight, and no amount of probing will get a taciturn participant to talk more.

  7. Software stops working: Per Murphy's law, you can almost always count on the product failing during testing. This is especially true when working with development environments, testing servers, or prototypes. Have backups and the technical support team's phone numbers on speed dial, or better yet, have them observe the sessions too.

  8. The study takes too long: Even after pretesting tasks and trying not to cram too much in, you'll often only get through a portion of your study. The cause can be a slow participant, a slow connection, a buggy application, or simply too many tasks. You have to balance running long (with the participant's permission) against interfering with upcoming sessions. Randomly selecting a subset of tasks for each participant is often a good compromise for keeping session length down.
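Randomly selecting a task subset per participant can be done in a few lines. This is a sketch with placeholder task names; seeding on a participant ID (an assumption, not from the article) keeps each selection reproducible so you can reconstruct which tasks each participant saw.

```python
import random

# Full task list from the study plan (placeholder names).
tasks = ["find pricing", "create account", "update profile",
         "download report", "contact support", "cancel order"]

def pick_task_subset(all_tasks, k, seed=None):
    """Randomly select k tasks for one session.

    Using a per-participant seed makes the draw repeatable,
    so the same participant ID always yields the same subset.
    """
    rng = random.Random(seed)
    return rng.sample(all_tasks, k)

# Participant 7 gets 4 of the 6 tasks, reproducibly.
session_tasks = pick_task_subset(tasks, k=4, seed=7)
print(session_tasks)
```

Across enough participants, each task still gets a reasonable number of attempts even though no single session runs them all.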

  9. You forgot to record the time: When recording task time you have to remember to start and stop the timer. There's a lot going on during a study and things get forgotten, which is why it's nice to have a note taker as backup. If you can automate the time recording, you eliminate one opportunity for failure; for example, have the timer start automatically when the task is shown electronically. Otherwise, a video backup of the session will let you go back and adjust the times, but don't count on the video always working.
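If tasks are presented through your own script or tool, the "start the clock when the task appears" idea can be sketched like this. `TaskTimer` is a hypothetical helper, not a real tool's API; the point is that display and timing happen in the same step so no one can forget to press start.

```python
import time

class TaskTimer:
    """Starts timing automatically when the task is displayed."""

    def __init__(self, task_name):
        self.task_name = task_name
        self.start = None
        self.elapsed = None

    def show(self):
        # Display the task and start the clock in one step,
        # so starting the timer can't be forgotten.
        print(f"Task: {self.task_name}")
        self.start = time.perf_counter()

    def complete(self):
        # Stop the clock and record the elapsed task time.
        self.elapsed = time.perf_counter() - self.start
        return self.elapsed

timer = TaskTimer("Find the return policy")
timer.show()
# ... participant works on the task ...
elapsed = timer.complete()
print(f"Task time: {elapsed:.1f}s")
```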

  10. Video didn't record: Recording the user's screen and/or face is nice to have for creating highlight clips and reviewing sessions for additional data and insights. Unfortunately, it seems like the world works against you when recording video. You'll need to remember to start the recording, and then hope it doesn't fail, get corrupted, or interfere with and slow down the user's experience.


About Jeff Sauro

Jeff Sauro is the founding principal of Measuring Usability LLC, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 4 books on statistics and the user-experience.

