Measuring Usability
Quantitative Usability, Statistics & Six Sigma by Jeff Sauro

20 Tips for your Next Moderated Usability Test

Jeff Sauro • April 10, 2012

Despite the rise in unmoderated usability testing, the bulk of evaluations are still done with a facilitator.

Whether you are sitting next to the user in a lab or sharing screens with someone thousands of miles away, here are 20 practical tips for your next moderated usability test.

  1. Shut up and listen: You need to talk to moderate a session, but don't let the talking get in the way of discovery. Just like in any relationship, you've got to know when to talk, when to listen and when to move on.

  2. Measure completion rates: Usability = able to use. Even the most open and unstructured usability test should have users attempting tasks. Record whether users complete or don't complete (1 or 0).

  3. Use confidence intervals around every measure, even if you don't report them: Don't get fooled by chance. Completion rates, problem frequencies, task times and rating scales all lend themselves to informative confidence intervals that tell you and your stakeholders the most likely average for the entire user population. See Chapter 3 in Quantifying the User Experience for examples and calculations at any sample size.
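For completion rates at the small sample sizes typical of moderated testing, one common method is the adjusted-Wald interval. This is a sketch of that general technique, not the book's exact procedure; consult Chapter 3 for the full treatment:

```python
import math

def adjusted_wald_ci(successes, n, z=1.96):
    """95% adjusted-Wald confidence interval for a completion rate.

    Adds z^2/2 successes and z^2 trials before computing the interval,
    which keeps coverage reasonable even at very small sample sizes.
    """
    p_adj = (successes + z * z / 2) / (n + z * z)
    margin = z * math.sqrt(p_adj * (1 - p_adj) / (n + z * z))
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Example: 8 of 10 users completed the task
low, high = adjusted_wald_ci(8, 10)
print(f"Observed 80%, 95% CI roughly {low:.0%} to {high:.0%}")
```

Note how wide the interval is at n = 10: an 80% observed completion rate is plausibly anywhere from under 50% to over 90% in the full population, which is exactly the "don't get fooled by chance" point.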

  4. Use at least one structured question after each task attempt that asks users how usable they thought the task was: I use the Single Ease Question (SEQ) along with a question about confidence.

  5. Use a short questionnaire at the end of each session to gauge overall impressions: If possible, use a standardized questionnaire, which tends to be more reliable than a homegrown one. Ideally pick one that also lets you convert a raw score into a more meaningful rank, such as the SUS for software or the SUPR-Q for websites.
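The SUS scoring rule is public and easy to automate: each of the ten items is answered on a 1-5 scale, odd (positively worded) items contribute the response minus 1, even (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """Convert ten raw SUS item responses (each 1-5) to the 0-100 score.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The summed contributions (0-40) are scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: a fairly positive respondent
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))
```

Remember the raw score is not a percentage; converting it into a percentile rank requires a benchmark dataset.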

  6. Record the screen so you can always go back and look at task times, double-check interactions or look for additional insights: Even though the recording won't change, it's amazing how your perspective can change after watching 10 more users.

  7. Pretest: Use any warm body (interns work well), then pretest with a qualified user. During the first few usability sessions you have to get used to the tasks and the system's quirks. Be prepared to make changes, improvise and improve early in the testing.

  8. Over-recruit: Plan on no-shows. I typically see between 10% and 20% of users cancel. Sometimes they call and sometimes they don't.
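A quick way to size the recruit, assuming a flat no-show rate (the 20% default below reflects the upper end of the cancellation range above):

```python
import math

def sessions_to_schedule(needed, no_show_rate=0.2):
    """Participants to recruit so that, after cancellations at the given
    rate, roughly `needed` sessions still take place."""
    return math.ceil(needed / (1 - no_show_rate))

print(sessions_to_schedule(10))  # recruits for 10 completed sessions
```

So for 10 completed sessions at a 20% no-show rate, schedule 13 participants; at 10%, schedule 12.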

  9. Have plenty of backup plans: Murphy's Law is alive and well in the usability lab. The test system will go down, the user's phone will be too old, you'll forget to record, the audio will fail, the user will be late, the note taker will be sick, etc.

  10. Don't wait until you've tested all users before reporting on the problems: Most stakeholders want to know the major issues right away, without waiting weeks to test all the users and crank out the report. As long as you're clear the results are preliminary, it's usually a welcome update.

  11. Track which users encountered which problem: This allows you to estimate the percentage of problems you've uncovered and the sample size needed to uncover the majority of problems (given the same set of tasks, user types and interface).
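The usual estimate behind this tip is the binomial problem-discovery model, the same math behind the well-known "five users" heuristic. A small sketch, under the simplifying assumption that each problem affects a fixed proportion p of users independently:

```python
import math

def discovered(p, n):
    """Expected share of problems with occurrence probability p
    that are seen at least once across n users: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

def sample_size(p, goal=0.85):
    """Users needed so the expected share of discovered problems
    (those affecting proportion p of users) reaches `goal`."""
    return math.ceil(math.log(1 - goal) / math.log(1 - p))

# With p = 0.31, five users see roughly 84% of problems
print(f"Share seen by 5 users at p=0.31: {discovered(0.31, 5):.0%}")
print(f"Users needed for 85% at p=0.31: {sample_size(0.31)}")
```

In practice p must itself be estimated from the user-by-problem matrix you collect, which is exactly why tracking who hit what matters.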

  12. Video of the user's face is nice but not essential: Most of the action will be on the screen (or on the handset). It's nice to catch those crazy facial expressions or to see when the user squints to read the font, but if you don't have a face cam, don't worry.

  13. Probe users about interaction problems between task attempts, not during them: Only seconds to minutes pass after the interaction, so the experience is still memorable. This retrospective probing technique lets you collect clean task times and prevents interrupting the user or inadvertently seeding ideas, such as which path to follow.

  14. Paper and pencil are fine recording devices: They're quiet and quick. I use custom software and Excel sheets to record problems, comments and notes, but sometimes a non-linear format that needs no power and no clicking keyboard works just fine.

  15. Have a note taker and a separate facilitator if possible: The facilitator is often kept busy asking follow-up questions, troubleshooting technical issues, answering user questions and keeping the study on track. It's easy to miss valuable insights if you're doing both jobs.

  16. Review the observations and problems after each user (when possible): Review the issues while they're fresh with another person, such as the note taker or a stakeholder. It helps get the problem list out faster and lets you form new hypotheses to confirm or refute with your next set of users.

  17. Record positive issues, suggestions and usability problems: Don't just collect the bad news. Collect the suggestions, the positive comments and the features that go smoothly. While a development team will often want to get right to the problems, most will appreciate that users and usability professionals aren't all gloom and doom. See Joe Dumas's great book, Moderating Usability Tests: Principles and Practices for Interacting.

  18. Illustrate issues using screenshots and categorize problems: Sorting problems into logical groups such as "buttons," "navigation" and "labels" along with a good picture can really help with digesting long lists.

  19. Use highlight videos: Short clips of the most common usability problems or illustrative examples help stakeholders who rarely have time to pore over hours of video. Every one of our reports is full of data, but sometimes the best way to illustrate what the data says is not with a graph but with a gaffe.

  20. Don't lead the user: Even when users ask if they "did it right," or head down the wrong path and ask "is this the right way?", try to deflect such questions by asking back "What would your inclination be?" or "Where would you go to look for that?"



About Jeff Sauro

Jeff Sauro is the founding principal of Measuring Usability LLC, a company providing statistics and usability consulting to Fortune 1000 companies.
He is the author of over 20 journal articles and 4 books on statistics and the user-experience.


Posted Comments


April 11, 2012 | Alejandro Rivas Micoud wrote:

Hi Jeff, thanks for the post; very useful. That said, I disagree with you about the relative importance of seeing the user's face. At userlytics.com we have conducted thousands of remote "video-in-video" sessions that include a webcam view, some unmoderated, some moderated, and I can assure you that our clients love the added value and context of seeing the user in their environment (and knowing that it is really their target persona, and not a professional tester). Once they've tried the platform that includes the webcam view, they keep using it. It is very impactful, especially in moments of silence when users are thinking or frustrated.

