Compared to What? Making Sense of Customer Experience Metrics
Jeff Sauro • September 28, 2011
These are three of the most important words for anyone trying to make better decisions with data.
I first heard them from Edward Tufte over a decade ago.
You need a meaningful comparison to turn data into information.
We're often in such a hurry to get the survey out, start the usability test or conduct user interviews that we forget how we'll make sense of all the data.
Even the best technology or statistical technique cannot substitute for information that puts responses, ratings and revenue in context.
For most measures of customer experience, there are at least three good places to look for meaningful comparables:
- A prior version
- An industry average
- A leading competitor
Here are some common metrics and some ideas for making them more meaningful:

Task Time: Time is money. But is a task taking too long, or is it fast enough? You really need comparable data on the same task, because subtle differences in task details can make your comparisons less helpful. For example, it takes Southwest Airlines on average 20 minutes to turn a plane around after it lands and have it ready for the next flight. For the same task, it takes United Airlines 35 minutes (see Gladwell, 2010).
Tip: When measuring task time, it can be difficult to find prior data for the tasks you're measuring. One trick I use is to have users attempt the task on both the new interface and the older version during the same evaluation. Alternate which version each participant gets first to minimize sequence effects. At the end of your test you'll know whether all those new features are slowing users down.

Task Difficulty:
It's helpful to use a survey question that's been vetted psychometrically, like the Single Ease Question (SEQ). Even with the right question and response options, you'll want a comparable.
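One simple use of a comparable is checking a task's mean SEQ rating against a benchmark. A sketch under stated assumptions: the benchmark value and the seven-point responses below are illustrative placeholders, so substitute whatever comparable you actually have:

```python
from statistics import mean

SEQ_BENCHMARK = 5.5  # assumed comparable value; replace with your own benchmark

# Hypothetical 7-point SEQ responses (1 = very difficult, 7 = very easy)
responses = [6, 5, 7, 4, 6, 5, 6]

task_mean = mean(responses)
verdict = "above" if task_mean > SEQ_BENCHMARK else "at or below"
print(f"Mean SEQ {task_mean:.2f} is {verdict} the benchmark of {SEQ_BENCHMARK}")
```

Without the benchmark line, the mean by itself answers nothing; with it, the same number becomes a judgment.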
Tip: Consider comparing the actual task-difficulty rating with the expected task-difficulty rating. You can ask the same users or different users how difficult they think the task will be. The gap between expectations and retrospective accounts can reveal interaction problems.

Overall Ease of Use
: What do users think of the overall usability of the application? The System Usability Scale (SUS)
is a popular usability questionnaire that's free and short (at 10 items). It provides an overall estimate of usability.

Net Promoter Score:
Are customers recommending your software to friends and colleagues? Asking the popular 11-point question
is a good start. But is a 25% Net Promoter Score good?
Tip: Compare your Net Promoter Score to industry benchmarks. For consumer software, 25% isn't bad (it's slightly above average) but far below the leader, TurboTax.
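The score's arithmetic itself is simple: the percentage of promoters (9–10 on the 0–10 scale) minus the percentage of detractors (0–6). A quick sketch with made-up ratings:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Hypothetical responses to the 0-10 likelihood-to-recommend question
ratings = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(f"NPS: {nps(ratings):.0f}%")  # 5 promoters, 2 detractors -> NPS: 30%
```

Note that passives (7–8) count in the denominator but in neither group, which is why the score can swing sharply when a few passives shift either way.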
In your next analysis, be sure to ask and answer "Compared to what?"