These are three of the most important words for anyone trying to make better decisions with data.
I first heard them from Edward Tufte over a decade ago.
You need a meaningful comparison to turn data into information.
We’re often in such a hurry to get the survey out, start the usability test or conduct user interviews that we forget how we’ll make sense of all the data.
Even the best technology or statistical technique cannot substitute for information that puts responses, ratings and revenue in context.
For most measures of customer experience, there are at least three good places to look for a meaningful comparison:
- A prior version
- An industry average
- A leading competitor
Here are some common metrics and some ideas for making them more meaningful:
Task Time: Time is money. But is a task taking too long, or is it fast enough? You need comparable data on the same task, because even subtle differences in task details can make comparisons misleading. For example, it takes Southwest Airlines on average 20 minutes to turn a plane around after it lands and have it ready for the next flight. For the same task, it takes United Airlines 35 minutes (see Gladwell 2010).
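Here's a minimal sketch of one way to compare task times from the same task on two versions of a design. The times and variable names are hypothetical; because task times tend to be positively skewed, this sketch compares geometric means by running a Welch t-test on log-transformed times, which is one common choice rather than the only one.

```python
# A minimal sketch: comparing task times for the same task across two designs.
# The sample times and variable names are hypothetical.
import numpy as np
from scipy import stats

# Task times in seconds for the same task on the current and prior design
current_times = np.array([48, 62, 55, 71, 90, 44, 58, 66, 52, 80])
prior_times   = np.array([75, 95, 60, 110, 88, 70, 102, 85, 77, 93])

# Task times are typically positively skewed, so compare geometric means
# by running the t-test on log-transformed times (Welch's t-test).
t_stat, p_value = stats.ttest_ind(np.log(current_times), np.log(prior_times),
                                  equal_var=False)

print(f"Geometric mean (current): {stats.gmean(current_times):.1f} s")
print(f"Geometric mean (prior):   {stats.gmean(prior_times):.1f} s")
print(f"p-value for the difference: {p_value:.3f}")
```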
Task Difficulty: It’s helpful to use a survey question that’s been vetted psychometrically, like the Single Ease Question (SEQ). But even with the right question and response options, you’ll still need something to compare the ratings against.
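As a sketch of what that comparison might look like, the snippet below tests a set of hypothetical 7-point SEQ ratings against a benchmark. The 5.5 value is an assumption drawn from commonly cited SEQ averages; substitute your own comparable (a prior version, industry data, or a competitor).

```python
# A minimal sketch: comparing mean SEQ ratings against a benchmark.
# The ratings are hypothetical; the 5.5 benchmark is an assumed average
# and should be replaced with your own comparable.
import numpy as np
from scipy import stats

seq_ratings = np.array([6, 5, 7, 4, 6, 6, 5, 7, 6, 5])  # 7-point SEQ responses
benchmark = 5.5  # assumed historical average

t_stat, p_value = stats.ttest_1samp(seq_ratings, benchmark)
print(f"Mean SEQ: {seq_ratings.mean():.2f} vs. benchmark {benchmark}")
print(f"p-value: {p_value:.3f}")
```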
Overall Ease of Use: What do users think of the overall usability of the application? The System Usability Scale (SUS) is a popular usability questionnaire that’s free and short (at 10 items). It provides an overall estimate of perceived usability. But what counts as a good SUS score?
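The sketch below applies the standard SUS scoring rule (odd items contribute the response minus 1, even items contribute 5 minus the response, and the sum is multiplied by 2.5) to one respondent's hypothetical answers. The reference point of roughly 68 reflects widely published SUS averages, but your own prior version or industry comparable is the better benchmark.

```python
# A minimal sketch of standard SUS scoring for one respondent's 10 responses
# (each on a 1-5 scale). The responses and the 68 reference point are
# illustrative; use your own comparable (prior version, industry data).
def sus_score(responses):
    """Convert ten 1-5 SUS responses into a 0-100 SUS score."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items: r-1; even: 5-r
    return total * 2.5

responses = [4, 2, 5, 1, 4, 2, 4, 2, 5, 1]  # hypothetical respondent
score = sus_score(responses)
print(f"SUS score: {score:.1f} (often compared against an average of about 68)")
```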
Net Promoter Score: Are customers recommending your software to friends and colleagues? Asking the popular 11-point (0 to 10) likelihood-to-recommend question is a good start. But is a 25% Net Promoter Score good?
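For reference, here is a minimal sketch of the calculation itself, using hypothetical responses: the percentage of promoters (9s and 10s) minus the percentage of detractors (0 through 6). The score only becomes meaningful once you compare it to a prior period, an industry average, or a leading competitor.

```python
# A minimal sketch: computing a Net Promoter Score from 0-10 responses.
# The responses are hypothetical.
responses = [10, 9, 8, 7, 10, 6, 9, 3, 8, 10, 9, 5]

promoters  = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)
nps = 100 * (promoters - detractors) / len(responses)

print(f"Net Promoter Score: {nps:.0f}%")  # 25% for this sample
```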
In your next analysis, be sure to ask and answer “Compared to what?”