The Most Fundamental Concept in Usability
Jeff Sauro • June 10, 2014
Usability is a lot of things.
It's about making interfaces easy to learn. It's about preventing errors and reducing the time to get things done.
It's even about making an experience more satisfying and delightful.
There are a number of methods to improve the usability of an interface. While it's hard to identify one overarching concept that's fundamental to the whole idea of usability, I think there's one that underlies most methods and desirable outcomes. That concept is that the developer is not the user.
That is, the people who build the interface are different from the people who use the interface.
Developers can be users too, of course. In some rare cases, the developer programming the interface or designing the look and feel is among the people who will eventually use it.
But because of the high demand to build software, mobile apps, and web applications, it's much more likely that developers know very little about the target users and will never use the applications they are developing.
The reason this is such a fundamental concept is that the very act of detailing how an application will work, the programming and the designing, involves making assumptions about how people will react to and interact with design elements. The technical structure of programming languages and the functional requirements take priority over ethereal user goals.
The problem is usually compounded because development teams are often given details about users and functional specs via proxies (product managers, business analysts, etc.) and have little contact with users themselves.
Developer != User
The famous Stanford Prison Experiment by Professor Philip Zimbardo showed that you can take ordinary people, give them a role (prisoner or jailer), and they will adapt to that role, thinking and acting quite differently. Alan Cooper, the father of Visual Basic, takes the Zimbardo idea and says that to be a good programmer, one must be "sympathetic to the nature and needs of the computer. But the nature and needs of the computer are alien from the nature and needs of the human being who will ultimately use the software."
The product managers and business executives who order the development of the software that fills our lives aren't the ones in control. It's the engineers who are running the show. Cooper argues that this is tantamount to the inmates running the asylum!
A recent discussion on Quora about things programmers know that the rest of us don't illustrates this point. One web developer reinforces Cooper's point by stating:
About 25% of the hours spent writing an application are spent figuring out ways the end user will do something wrong.
For example, I have four bills in front of me:
- One for renewing a magazine subscription
- One to renew a satellite radio subscription
- Two for medical expenses that I need to pay via a Health Savings Account (HSA)
The magazine company and the satellite radio company don't make their money by developing web applications. The HSA is operated by a large US bank, which is closer to the business of bill paying, yet it's easily the worst experience of the three.
- The language on labels and navigation is confusing (Add an Expense vs. Make a Payment?)
- Back buttons don't work
- Legitimate account numbers are rejected as invalid
- Error messages make no sense
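The "legitimate account numbers rejected as invalid" problem is a classic symptom of the developer encoding their own assumptions rather than the user's habits. As a purely hypothetical sketch (I have no knowledge of the bank's actual validation rules, and the function names and the 10-digit format are invented for illustration), a validator that demands exactly the format stored in the database will reject the dashes and spaces users copy straight off a paper statement, while one that normalizes input first accepts the same account:

```python
import re

# Hypothetical strict validator: assumes users type exactly 10 digits
# with no separators, the way the number is stored internally.
def strict_is_valid(account: str) -> bool:
    return re.fullmatch(r"\d{10}", account) is not None

# A more forgiving validator: strips the spaces and dashes that
# appear on printed bills before checking the digits.
def forgiving_is_valid(account: str) -> bool:
    digits = re.sub(r"[\s-]", "", account)
    return re.fullmatch(r"\d{10}", digits) is not None

user_input = "1234-5678 90"  # as it might appear on a paper statement
print(strict_is_valid(user_input))     # False: rejected as "invalid"
print(forgiving_is_valid(user_input))  # True: same account, accepted
```

Watching even one user type an account number from a paper bill would expose the mismatch immediately; no spec review would.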
No doubt I was doing something "wrong" in the way I was trying to pay my bills. Yet there's also little doubt that watching a handful of users try to pay bills would have uncovered many of the problems I ran into. It most likely would also have prevented the calls to customer service that I'll soon be making!
Appreciating the Roles to Make More Usable Software
Cooper made the case that programmers should design for a single user: not themselves, but a Persona. Personas are archetypal users whose goals and characteristics represent the needs of a larger group of users. They function as stand-ins for real users to guide decisions about design and functionality.
While Personas have a role in keeping a focus on the user, I have two additional suggestions.
- Developers should observe users: Watching a handful of representative users attempt to use software, ideally before it's fully built, is the most effective way to understand users. There are best practices, research, and product specifications, but no style guide can ever tell you what a real person will do when trying to accomplish tasks. It's from watching those few users that assumptions get tested: by observing people who have no idea about the intricacies of the code, the data model, and all the business rules that went into the design. Problems can then be fixed before it's too late and users start inundating call centers to figure out how to pay their bills online.
- Usability professionals should code: Learning to code, even at a rudimentary level, is a valuable skill for a person in UX. Understanding the structured demands of databases and programming languages is the best way to appreciate the constraints a developer works under. I've built a few web applications myself and was always one of the intended users. Each time I used my own software, I was surprised at how I had missed some basic problems in navigation, or how I wanted to get things done differently once the software was even partially functioning. Watching a few other people use my software was humbling, yet enlightening. I'm not suggesting UX professionals become developers, but I am suggesting that a lot can be learned about improving interfaces if you can appreciate the complexities and constraints that go into building them.
Developers aren't evil and the UX folks aren't sages. Observing users, conducting a top-tasks analysis, running a card sort, and performing a contextual inquiry are all methods we can use to make more usable experiences. They are also methods that have one underlying fundamental concept: the developer is not the user.