This is the fourth, and last, entry in a series about how testers should remember usability when they’re testing websites, using the O2 registration process as an example.
The previous posts can be found at “Usability and O2 registration”, “O2 website usability: testing, secrets and answers” and “O2 website usability: beating the user up”.
If you remember, I only started out on this expedition through O2’s registration process because I’d linked my personal and business phones. My business phone then disappeared from view. There was no sign of it on my personal account and the business account ID was crippled. I couldn’t log into it, and so I decided to re-register the business phone and plodded my painful way through O2’s registration process.
The mystery of the missing phone account – solved!
After successfully re-registering my business phone I logged back into the personal account and stared at the “My O2” page, wondering why the original business account had vanished.
I decided to go ahead and link the new business account to the personal one. Maybe I should pretend I was trying to replicate a bug, but who am I trying to kid? There was no-one to report it to. Just like a little boy, I broke it once, then wanted to do the same thing again to see if it would break again.
Whatever I was doing when I clicked again on the button under “Link my accounts”, I was not looking for my business phone. Why would I look there? But there it was.
This is the screen that faced me. I’ve masked the IDs and my personal number. My original business phone account hadn’t disappeared after all. It had just been remarkably well hidden.
Information architecture – it matters
I’ve seen information architecture derided as a trendy and pretentious field – but only by people who don’t know what they’re talking about. The O2 site is a good example of why it’s needed. Nielsen and Loranger have this to say in their book “Prioritizing Web Usability”.
“Chaotic design leads to dead ends and wasted effort. Hastily thrown-up web sites without effective information schemes prevent users from getting to the information they seek. When this happens, they may give up, or even worse, go to a different site.
A well-structured site gives users what they want when they want it. One of the biggest compliments a site can get is when people don’t comment on its structure in user testing. It’s the designers’ job to worry about the site’s structure, not the users’”.
Precisely; a well-structured site gives users what they want, when they want it. They shouldn’t have to puzzle it out. They shouldn’t have to think, “now what do I do?”. They should just be able to do it. It’s that simple, but companies keep on screwing it up and wasting our time. Steve Krug has written an excellent short book on the subject with the very apt title “Don’t Make Me Think!”.
When I originally linked my accounts I should have been able to look at the “My O2” page and instantly know where I’d have to go to see my business phone account. The site works functionally, but burying the account details of the other phones beneath “Link my accounts … Find out more” was guaranteed to confuse users.
Actually, on second thoughts, this is maybe not such a great example of the need for information architecture. The problem should have been so obvious that anyone could have spotted it early. You shouldn’t have to be any sort of specialist to see this one. The testers should have called it out, unless of course they were sticking rigidly to scripts intended to establish whether the functionality was working according to the documented requirements.
Did no-one put their hands up and say, “this is daft”? Surely that is part of a tester’s job?
You wouldn’t even need to re-visit the site structure to address it. Simply changing the labels and text would have made the page much less confusing. It might not have been elegant, but the change could have been made very late and it would have worked.
Ideally, however, the problem would have been detected early enough for the site structure to be changed easily, i.e. during design itself. The same applies to the other O2 usability problems I discussed in previous blogs. If usability defects are found only during the user testing at the end of the project then it’s usually too late. The implementation express is at full steam, and it’s stopping for no-one. The project manager has got the team shovelling coal into the engine as fast as they can, the destination’s in sight, and if some testers or users start whining about usability then they need to be reminded of the big picture. “Ach, it’s just a cosmetic problem and, even if it isn’t, there’s a workaround”.
Early testing – what it should mean
If usability testing is to be effective it has to be built into the design process. Under the V Model testers learn the “test early” mantra. In practice it often amounts to little more than signing off massive specifications, without enough time to read them properly, never mind actually inspect them carefully. The testers then get to work churning out vast numbers of test scripts, the more the better. The more test scripts, the more thorough the testing? Please don’t lift that quote out of context. I value my reputation!
No. Early testing has to be formative, early enough to shape the requirements and the design. We have a lot to learn from User Experience (UX) professionals. The UX people are not called in as often as they should be. There are no end of techniques that can be used, and they don’t seem to be part of the testers’ normal repertoire.
Nielsen’s Discount Usability Engineering and Steve Krug’s “lost our lease” usability testing in particular are worth checking out. This article I wrote for Testing Experience has a bit more on the subject, and detailed references. See the section “Tactical improvements testers can make”.
What this series has been all about
There are other problems I could have discussed. In particular, there’s evidence of a lack of basic data analysis. It looks like the designers didn’t clarify exactly what the data meant. I’ve never seen a defect report stating that the data analysis was flawed, yet that would be the root cause. The defects raised would probably be symptoms of that deeper problem. The point is that proactive user involvement at an early stage could have detected such root problems.
The theme of this series of articles about O2 isn’t how awful that company is. They’re no worse than many others. They just happened to come to my attention at the right time. It’s only partly about the importance of usability, and incorporating that into our testing. That is certainly important, and much neglected, but the real message is that testers have to be involved early and asking challenging questions of the whole team throughout the development.
Testers should never be constrained by scripts. They should never be expected to derive all their tests from documented requirements. Taking either approach means having your head down, focussing on one little bit after another. We should have our heads up, making sure that people understand and remember the big picture.
“What are we doing? What will the users want to do? Does this particular feature help? Is it a hindrance? Why are we doing it this way? Could it be done better?”
Questions like that should be constantly running through our minds. As Lisa Crispin put it in a tweeted comment on one of the earlier O2 posts, we shouldn’t be testing slavishly to the requirements. We should be thinking about the real user experience, and using our own business knowledge. Exactly!