O2 website usability: beating the user up

I’m continuing my discussion of the O2 registration process, showing how it’s a good example of the way testers can bring a more creative approach to testing than just ticking off test cases. This post follows on from the previous two: “Usability and O2 Registration” and “O2 website usability: testing, secrets and answers”.

The next problem with the registration form was clearing valid input fields when an error was detected.

This is something that really bugs me. Ok, if I’ve entered something invalid then I deserve to be punished by the application. I’ll take it like a man. Just give me the snotty error message, and clear that field. But clearing other fields that are fine? Well that’s just mean.

Now I’m straying outside my comfort zone here, and if this were a real project I’d be asking questions that coders might consider really dumb. On the other hand, they might be shifting uncomfortably in their seats. Fortunately I spent a few years as a computer auditor asking exactly that sort of question. It’s not that I developed a thick skin (though I did). It’s more that I realised you can get a pretty high hit rate with questions starting “sorry if I’m being a bit dim here, but …”.

Server side validation?

Wiping correctly completed fields looks rather like developers are relying on server side validation. If there’s a problem they then have to regenerate the form with an error message.


Does server side validation mean that they’d have to send the password and validation code back from the server to rebuild the form? Is that considered a security exposure? I don’t see why, but I’m not sufficiently technically aware to know for sure. Please someone, enlighten me.

If that is a potential weakness, why are they doing all the validation at the server? Sure, the definitive validation has to be at the server, or you’re leaving yourself wide open, but really, all of the validation? I don’t know the answers. I’m just asking. “Maybe I’m being dumb, but …”.

A further problem with server side validation is that users are given the chance to choose an account name. O2 have over 20 million customers. The chances are that a user’s first choice will have been taken. The validation code and passwords will be removed from the form each time the user asks unsuccessfully for an account name.

This is quite a common problem and there are solutions. You don’t have to make your users suffer. You can use Ajax to communicate with the server in the background without having to refresh the form. Applications can therefore check proposed account names as the user is typing, and provide immediate feedback. Maybe there’s a good reason why Ajax hasn’t been used, maybe not.
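To sketch the idea (everything below is invented for illustration, not O2’s actual design): the form’s script would call a lightweight server endpoint as the user types, and the endpoint would report availability, perhaps with suggestions, without a page refresh. Reduced to a plain function, the server side might look like this:

```python
# Hypothetical sketch of the server-side half of an Ajax availability check.
# The client script would call this as the user types; here it is reduced to
# a plain function so the idea is easy to see. Names and data are invented.

TAKEN_NAMES = {"jchristie", "jim.christie", "o2customer1"}

def check_account_name(proposed: str) -> dict:
    """Return a small JSON-style payload the form script can act on
    without a full page refresh (so no other fields get wiped)."""
    name = proposed.strip().lower()
    if not name:
        return {"available": False, "reason": "empty"}
    if name in TAKEN_NAMES:
        # Offer alternatives instead of making the user guess repeatedly.
        suggestions = [f"{name}{n}" for n in range(1, 100)
                       if f"{name}{n}" not in TAKEN_NAMES][:3]
        return {"available": False, "reason": "taken",
                "suggestions": suggestions}
    return {"available": True}

print(check_account_name("jchristie"))
print(check_account_name("newuser"))
```

Because the check happens in the background, the user finds out a name is taken before submitting, and nothing they have already typed is thrown away.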

The point is to ask, and force developers to justify whacking the users over the head. You mustn’t do it in an aggressive or sarcastic way. People often do things in certain ways because it’s convenient, without questioning their motives. It’s helpful to be forced to sit back and think just why we’re doing it that way, what the implications are, and if that route really is the best.

It’s also vital that you don’t come across as telling the developers that their technical solution is wrong and that you know better. Almost certainly you don’t. You’ve just got a different perspective. If the developers think you’re fronting up to them in a technical showdown then they’ll humiliate you, and frankly that would serve you right!

Do we really need all of these fields?

Next, why are O2 asking for my name and address? At first that seems like a daft question. I’m registering. Of course they want these details. It’s a no brainer. That seems to be the extent of the thought O2 gave to the matter.

However, I’ve got contracts for both phones. O2 already know my name and my address. There’s no validation of the name and address against the details that are already held. Will O2 treat my input as a change to my account details? Actually, they won’t. I know that from experimenting with the site.

So why bother capturing the details? Why bother risking confusion later when they realise they have two addresses for the same account, or when the customer is puzzled that O2 aren’t using the new address? These shouldn’t be rhetorical questions. Testers should be asking questions like that.

Something else that puzzles me is the large number of text boxes to capture the address. Why do they need the address pre-formatted like this? Is this just for the convenience of the coders?

Postcodes should be validated, and should be captured in a separate box. At a pinch, capturing the house number or name separately can be justified, especially if it helps generate the correct postal address. I’ve seen only two half-way convincing arguments for multiple address boxes.

Firstly, there’s the “users are idiots” argument. They will enter garbage in the address field given half a chance. Secondly, fraudsters love to mangle addresses to make investigation more difficult.

The first argument has some merit, but multiple input boxes don’t help much. Users can still enter rubbish, and that’s why the second argument doesn’t really stand up. I’ve worked on many fraud investigations, and separate input boxes are no handicap to a fraudster.

Insisting that users input their address in a certain way risks annoying them. O2 are not guilty of it here, but some companies use drop-down menus with errors in them. My address on eBay and PayPal is wrong because they’ve made it impossible to enter my correct address. They don’t understand how the system of Scottish postal addresses works. Stuff still gets here, but it annoys me every time I see it.

Splitting the whole address up into a series of separate boxes looks very much like inertia. That’s the way it’s always been done – no need to question it.

Did anyone question why O2 need to capture all the other mandatory fields? Are these genuinely essential, or just nice to have? It’s not as if users have to provide the information. They might react by cancelling the registration process, denying O2 any benefit at all. Or when they realise they can’t continue without entering something, that’s just what they’ll give you: anything! What value does management information have if you’ve goaded users into entering false data?

I recommend this blog piece by David Hamill on the subject. His blog has lots of other good ideas and discussions.

Forms should not be designed by throwing every possible field that could be captured at a wireframe and then telling the coders to crack on. Each question should have to be justified because each extra question will tip some users over the edge.

There’s a trade-off between capturing information and making things easy and pleasant for customers. If your customers think you’ve gone too far in the wrong direction then they’ll punish you.

What’s the big picture?

Specialists focus on their own area of expertise, in which they are supremely confident. They expect others to defer to them, and in turn they defer to other experts in different areas.

Sometimes the whole process needs someone to ask the pertinent questions that will open everything up and help people to see the big picture. Testers can do that.

This look at the O2 registration process has been more interesting than I expected, and I’ve had more to say than I ever thought at the start. However, I promise to pull it all together with my last piece, “The O2 registration series – wrapping up”.

O2 website usability: testing, secrets and answers

I left off my previous blog about the usability of the O2 site’s registration process when I’d got the validation code, which was texted to my phone, and I was about to move on to the registration form itself.

This form is a gem if you’re wanting to look at poor usability and sloppy testing, but if you’re an O2 customer then it’s a charmless mess.

  • It applies inappropriate validation, and compounds the problem with an inaccurate error message.
  • It seems too reliant on server side validation.
  • Valid input is cleared if an error is detected in another field.
  • It has too many text input boxes, and too many of them are mandatory.
  • The form as a whole, and the process it supports, don’t seem to have been thought through.

I’m going to deal with only the first issue in this post because it’s not a straightforward matter of usability. It highlights the security weakness of secret questions and answers.


Do companies allow coders to write error messages for users?

I entered the six-digit validation code and all the other details, including choosing a user name and password.

I had to choose a security question from the usual set, e.g. mother’s maiden name, first school etc. I chose my mother’s maiden name, then entered the name “O’Neill”. By the way, that is neither the question I chose nor the value I entered, but it serves my point.

I got the following error message.

“Security answer must contain letters and numbers and be 1-50 characters long”

That seemed a bit odd, but I stuck a couple of integers on the end of the name.

It then became clear that the validation code and passwords (initial entry and confirmation) had been removed from the refreshed screen that had come back with the error message. It wasn’t immediately obvious that the validation code had been removed because I was below the fold and the message was out of sight.

So I re-entered all the details and tried to submit again. It still didn’t accept my mother’s name, and the validation code and passwords had gone again.

It then dawned on me that when they said that the answer must have letters and numbers they didn’t actually mean that. Maybe they meant that it couldn’t have special characters? So I tried removing the apostrophe from O’Neill.

Yes, that was the problem! They’d created a freeform text input field with validation to stop special characters being entered, and the error message didn’t actually say so. Oh dear!

However, I didn’t have the chance to enjoy my success. Now the user name I’d chosen was flagged up as being unavailable. And yes, the passwords and validation codes had been wiped.

I re-entered everything and tried another user name. No joy, and of course I’d lost my data again. Next time I just asked them to select a user name for me.

Success at last! I’d registered.

Filling in the form had taken far longer than it needed to, and had left me exasperated because O2 had ignored some basic usability rules.

The validation didn’t make sense for the input that was requested. If you’re asking for freeform text then you should allow for special characters. If you do decide that they are unacceptable then you should make that clear before users input their data, and you should ensure that your error messages are also clear on the point.

Why ban special characters?

The only special characters on my keyboard that were acceptable were hyphens, commas and full stops.

I wish I were more technical and could identify with confidence what O2 were doing, but it looks suspiciously like a very clumsy defence against SQL injection attacks. As far as I know it’s not necessary to ban special characters from free-form text input fields. Programmers should be sanitising the input to deal with potential attacks, or using bound database variables so user input is strictly segregated from the executable code. They should, shouldn’t they? Help me out here!

Anyway, even if banning special characters that could result in user input being treated as executable code is a reasonable precaution (which I doubt), surely only the dangerous characters should be banned?
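For what it’s worth, the bound-variable approach can be sketched with Python’s sqlite3 module (the table and data are invented for the illustration). With a placeholder, the driver keeps user input strictly as data, so apostrophes in names like O’Neill are harmless:

```python
import sqlite3

# With bound parameters there is no need to ban apostrophes from names
# like O'Neill: the value is stored verbatim and never treated as SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE answers (account TEXT, secret_answer TEXT)")

user_input = "O'Neill'; DROP TABLE answers;--"   # hostile-looking input

# The ? placeholder means the driver binds the value as data, so the
# string cannot break out of the statement and become executable code.
conn.execute("INSERT INTO answers VALUES (?, ?)", ("jim", user_input))

row = conn.execute("SELECT secret_answer FROM answers WHERE account = ?",
                   ("jim",)).fetchone()
print(row[0])   # the apostrophes and semicolons come back untouched
```

Had that value been spliced into the SQL string directly, the stray quote would have ended the literal early; with binding, the table survives and the answer is stored exactly as typed.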

Beware of not very secret secrets!

Secret questions and answers are a notorious security weakness. They can be ridiculously easy to guess, especially if you know the customer. For a quick introduction to the subject, check out this recent paper from the University of Cambridge Computer Laboratory.

Some people choose to use special characters in their secret answers to make them harder to crack. It hardly makes sense to stop them.

If you are going to use them you should really allow the users to choose their own question and answer. If you really must insist on giving the user no choice then don’t use the same old obvious ones that O2 have.

  • Mother’s maiden name
  • Name of first school attended
  • Name of your pet
  • Favourite sports team
  • Favourite animal
  • Place of birth

That’s a dreadful set of questions. Any number of people outside my immediate family would either know the answer to most of these, or be able to take an informed guess.

To make it even worse, O2 have suggested you set up a user name using “the name of a favourite pet, footie team, house, school, street or town in combination with a number such as your date of birth, house number or mobile number”.

That’s enough for now

The poor error message, the dubious validation and the rather naïve use of secret questions all combine to give a poor impression of the site. If your approach to testing is based on deriving scripts from requirements, then I doubt if you’ll detect such problems. Rather, you may see them, but you won’t see them as being problems. Even if you do think that they are a problem it might be difficult to persuade the developers.

I’ll return in a day or so to discuss this further in “O2 website usability: beating the user up”, and talk about how testers can and should try to prevent these problems occurring, rather than just complaining about them when it’s too late to make a difference. Basically, it’s a matter of being able to ask awkward questions at the right time!

Usability and O2 Registration

I am a customer of O2, the mobile (cell) phone company. I recently came across their registration process for customers, and it contained so many problems that it seemed a fine example of what happens if usability is given a back seat in development, and if testers fail to get involved early and ask challenging questions.

I’ve got no involvement with O2, other than as a customer, so I’m guessing at their approach to usability and testing. However, if they do think that they applied usability principles to the registration process, and tested it properly, then the issue simply becomes one of competence rather than commitment.

Identifying individual clients

I only looked at the registration process because of problems I encountered trying to link my business and personal accounts.

For almost as long as large companies have been processing customer data they have been struggling with the problem of identifying individual customers with multiple accounts.

There are numerous benefits if you can crack the problem. You can target direct sales campaigns more effectively. You can develop a deeper understanding of which parts of your market are profitable. You can provide a better customer service.

It’s surprisingly difficult to pin down an individual, however. Names don’t really work well. I could be James Christie, James D Christie, JD Christie, Jim Christie etc. Also, it’s common to have two people with identical names at the same address: different generations of the same family.

Addresses are difficult to work with too. There are so many ways to write them or abbreviate them. Postcodes help, but people can get them wrong, and they can cover dozens of separate properties. I once lived in a building with six separate flats. In three of them were Wilsons, all unrelated, all living at 28 Dunkeld Road with the same postcode.

So if you can get the customers to put their hands up and say, “yup, I’m both Jim Christie and James Christie. We’re one and the same” then it’s pretty useful. That’s the sort of thing companies like to encourage.

That’s exactly what O2 have done on their website. I’ve got two mobile phones, one business and one personal. Both are with O2. The phones are on separate accounts, but I noticed on the O2 website that it’s possible to link accounts so that you can manage both phones with one login. Great! That’s handy for me, and it’s obviously very useful for O2 too.

I happily signed up, linking the two accounts together. What should have been a classic win-win, in which O2 get more precise customer information, and I can manage my accounts more easily, turned into a bit of a mess that left me wondering about how O2 deal with usability and testing. It still amazes me when companies try to do something that should benefit customers and then blow it with lousy usability.

Where’s my phone gone?

Nobody got hurt, no-one ended up in court. It was no big deal. It was just that what should have been an easy, positive experience became irritating and made O2 look sloppy. It’s so unnecessary. Getting it right really shouldn’t be all that much harder than botching it, but O2 gaily broke just about every usability rule you could think of.

Linking the accounts was actually very easy. I was logged in to my personal account at the time. So I went ahead and entered the business phone number, and the business account ID and password. That was it. The two accounts were linked. So far, so easy. I then went looking for the business phone. There was no sign of it. The personal account still just had one phone.

I logged out, and back in again. Still no sign of the business phone. I logged out again and tried logging in to the business account. I got a message that the account ID was not recognised. I started to get interested now, and was wondering just what O2 had done.

When I was told that the business account ID was not recognised I was offered the option of registering a new account ID. I entered the business phone number, but was told it was already registered under the personal account ID. I went back into the personal account, but there was definitely no sign of the business phone.

That left me baffled, so I thought I’d register the business phone again, but under a new account ID.

Did O2 really think the registration process through?

I logged out, went to the registration page, and entered my business phone number. That brought up this really weird screen.

O2 already registered screen

Apparently, if I can remember my username and password I should select “Continue to Registration”, so that I can be given a new username and password! Very odd.

The options offered by O2 are, in effect:

a) If you remember both ID and password, go to new registration.

b) If you remember the ID but not password, go to new password.

c) If you remember the password but not the ID, go to ID prompter.

d) If you’ve forgotten both the ID and the password go to new registration (which is where “forgotten username & password” takes you, the same as “a”).

How much thought went into this screen? Didn’t anyone look at it and think that it was daft? It’s not just that they don’t seem to have proof-read the screen; the process doesn’t make sense. Remember, O2 already know who I am. I’ve got a contract with them for the phone. This registration is to let me manage the account on-line.

Testers should be calling out this sort of problem. If you’re relying on scripts you’re maybe not approaching the exercise in the right frame of mind. Scripts tend not to say, “does the form as a whole make sense, y/n”. They concentrate on the functionality, often at a detailed level that inhibits testers from saying “hang on, this is daft”.

To be sure I have the phone for that contract, O2 now text a six-digit code to my phone, whether I want a new password, a reminder of the account ID or to re-register. This has to be entered on the next page.

So to set up a new account ID all I need is possession of the phone.

Just to emphasise the pointlessness of the whole process O2 will text me the account ID and password after I register. But they won’t text me the account ID and password to let me keep using the existing account ID. They force me to set up a duplicate one, which is still going to be managing the same account.

This takes me on to the form for registering. It seems like a statement of the blindingly obvious, but forms should surely be the one area of a website that you really must test on users.

By testing on users I mean testing early enough to shape the form, not rounding up a few stray bodies to tell you it’s rubbish too late to make any significant changes!

For all the pleading to be allowed to “test early”, testers have traditionally been more comfortable focussing on testing late in the life-cycle. Sure, project pressures have forced them into testing late, but I really don’t think there has been sufficient will to break out of the ghetto and help shape applications at an early stage.

I believe that testers should be asking the sort of difficult questions I’ve mentioned here: “why have you done it that way?”, “have you thought about how the users will actually use it? Have you thought about the users at all?”

If testers are to be effective, bring real value to their employers, and justify their salaries, they have to be prepared to challenge, to do more than just ask, “does the output match what was predicted in my script?”

I’ll return to the O2 registration form in a day or so, and discuss other difficult questions that testers should be posing in “O2 website usability: testing, secrets and answers”. I’d love to see what other people think. Please feel free to jump in now, or perhaps wait till my four part rant is over!

Test script production lines

I was startled yesterday by a series of Tweets from Phil Kirkham.

Shuddering at a sentence that starts ” The basic idea is that we have a Test Script production line…”
It is staffed by a newbie tester and another one new to the project who knows nothing about the app or the business.
Testers churning out test scripts from specs , handing them onto the ‘execution line’ and move onto the next spec.

Now my first reaction was no doubt the same as all right thinking, sensible folk. I shuddered, thought “there but for the grace of God…”, and remembered past projects which had got uncomfortably close to such an abomination.

Then I thought a bit longer about this, and wondered if the “production line” heresy might offer an opportunity for testers who are wanting to stir things up, to make a difference and improve testing.

All too often companies pay lip service to testing. Everyone believes in it. Plenty don’t want to do what is really necessary to let the testers do a good job. However, the unspoken assumptions and prejudices stay stubbornly hidden under the surface. We know they’re there, but the people with influence never state their deep-seated views in public, in a form where they can be questioned and challenged.

I’ve worked on projects where the speed and efficiency with which the scripts could be churned out and then executed was clearly more important than whether the testers were really learning about what the application could do, and what the level of risk would be if it were launched.

Management never said that impressive daily progress cranking out scripts was the real bottom line as far as they were concerned, but we all knew that was the reality.

If I’d suggested that the attitude of management made it clear that we really should have a production line, with operatives specialising in each successive step of the process, it would have looked cheeky, verging on the insolent.

Yet the production line is the logical extension of the obsession with scripts, with the number produced and executed. Phil’s case is the “reductio ad absurdum” of the script approach. It’s taking scripting to its logical, but insane conclusion.

The people who’ve proposed it have put their head above the parapet, declared effective testing is a matter of efficient production of scripts, and maybe now they can be challenged.

Scripting is an inefficient way to test.  Detailed scripting is especially inefficient, and if different people are writing and executing the scripts then the inevitable consequence is a high level of detail. The test executors have to be told what to do. They will have minimal previous exposure to the application or new functionality. They’re too busy “specialising” in execution.

The production line is simply a way of streamlining a process that is inefficient and ineffective. It makes it clear that the process has become the “end”, not the testing.

Adopting a production line opens up the debate about what testing should be, and how testers can be deployed most effectively. There was an interesting study at the University of Helsinki a few years ago in which exploratory testers achieved pretty much the same results as script testers, but with 18% of the effort.

There are all sorts of complaints one could make about the methodology and the analysis. The study didn’t even make any great play of that 18% figure. However, there should be enough in that paper to make even the most blinkered scripting advocate pause and think.

Maybe proposals like the production line give us an opportunity to say, “hang on – just what is it we’re trying to do here?”

I expanded on the subject of heavy documentation in an article on testing and standards that I wrote for Testing Experience in December 2009.

Right now my Saturday job as a football reporter calls, and I’ve got to get off to the game! (PS – we won 🙂 ).