Testers are like auditors

No, this isn’t another “life is like a box of chocolates”. It’s not a tortured metaphor to try and make a point. I mean it straight up. Testers and auditors are very similar, or they should be.

In May I went to hear Michael Bolton speaking in Edinburgh. He touched on the distinction between testing and checking. If you’re not familiar with this then go and check out this article now. Sure I want you to read my article, but Michael’s argument is important and you really need to understand it.

I talked to Michael for a few minutes afterwards, and I said that the distinction between testing and checking applied also to auditors and that I’d had arguments about that very point, about the need for good auditors to find out what’s really going on rather than to work their way down a checklist.

I mentioned other similarities between testing and auditing and Michael agreed. Usually people are surprised, or think the connection is tenuous. Recruitment consultants have been the most frustrating. They see my dual experience in testing and auditing as an oddity, not an asset.

I said I’d been meaning to write about the link between testing and auditing, especially the audit of applications. Michael told me to do it! So here it is.

It’s about using your brain – not a script

If you don’t know what you’re doing then a script to follow is handy. However, that doesn’t mean that unskilled people with prepared scripts, or checklists, are adequate substitutes for skilled people who know what they’re doing.

As Michael said in his talk, scripts focus our attention on outputs rather than outcomes. If you’re just doing a binary yes/no check then all you need to know is whether the auditees complied. You don’t need to trouble your brain with tricky questions like: “does it matter?”, “how significant is it?”.

I’ve seen auditors whose job was simply to check whether people and applications were complying with the prescribed controls. They asked “have you done x to y” and would accept only the answers “yes” or “no”. I thought that was unprofessional and potentially dangerous. It encourages managers to take decisions based on the fear of getting beaten up for “failing” an audit rather than decisions based on commercial realities.

However, that style of auditing is easier and cheaper than doing a thorough, professional job. At least it’s cheaper if your aim is to go through the motions and do a low quality job at the lowest possible cost. You can fulfil your regulatory obligations to have an internal audit department without having to pay top dollar for skilled staff. The real costs of that cheapskate approach can be massive and long term.

What’s the risk?

Good auditing, like good testing, has to be risk based. You concentrate on the areas that matter, where the risk is greatest. You then assess your findings in the light of the risk and the business context of the application.

The questions I asked myself when I approached the audit of an application were:

“What must the application do, and how will I know that it’s doing it?”

“What must the application prevent, and how will I know that it really will stop it happening?”

“What controls are in place to make sure that all processing is on time, complete, accurate and authorised by the right people?”

The original requirements might be of interest, but if they were missing or hopelessly out of date it was no big deal. What mattered was the relationship between the current business environment and the application. Who cared if the application perfectly reflected the requirements if the world had moved on? What really mattered were the context and the risk.

Auditing with exploratory testing

Application audits were invariably time boxed. There was no question of documenting our planned tests in detailed scripts. We worked in a highly structured manner, but documentation was light. Once we understood the application’s context, and the functions that were crucial, we’d identify the controls that were needed to make sure the right things happened, and the wrong things didn’t.

I’d then go in and see if it worked out that way. Did the right things always happen? Could we force through transactions that should be stopped? Could we evade controls? Could we break the application? I’d approach each piece of testing with a particular mindset, out to prove a particular point.

It’s maybe pushing the point to call it true exploratory testing, but that’s only because we’d never even heard of it. It was just an approach that worked for us. However, in principle I think it was no different from exploratory testing, and we could have done our job much better if we had been trained in it.

Apart from the fun and adrenalin rush of catching criminals on fraud investigations (no use denying it – that really was good fun) this was the best part of auditing. You’d built up a good knowledge of the application and its context, then you went in and tried to see how it really worked. It was always fascinating, and it was nothing like the stereotype of the checklist-driven compliance auditor. It was an engrossing intellectual exercise, learning more and applying each new piece of understanding to learn still more.

“It’s your decision – but this is what’s going on”

I have a horror of organisations that are in hock to their internal auditors. Such bodies are deeply dysfunctional. The job of good auditors is to shine a light on what’s really going on, not to beat people up for breaking the rules. It’s the responsibility of management to act on the findings and recommendations of auditors. It should never be the responsibility of auditors to tell people what they must do. It happens though, and corporations that allow it are sick. It means that auditors effectively take commercial decisions for which they are neither trained nor accountable.

It’s just like letting testers make the decision to ship, without being responsible for the commercial consequences.

Both testers and auditors are responsible for letting the right people have the best available information to make the right decisions. The context varies, and the emphasis is different, but many of the techniques are interchangeable. Of course auditors look at things that testers don’t, but these differences can still offer the chance to learn from each other.

For instance, auditors will pay close attention to the controls and procedures surrounding an application. How else could they judge the effectiveness of the application’s controls? They have to understand how they fit into the wider business if they are to assess the impact and risk of weaknesses. Maybe the broader vision is something testers could work on?

Why not try cultivating your internal audit department? If they’re any good you could learn something. If they’re no good then they could learn a lot from you!


6 thoughts on “Testers are like auditors”

  1. Interesting & thought-provoking post, James.

    The type of auditing you describe here — software auditing — is essentially results-focused, like testing. The problem I see with many audit and quality assurance organizations (at least in North America) is that they focus almost exclusively on process and ignore results, thereby missing the point and often entirely bypassing the question of relative risks. The good ones will use their eyes and brains and base their questions on what they see and hear, rather than following checklists, but I fear they are rare.

  2. Thanks Fiona. This is a blog post rather than a serious, formal article, so I admit I’ve slightly over-egged the comparison to make my point and make people think. But basically I think it does hold true that there are fundamental similarities between good testing and good auditing.

    I carefully refrained from commenting on the relative balance of good and bad auditing, or results focussed versus process focussed auditing in your useful phrase. In my personal and rather limited experience it’s about 50/50. I hope that’s more accurate than your fear that good auditors are rare.

    It’s difficult to find out whether organisations are good or bad at auditing without spending time with them, or speaking to their people in a confidential setting. I’ve tried discussing this sort of thing on LinkedIn, but the organisations that do it badly don’t recognise it and so won’t admit it. The employees who do recognise the problem are unlikely to shoot their mouths off in public.

    As for your comment about North America, well that’s consistent with my experience, but I couldn’t argue that really confirms anything.

  3. James,
    I recently made a career shift from primarily software testing into a blended role that includes auditing.

    I have had similar experiences, and generally agree that your and Fiona’s observations are accurate. I’ve found that the auditing experiences have helped my testing, and vice versa. Risk-focused audits add value.

    I have also been surprised by how “self-inhibited” auditors are about discussing the practices and problems of the field. Software Testing is much more open to public debate.

  4. Thanks for that Griffin. It’s interesting what you say about auditors being self-inhibited. I recognise that. I think the psychology of IT audit is interesting.

    It’s an easy job to do if you’re not bothered about doing it well, but it’s a very difficult job to do well. When you’re inexperienced it’s a scary prospect to go in and review a technical area. Good judgement is crucial, because you need to understand clearly what you need to know, and what you don’t need to know.

    You have to be able to look senior professionals and managers in the eyes and say “I don’t know, tell me” without feeling embarrassed or threatened by your ignorance. You also have to be able to keep asking “why” and force people to address inconsistencies, ambiguities or evasions. It’s a bit like a courtroom cross-examination, but you can’t do it in an inquisitorial manner. Finally, the auditors can’t pretend they’ve a right to tell technical experts how to do their job. Trying that just invites humiliation and a loss of credibility.

    So it’s not surprising that auditors can be defensive or put on a show of bravado, or retreat into low value tick’n’bash process compliance auditing. I think many of them lack true confidence in what they’re doing, and how they’re doing it.

    I think it took me 2 years to feel comfortable in the role. Up till then every time I started an audit I worried I was heading out of my depth. After that I knew I was going out of my depth, but I knew how to handle it and I had the confidence to know it wouldn’t matter. Working as an auditor taught me a huge amount and helped me grow as a person. It gave me much greater self-confidence.

    I think IT audit has always had a problem that too many auditors have no practical IT experience. Some of them are just accountants who’ve been on a training course. It’s hard to make nuanced judgements if you don’t have a good understanding of what business IT involves, so you just fall back on “the process”.

  5. James I’ve only just seen this article – I was still on holiday when you posted it – but you raise many interesting points. Thank you.

    Rather than IT audit I have experience of management systems audit but I believe the principles still hold true. I now avoid approaching an audit with a checklist and instead take a copy of the process diagram from which I have gained an understanding of how the process was intended to work in the mind of the author. It is then that I probe to find out whether or not that is what is really happening. This has helped shape the way I test applications.

    Less experienced auditors and testers can benefit from the guidance a checklist or test script can give but they should be encouraged to go off on their own and explore to gain the tremendous job satisfaction that comes from a job well done – in which they and the business learn a lot about their systems and processes.

    Regards,

    Stephen

    • Thanks Stephen – I was really focussing on auditing existing applications, which is where the real overlap with testing lies. Other types of audit have less similarity, though there is still the crucial point that real testing and real auditing should both always involve intelligent investigation and judgement.

      Checklists are of limited value for auditing live applications. Each application is different and you have to understand what it is doing. A set of guidance about general points to look out for can be useful, but there’s no sensible scope for approaching the audit on a “yes/no” ticklist basis.

      Checklists are of far more value when you’re doing an audit of, for example, network security, an operating system, or the implementation and administration of a proprietary access control package. You’re only occasionally going to be expert in one of these before you start, so a checklist as part of an audit plan is very valuable. However, this is just guidance alerting you to what is important, sensitive and potentially risky. You still can’t accept yes/no answers. You do still have to keep asking “why” to understand what’s going on.

      So I agree that checklists are great when you’re inexperienced with the area under review, and the more standardised that area is the more likely it is that a checklist is useful.

      It’s maybe worth maintaining a distinction between a checklist for guidance, and a ticklist that can be reduced to an automated questionnaire with radio buttons for yes and no. That’s not auditing – or it shouldn’t be.
