Testing inside the box?

The EuroSTAR online software testing summit took place yesterday. I missed the morning, but there were a few things that leapt out at me in the afternoon.

When Fiona Charles was talking about test strategies she said, “we always have a model, whether conscious or unconscious”.

Keith Klain later talked about the dangers of testers trying to provide confidence rather than information. Confidence should never be the goal of testing. If we try to instil confidence then we are likely to be reinforcing biases and invalid assumptions.

In between Fiona’s and Keith’s talks I came across this quote from the Economist about the reasons for the 2007/8 financial crash.

With half a decade’s hindsight, it is clear the crisis had multiple causes. The most obvious is the financiers themselves – especially the irrationally exuberant Anglo-Saxon sort, who claimed to have found a way to banish risk when in fact they had simply lost track of it.

These three points resonated with me because over the last day or so I have been thinking about the relationship between risk, our view of the world and our inbuilt biases as part of my upcoming EuroSTAR tutorial.

During the tutorial I want to discuss a topic that is fundamental to the work that testers and auditors do. We might not think much about it but our attitudes towards ontology and epistemology shape everything that we do.

Don’t switch off! I’m only going to skim the surface. I won’t be getting heavy.

Lessons from social science

Ontology means looking at whether things exist, and in what form. Epistemology is about how and whether we can know about anything. So “is this real?” is an ontological question, and “how do I know that it is real, and what can I know about it?” are epistemological questions.

Testing and auditing are both forms of exploration and research. We have to understand our mindset before we go off exploring. Our attitudes and our biases will shape what we go looking for. If we don’t think about them beforehand we will simply go and find what we were looking for, or interpret whatever we do find to suit our preconceptions. We won’t bother to look for anything else. We will not even be aware that there is anything else to find.

Both testing and auditing have been plagued by an unspoken, unrecognised devotion to a positivist outlook. This makes the ontological assumption that we are working in a realm of unambiguous, objective facts, and the epistemological assumption that we can observe, measure and conduct objective experiments (audits or tests) on these facts. Crucially, this mindset implicitly assumes that we can know all that is relevant.

If you have this mindset then reality looks like the subject of a scientific experiment. We can plan, conduct and report on our test or audit as if it were an experiment; the subject is under our control, we manipulate the inputs, check the output against our theory and reach a clear, objective conclusion.

If we buy into the positivist attitude then we can easily accept the validity of detailed test scripts and audit by checklist. We’ve defined our reality and we run through binary checks to confirm that all is as it should be. Testing standards, CMMI maturity levels and certification all seem natural progressions towards ever greater control and order.

We don’t need to worry about the unknown, because our careful planning and meticulous documentation have ruled that out. All we need to do is run through our test plan and we will have eliminated uncertainty. We can confidently state our conclusions and give our stakeholders a nice warm glow of confidence.
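To make the "binary check" mindset concrete, here is a minimal sketch (entirely hypothetical, not drawn from any real test suite) of a scripted check against a toy function. The script confirms exactly what was planned and nothing else, which is the point: the unknowns sit outside the plan.

```python
def transfer(balance, amount):
    """Toy transfer: deduct amount from balance (hypothetical example)."""
    return balance - amount

# The scripted check: one planned input, one expected output, pass/fail.
expected = 70
actual = transfer(100, 30)
assert actual == expected  # passes, so "all is as it should be"

# But the script never asks unscripted questions. What about a
# negative amount, or an amount larger than the balance? Those
# risks stay invisible because the plan never looked for them.
print(transfer(100, -30))   # a "deposit" smuggled through transfer
print(transfer(100, 500))   # a negative balance, unquestioned
```

The check passes, the report shows green, and the tester who never steps outside the script has no reason to suspect the function allows overdrafts or negative transfers.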

Reality in business and software development is nothing like that. If we adopt an uncritical positivist approach we have just climbed into a mental box to ensure that we cannot see what is really out there. We have defined our testing universe as being that which is within our comprehension and reach. We have redefined risk as being that which is within our control and which we can manage and remove.

Getting out of the box

Perhaps the testers, and certainly the auditors, at the banks that crashed failed to realise they were in a world in which risk had not been banished. It was still out there, bigger and scarier than they could ever have imagined, but they had chosen to adopt a mindset that kept them in a neat and tidy box.

When you forget about the philosophical and social science jargon it all comes down to whether you want to work in the box, pretending the world suits your worldview, or get out of the box and help your stakeholders understand scary, messy reality.

Social science researchers know the pitfalls of unconsciously assuming a worldview without realising how it shapes our investigation. Auditors are starting to realise this and discuss big questions about what they can know and how they can know it. Testers? Well, not all of them, not yet, but we all need to start thinking along those lines. That’s part of the message I hope to get across at Gothenburg and elsewhere over the coming months.
