Too smart for checklists, and a consultants’ war?

My post last week “Why do we think we’re different?” (about goal displacement, trained incapacity and functional stupidity) attracted an interesting comment that deserved a considered response. In the end that merited a new post, rather than a reply in the comments section. Here is the comment, followed by my response.

I don’t disagree with your basic premise nor your conclusion. Your arguments are logical and elegant, but I wonder if people hear the anti-ISO 29119 group as saying “Don’t bother having process,” and “I’m too smart to need a checklist” (ala Gawande’s “Checklist Manifesto”).

Worse yet, the people arguing against the standard the loudest are not the common testers but those already heavily active in the community of testers. I described it as a “consultant’s war” because those who seem the most active in the community are often consultants.

As a people problem, it seems both sides have a great deal of apathy outside of perhaps 1/100th of the testing community. How do we change that? The best argument in the world will have no effect if most people are apathetic. This is the problem I struggle with, and not just around the standards but around testing as a career. I would love to see any insight you have around the problem, assuming you see it as a problem.

– JCD

JCD raises some interesting points and I am grateful for that.

I do need to keep thinking about the danger that people hear the Stop29119 campaigners as saying “don’t bother having process”. That is not the message I want to get across. I would phrase it as “beware of the dangers of prescriptive processes”, i.e. processes that spell out each step in detail. These may be required in some contexts, but not in software development or testing, where their value has been hugely overstated.

However, processes are required. The tricky problem is finding the sweet spot between providing sufficient guidance and ensuring appropriate standardisation on the one hand, and, on the other hand, going into excessive detail and preventing practitioners from using their judgement and initiative in a complex setting.

Similarly, checklists do have their place. They are important, but of limited value. They take you only so far. I certainly don’t decry them, but I do deplore excessive dependence on them at the expense, again, of judgement and initiative. I don’t disagree with the crux of Gawande’s book, though I do have reservations about the emphasis, or maybe just about the way the book has been promoted. I’m also doubtful about his definition of complexity.

I agree with what Gawande is saying here in The Checklist Manifesto.

It is common to misconceive how checklists function in complex lines of work. They are not comprehensive how-to guides, whether for building a skyscraper or getting a plane out of trouble. They are quick and simple tools aimed to buttress the skills of expert professionals. And by remaining swift and usable and resolutely modest, they are saving thousands upon thousands of lives.

I would quibble about building skyscrapers being a complex activity. I would call it highly complicated. I’d prefer to stick to the Cynefin definition of complexity, which is reserved for situations where there is no obvious causality; cause and effect are obvious only in hindsight. Nevertheless, I do like Gawande’s phrase that checklists are “quick and simple tools to buttress the skills of expert professionals”.

You’re right about this being a consultants’ war, but I don’t see how it could be anything else. It’s hard to get new ideas through to people who are doing the real work, grafting away day to day. Some testers are certainly apathetic, but the bigger issues are time and priorities. Campaigning against ISO 29119 is important, but it’s unlikely to be urgent for most people.

When I was a permanent test manager I didn’t have the time to lift my head and take part in these debates. Working as a contract test manager isn’t much better, though at least it’s possible, in theory, to control time between contracts.

I think that places some responsibility on those who can campaign to do so. If enough people do it for long enough then those testers who are less publicly visible, and lay people, are more likely to stumble across arguments that will make them question their assumptions. The lack of public defence of the standard from ISO has meant that anyone searching for information about ISO 29119 is now very likely to come across serious arguments against the standard. That might tilt the balance against general acceptance of the standard as The Standard. A few more people might join the campaign. Some organisations that might have adopted the standard may think better of it. It could all have a cumulative effect.

I’m working on the idea of a counter to ISO 29119. It wouldn’t be an attempted rival to ISO 29119. Nor would it be an anti-standard. Many organisations will adopt the standard because they want to seem responsible. That would be doing the wrong thing for the right reason.

What I am thinking of doing is documenting the links between the requirements of the Institute of Internal Auditors and the Information Systems Audit & Control Association (and maybe other regulatory sources) and a credible alternative to the traditional, document-heavy approach, showing how it is possible to be entirely responsible and accountable without going down the ISO 29119 dead end.

It’s a matter of explaining that there are viable and valid choices, something that is in danger of being hidden from public view by the arrival of an ISO standard intended to cover the whole of software testing. Checklists could well play a role in this.

I smiled at the suggestion that opponents of ISO 29119 might believe they’re too smart to need a checklist. I can see why some people might think that. However, I, like most ISO 29119 opponents I suspect, am acutely aware of the limits of my knowledge and competence. Excessive reliance on checklists and standards creates the illusion, the delusion, that testers are more competent, professional and effective than they really are. I prefer a spot of realistic humility. Buttressing skill is something I can appreciate; supplanting skill is another matter altogether, and ISO 29119 goes too far in that direction.

Why do we think we’re different?

The longer my career lasts the more aware I am of the importance of Gerald Weinberg’s Second Law of Consulting (from his book “The Secrets of Consulting”), “No matter what the problem is, it’s always a people problem.”

The first glimmer of light that illuminated this truth was when I came across the term “goal displacement” and reflected on how many times I had seen it in action. People are given goals that aren’t quite aligned with what their work should deliver. They focus on the goals, not the real work. This isn’t just an incidental feature of working life, however. It is deeply engrained in our psychological make-up. There is a long history of academic work to explain this phenomenon.

Focal and subsidiary awareness

I’ll start with Michael Polanyi. In his book “Personal Knowledge”, Polanyi makes a distinction between focal and subsidiary awareness. Focal awareness is what we consciously think about. Subsidiary awareness is like tacit knowledge. We don’t think about the mechanics of holding a hammer to drive in a nail. We think about what we are trying to achieve. If we try to focus on the mechanics of holding the hammer correctly, and consciously aim for the nail, then we are far more likely to make a painful mess of things. Focal and subsidiary awareness are therefore, in a sense, mutually exclusive. As Polanyi puts it:

“If a pianist shifts his attention from the piece he is playing to the observation of what he is doing with his fingers while he is playing it, he gets confused and may have to stop. This happens generally if we switch our focal attention to particulars of which we had previously been aware only in their subsidiary role.

Our attention can hold only one focus at a time… it would hence be contradictory to be both subsidiarily and focally aware of the same particulars at the same time.”

Does this apply in organisational life too, as well as to musicians and carpenters performing skilled physical activities? I think it does. We have often focused too closely on the process of software development and of testing and lost sight of the end we are trying to reach. Formal processes, prescriptive methods and standards encourage exactly that sort of misplaced focus.

Thorstein Veblen and trained incapacity

This problem of misplaced focus has long been observed by organisational psychologists and sociologists. A full century ago, in 1914, Thorstein Veblen identified the problem of trained incapacity.

People who are trained in specific skills tend to lose the ability to adapt. Their response has worked in the past, and they apply it regardless thereafter. They focus on responding in the way they have been trained, and cannot see that the circumstances require a different response. Their training has rendered them incapable of doing the job effectively unless it fits their mental framework. This is “trained incapacity”. They have been trained to be useless. The phenomenon applies to all workers, but especially to managers.

However, the problem that Veblen identified was worse than that. Highly specialised training and education meant that people were increasingly becoming expert in narrower fields and their areas of ignorance were increasing. When they entered the active workforce their jobs required wider skills and knowledge than their education had given them, but they were unable to contribute effectively to those other areas. They focussed on what they knew. Veblen was especially concerned about business school graduates.

“[These schools’] specialization on commerce is like other specializations in that it draws off attention and interest from other lines than those in which the specialization falls, thereby widening the candidate’s field of ignorance while it intensifies his effectiveness within his specialty. The effect, as touches the community’s interest in the matter, should be an enhancement of the candidate’s proficiency in all the futile ways and means of salesmanship and “conspiracy in restraint of trade” together with a heightened incapacity and ignorance bearing on such work as is of material use.”

A way of not seeing

In 1935 Kenneth Burke built on Veblen’s work, arguing that trained incapacity was:

“that state of affairs whereby one’s very abilities can function as blindnesses.”

People can focus on the means or the ends, not both, and their specific training in prescriptive methods or processes leads them to focus on the means. They do not even see what they are missing.

“A way of seeing is also a way of not seeing - a focus on object ‘A’ involves a neglect of object ‘B’.”

Robert Merton made the point more explicitly in 1957 when he introduced the concept of goal displacement.

“Adherence to the rules… becomes an end in itself… Formalism, even ritualism, ensues with an unchallenged insistence upon punctilious adherence to formalised procedures. This may be exaggerated to the point where primary concern with conformity to the rules interferes with the achievement of the purposes of the organization.”

Why do we think we are different?

So the problem had been recognised before software development was even in its infancy. How did it come to be such a pervasive problem in our profession? What possible reason could there be for thinking that we are different in software development and testing? Why would we think we are immune from these problems?

I explored some of the reasons earlier this year in Teddy Bear Methods. Software development is difficult and stressful. It is tempting to seek refuge in neat, ordered structures. In that article I talked about social defences, transitional objects and how slavishly following prescriptive processes and methods can become a fetish.

Functional stupidity

However, there is an over-arching explanation: functional stupidity. This was identified by Alvesson and Spicer in a fascinating paper in the Journal of Management Studies in 2012, “A Stupidity-Based Theory of Organizations” (PDF, opens in new tab).

The concept is rather more nuanced than the headline grabbing name suggests. It is no glib piece of cod psychology; it is soundly rooted in organisational psychology and sociology and in management theory.

Organisations can function more smoothly if employees suspend their critical thinking faculties. It can actually be beneficial if they do not question the validity of management directives, if they don’t think about whether the actions they have to take are justified, and if they don’t waste cognitive effort thinking about whether their work is aligned with the objectives of the organisation.

In large organisations the goal towards which many employees are working is effectively the smooth running of the bureaucracy. Functional stupidity does help things run smoothly. It can be beneficial for compliant employees too. The people who thrive are those who play the game by the rules and don’t question whether the “game” is actually aligned with the objectives of the organisation.

However, functional stupidity is beneficial mainly in organisations operating in a fast moving, relatively well understood environment. In those cases fast and efficient action and reaction may be more important than reflective analysis, though even there it carries serious dangers.

On the other hand, if the environment is less well understood and there is a need to reflect and learn, then functional stupidity can be disastrous. Apart from a failure to learn from, or even detect, mistakes, functional stupidity can commit the organisation to damaging initiatives, while corroding employee morale and effectiveness. Even if the organisation as a whole might be suited to functional stupidity, there are roles where it is entirely inappropriate.

Software testing is exactly such a role. Testers must question, analyse, reflect and learn. These are all activities that functional stupidity discourages.

Management fads, lack of evidence and ISO 29119

Alvesson and Spicer refer to a further, damaging effect of functional stupidity that has particular relevance to the debate about ISO 29119. They argue that managers are prone to getting caught up in enthusiasm for unproven initiatives.

“Most managerial practices are adopted on the basis of faulty reasoning, accepted wisdom, and complete lack of evidence.

…organizations will often adopt new practices with few robust reasons beyond the fact that they make the company ‘look good’ or that ‘others are doing it’… Refraining from asking for justification beyond managerial edict, tradition or fashion, is a key aspect of functional stupidity.”

Does ISO 29119 fall into this category? Dr Stuart Reid, convener of the ISO 29119 Working Group, is a surprising source of compelling evidence to support the claim. He has conceded that there is no evidence of the standard’s efficacy and that the people who buy testing services do not understand what they are buying (see the slides from his presentation at ExpoQA14 in Madrid in May, with my added emphasis).

Yet he hopes that they will write contracts mandating the use of the standard (PDF, opens in new tab).

This standard will impose on testers working practices that are only loosely aligned with the real objective of testing. It will provide fertile breeding grounds for goal displacement. Will functional stupidity ease the way for ISO 29119? I fear the worst.

I asked why we think we are different in software development and testing. The question is poorly framed. It’s not that we think we are different. The problem is that we, as a global testing community, are not thinking enough. Far too many of us are simply going with the flow. Thousands have unthinkingly adopted functional stupidity as a career move. ISO 29119? That will do nicely.

“No matter what the problem is, it’s always a people problem.”

Any organisational initiative, or new methodology, or new standard that ignores that rule will not work. The lessons have been there for decades. We only have to look for them.