This article follows on from yesterday’s, “Best practice, or just okay practice?”.
One of the most superficially impressive justifications for “best practice” is that it protects the interests of clients and users. I really don’t accept that “best practice” is the best, or even an effective, way to do that. This isn’t just a theoretical position. Nor have I been swayed by the Rapid Software Testing people and then gone looking for confirmation in practice. I agree with most of what they say because it matches, and helps explain, the experience I had already acquired.
I lost faith in “best practice” because of what I saw out in the field. The objections in my previous piece may have been framed in academic terms, but they were very real. Let me tell you about some of the experiences that have shaped my views.
I have seen customers getting a lousy service because the supplier was insisting on inappropriate “best practice”. The client staff who were responsible for the contract went along with that because it protected them if things went wrong. The attitude could be summed up as “OK, it turned out badly, but we were following best practice, so I guess we were just unlucky. No-one did anything wrong, nothing to learn, we’ll do it all the same way next time, we’re sure to be luckier”.
I have seen places where people were doing things that almost guaranteed failure because they believed it was better to adhere rigidly to “best practice” than make the calculated compromises, i.e. take intelligent risks, that were necessary to be successful. I’ve even seen cases where individual success or failure was judged more by adherence to “best practice” than by results. It was a far better career move to deliver late and lose money while following “best practice” than to please the client while taking a more selective approach.
I’ve seen supplier staff doing utterly pointless work that infuriated the client because the supplier was insisting on best practice that bore no relation to the risks facing the client. A drugs manufacturer who is subject to FDA regulation faces very different problems from a retailer selling consumer goods.
Trying to apply the same laborious “best practice” to both can stop the retailer moving fast enough in the market, introducing serious commercial risks that far outweigh the technology risks on which the IT supplier is focused.
That retailer client was exasperated because key supplier staff were tied up doing work the client thought was unnecessary and refused to pay for. Meanwhile, work that the client was desperate to start was being delayed. So the client couldn’t do commercially important work, and the supplier lost money doing work it couldn’t invoice to the client. The supplier staff also hated the unnecessary work because it was so tedious, and they openly doubted its value. But “best practice” was followed.
It was a lose-lose situation, but the supplier management felt they had to do it because they feared the consequences of neglecting “best practice”. Everyone was unhappy, everyone was frustrated. Everyone lost money.
Another troubling example of “best practice” causing problems was when a company needed to get a product to the market very quickly for commercial reasons. The internal developers had to build the supporting application to a brutally tight timetable.
There were arguments with the users, who were stalling over agreeing the detailed requirements. So the developers chose to ignore “best practice” and developed the application in full without the requirements being signed off. Was that best practice? Of course not. Naturally there were problems, and the user management tried to blame the developers for not adhering to basic best practice.
The developers’ management argued that there was a pressing commercial need to implement the application and that the users had been playing a political game to try and ensure that they would not be blamed for the inevitable problems. The developers were very experienced with that business area and took the decision to press ahead and build the system while the users were wasting time.
Would the developers have been right to stand by “best practice” to protect their skins, even if it was damaging to the company? I don’t think so. That isn’t just a personal opinion. I had to give a professional opinion on the matter.
I was called in as an auditor to review the project. Everyone assumed I would kick the developers for not following “best practice”. Instead I praised them for doing a good job in unreasonable circumstances and I criticised the users for being obstructive. I thought the attitude of the users was lamentable, though I was careful with the wording of my report. They were cynical and devious, keen to blame everyone but themselves. They were relying on “best practices” to try and absolve themselves from any blame. My report was accepted by senior management and I think it contributed towards an improvement in working practices.
OK, that was an extreme case of irresponsible abuse of “best practice”, but it neatly highlights one of the flaws with the concept.
If commercial pressures, the needs of the users, the risk to the company and conventional “best practice” are inconsistent, then what gives way? I’m not saying that developers or testers should abandon all responsible, professional behaviour the moment someone senior shouts “jump”.
What I am saying is that the problems of IT are much more complicated and subtle, messy even, than can be resolved by stating that certain techniques or processes or standards are “best practices”. Contracts written by lawyers who cite “best practice” with no real understanding of what work might be required do not offer real protection to clients. They can even hurt the people they are meant to protect.
What is best for clients, and for users, is the best people available being allowed to do the best work they can. That is a trickier matter than writing a contract or defining a new standard. If you feel you really need to rely on contracts, or “best practice” defined by other people, you’re probably on a loser anyway.