As the years have passed I’ve noticed something interesting about the way my knowledge of IT and my attitudes have changed.
When I started out I was painfully aware that I knew nothing, and that there was a whole vast universe of knowledge out there. The more I learnt, the more I realised there was to learn.
As I became more competent I started to get cocky about the little areas that I was proficient in. I could do clever things with IBM mainframe utilities, with VM/CMS execs, and with SAS routines and macros. I remained duly humble about the rest of IT.
When I started to get involved in big, conventional developments I was hugely impressed by the size and complexity of these applications. I quickly realised that such developments were enormously difficult and that there was the constant danger of screwing up.
As a humble novice I largely accepted the prevailing wisdom of the times, and tried to understand it better. Yet, there were some things I didn’t really get, and which I had to accept largely on trust, assuming that greater experience would give me a deeper, truer understanding.
As the years passed I gradually realised that my youthful puzzlement and suspicion had been well justified. I’d actually known more than I realised, but I’d been the little boy who didn’t cry out that the emperor was wearing no clothes!
What were the things I didn’t get?
I remember looking on in wonder at the squads of test analysts who were working long hours churning out scripts by the hundreds, thousands even. They were in every weekend, months before the start of testing.
How could they be confident that the application would actually be delivered to match the specifications they were working from? What about the changes? How could they anticipate everything before they saw the application running? If they missed something from their spec would they really improvise appropriately when they executed the testing?
It seemed strange to me, but I accepted that these were the experts. It might look hugely inefficient, but obviously that had to be the most effective way to ensure that testing was as thorough as it had to be. No!
Getting the requirements right
That leads on to the next thing that left me uncomfortable as a beginner. The process of deriving the requirements was horribly prolonged and laborious. Even assuming that the business didn’t change, it hardly seemed plausible that the analysts could get all the enormous detail right and documented precisely before the programming could start. But then what did I know? The analysts were obviously more experienced, patient and painstaking than I was.
Well, they were, but that didn’t mean they were getting it right. It was only years later that I realised that the problem wasn’t that we were deriving the requirements badly. The problem was that you cannot get the requirements right first time, and the users cannot specify requirements until they have seen what is possible.
The process of building the application changes the requirements, and good development practices have to accept that, rather than pretending that the users are feckless and undisciplined.
Code it all, then test it all
When I was developing small scale applications in tiny teams we didn’t follow formal standards. We worked the way that suited us.
My preferred technique was to work out what I had to do, splitting the work into discrete chunks. Then I’d think through how I’d get each bit to work. I wasn’t thinking “test before I code”. It was just that it seemed natural to know before I started coding how I’d be able to satisfy myself the code was working; to make sure I’d code something I knew I’d be happy with, rather than something that would leave me wondering about the quality.
So I’d build a bit, test a bit, and gradually extend my programs into something useful, then the programs into the full application. I never thought of it as testing. It was just easy coding.
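That build-a-bit, test-a-bit rhythm might look something like this today. This is purely my illustration, in Python rather than anything I was using back then, with made-up record formats and function names; the point is only that each small chunk is checked before the next is layered on top of it.

```python
# A sketch of "build a bit, test a bit": write a small piece,
# satisfy yourself it works, then extend it into something useful.

def parse_record(line):
    """First chunk: split a fixed-format input line into named fields."""
    return {"id": line[0:4].strip(), "amount": line[4:12].strip()}

# Check the first chunk before going any further.
assert parse_record("0001  123.45") == {"id": "0001", "amount": "123.45"}

def total_amount(lines):
    """Next chunk: build on the first to sum amounts across records."""
    return sum(float(parse_record(line)["amount"]) for line in lines)

# Check the extension, then move on to the next chunk.
assert total_amount(["0001  123.45", "0002    1.55"]) == 125.0
```

Nothing here is formal testing; it is just coding in steps small enough that you always know the last step worked.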
I could see that the developers of the big applications didn’t do it that way. They’d churn out whole programs before testing, zip through a few unit tests and bolt things together into an integration test. I just accepted that my way was only possible with the small apps. The big boys had to do it differently.
Incidentally, my training had left me deeply committed to structure and code quality. After writing a program I’d always go back through it to remove redundant code, ensuring that common processing was pulled together into common routines and modules. Elegance was everything, spaghetti code was an abomination. It took a while for me to twig that refactoring wasn’t some fancy new trick. It was just good coding.
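To make that concrete, here is a small sketch (my own hypothetical example, not code from the period) of the kind of tidying I mean: the same validation logic had been written out inline in two places, and pulling it into one common routine removes the redundancy without changing behaviour.

```python
# Refactoring as "just good coding": duplicated processing is
# pulled into a single shared routine; behaviour is unchanged.

def validate(record):
    """Common validation, extracted into one routine."""
    if not record.get("id") or record.get("qty", 0) <= 0:
        raise ValueError("bad record")

def process_order(order):
    validate(order)   # this check was previously duplicated inline here...
    return order["qty"]

def process_return(ret):
    validate(ret)     # ...and again here, before the tidy-up
    return -ret["qty"]
```

The callers behave exactly as before; the only difference is that the common logic now lives in one place.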
Structured design (boo!)
I was sent on various courses on structured techniques. Structured analysis made sense, more or less. At least, I could follow the internal logic of the whole thing, though I wasn’t entirely convinced about whether it really worked in practice.
The thing that I really didn’t get, however, was how you derived the outline design from the requirements. It seemed suspiciously like you relied on your own personal judgement, guesswork even. The design didn’t seem to flow logically and inevitably from the precision and detail of the structured analysis. I concluded that maybe I wasn’t as smart as I’d thought, and that all would become clearer in time. I might be winging it using intuition, but the experts were doing it properly.
It never did become clearer, and I was amazed to discover years later that structured techniques had been rumbled. The founders of structured techniques really had been relying on intuition. There was no empirical basis to their work, and there was a genuine discontinuity between the requirements and the design. See my article on testing and standards from the December 2009 edition of Testing Experience magazine for an explanation, and some references. The relevant part is “IEEE829 & the motivation behind documentation”.
The lesson from this is definitely not that I was an unrecognised genius. Nor is the lesson that youngsters probably know better than the old hands who’ve got set in their ways. Bright youngsters have a lot to offer by asking penetrating questions, and forcing experienced staff to explain and justify why things are done the way they are. But that’s not my point.
The real point is that when I was working in these small teams, without heavyweight processes and standards, we defaulted to a natural way of developing software. The Agile movement goes with the grain of such natural development practices. Structured techniques chased after the chimera of software development as a variant of civil engineering. In reality they were no more than a mutant that was neither engineering nor good software development.
I’m not dismissing standards and processes, but they should be guidance, rather than rules. They should promote good, natural practices rather than the rigid, sclerotic approaches that attempted to resolve problems by churning out more documents. It’s a shame that I had to figure things out for myself as a beginner, rather than being guided by useful standards.
It’s been great to see the way that the more organic, natural approaches such as Agile and Exploratory Testing have energised the industry. They might not always be right or appropriate, but they do reflect deep truths about the right way to produce good software. Any plausible approach has to take these truths on board. Ignoring them guarantees waste, repeated failure, demoralised developers and cynical users. Twenty years ago, that was pretty much business as usual!
Have you ever instinctively known better as a beginner than you realised? Did you have the nerve to criticise the emperor’s clothing?