Permission to think

This is the third post, or essay, in a series about how and why so many corporations became embroiled in a bureaucratic mess in which social and political skills are more important than competence.

In my first post “Sick bureaucracies and mere technicians” I talked about Edward Giblin’s analysis, back in the early 1980s, of the way senior managers had become detached from the real work of many corporations. Not only has this problem persisted, it has become far worse.

In my second post, “Digital Taylorism & the drive for standardisation”, I explained how globalisation and technical advances gave impetus to digital Taylorism and the mass standardisation of knowledge work. It was widely recognised that Taylorism damaged creativity, a particularly serious concern with knowledge work. However, that concern was largely ignored, swamped by the trends I will discuss here.

The problems of digital Taylorism and excessive standardisation were exacerbated by an unhealthy veneration of the most talented employees (without any coherent explanation of what “talent” means in the corporate context), and a heightened regard for social skills at the expense of technical experience and competence. That brings us back to the concerns of Giblin in the early 1980s.

10,000 times more valuable

Corporations started to believe in the mantra that talent is rare and that it is the key to success. A quote attributed to Bill Gates takes the point to an extreme.

A great lathe operator commands several times the wage of an average lathe operator, but a great writer of software code is worth 10,000 times the price of an average software writer.

Also, this is from “The Global Auction” (the book I credited in my last post for inspiring much of this series).

… being good is no longer good enough because much of the value within organizations is believed to be contributed by a minority of employees. John Chambers, CEO of Cisco, is reported as saying, “a world-class engineer with five peers can out-produce 200 regular engineers”. A Corporate Executive Board (CEB) study also found that the best computer programmers are at least 12 times as productive as the average.

These claims raise many questions about the meaning of terms such as “value”, “great”, “worth”, “world-class”, “out-produce” and “productive”. That’s before you get to questions about the evidence on which such claims are based. 10,000 times more valuable? 33 times (the figure implied by Chambers’ claim, once you divide 200 engineers by a team of six)? 12 times?

I was unable to track down any studies supporting these claims. However, Laurent Bossavit has persevered with the pursuit of similar claims about programmer productivity in his book “The Leprechauns of Software Engineering”. Laurent’s conclusion is that such claims are usually either anecdotal or unsubstantiated assertions repeated in secondary sources. In the few genuine studies, the evidence offered invariably failed to support the claim of huge variations in programmer productivity.

The War for Talent

The CEB study referred to above, claiming that the best programmers are at least 12 times as productive as the average, was reminiscent of one study that did define “productive”.

The top 3% of programmers produce 1,200% more lines of code than the average; the top 20% produce 320% more lines of code than the average.

I’m not sure there’s a more worthless and easily gamed measure of productivity than “lines of code”. No-one who knows anything about writing code would take it seriously as a measure of productivity. Any study that uses it deserves merciless ridicule. If you search for the quote you will find it appearing in many places, usually alongside the claims you can see in this image.
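To make the point concrete, here is a deliberately contrived sketch of my own (it is not taken from any of the studies mentioned above): two Python routines that do exactly the same job, where the second simply spreads the logic out to inflate the line count.

    # Two functionally identical routines that sum the even numbers in a list.
    # The padded version "produces" roughly five times as many lines of code.

    def sum_evens_terse(numbers):
        return sum(n for n in numbers if n % 2 == 0)

    def sum_evens_padded(numbers):
        total = 0
        for n in numbers:
            remainder = n % 2
            if remainder == 0:
                total = total + n
            else:
                pass
        return total

    # Both give the same answer for the same input.
    assert sum_evens_terse([1, 2, 3, 4]) == sum_evens_padded([1, 2, 3, 4]) == 6

Any metric that scores the second version higher than the first is rewarding padding, not productivity.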

I lifted that quote from a book about hiring “top talent”, “Topgrading: How Leading Companies Win by Hiring, Coaching and Keeping the Best People”. The original source for all these claims is the work that the famous consulting firm of McKinsey & Co carried out in the late 1990s. McKinsey’s efforts were turned into an influential, and much cited, book, “The War for Talent”.

The War for Talent argued that there are five “imperatives of talent management”: believing that corporate success is a product of the individual brilliance of “A players”, creating a culture in which superstars will flourish, doing whatever it takes to hire the superstars of the future, accelerating their development, and ruthlessly differentiating between the superstars and the other employees. The stars should be rewarded lavishly. Of the rest, the also-rans, the “B players”, should receive modest rewards, and the poorer performers, the “C players”, should be dismissed.

Not surprisingly, McKinsey’s imperatives have been controversial. The widespread influence of “The War for Talent” attracted unfavourable critical interest. Brown, Lauder and Ashton were politely sceptical about its merits in “The Global Auction”. However, they were more interested in its influence than its reliability.

Malcolm Gladwell savaged The War for Talent in a long, persuasive article in the New Yorker (which incidentally is well worth reading just for its discussion of anti-submarine tactics in the Second World War). Gladwell made much of the fact that a prize exemplar of the McKinsey approach was one of its most prominent clients, Enron. The freedom and reckless over-promotion that Enron awarded “the smartest guys in the room” were significant factors in Enron’s collapse. The thrust of Gladwell’s argument resonated with my experience of big corporations; when they thrive it is not because of untrammelled individual brilliance, but because they create the right environment for all their employees to work together effectively.

Andrew Munro of AM Azure Consulting (a British HR consultancy) went much further than Gladwell. In “What happened to The War For Talent exemplars?” (PDF, opens in new tab) he analysed the companies cited in The War for Talent and their subsequent fortunes. Not only did they seem to have been selected for no reason other than being McKinsey clients; the more praise they received in the book for their “talent management”, the less likely they were to succeed over the following decade.

Munro went into considerable detail. To summarise, he argued that the McKinsey authors started with flawed premises, adopted a dodgy method, went looking for confirmation of their preconceptions, then argued that their findings were best practice and generally applicable when in reality they were the opposite.

The five imperatives don’t even seem applicable to the original sample of U.S. firms. Not only has this approach failed to work as a generic strategy; it looks like it may have had counter-productive consequences.

Again, as with the idea that tacit knowledge can be codified, the credibility of the War for Talent is perhaps of secondary importance. What really matters is the influence that it has had. Not even its association with the Enron disaster has tainted it. Nevertheless, it is worth stressing that the most poorly evidenced and indeed damaging strategies can be adopted enthusiastically if they suit powerful people who will personally benefit.

This takes us back to Giblin in the early 1980s. He argued that senior managers were increasingly detached from the real work of the organisation, which they disparaged, because they were more reliant on their social skills than on knowledge of what that real work entailed. As I shall show, the dubious War for Talent, in conjunction with digital Taylorism, made a serious problem worse.

Permission to think

A natural consequence of digital Taylorism and a lauding of the most “talented” employees is that corporations are likely to segment their workforce. In the Global Auction, Brown, Lauder and Ashton saw three types of knowledge worker emerging: developers, demonstrators, and drones.

Developers include the high potentials and top performers… They represent no more than 10–15 percent of an organisation’s workforce given “permission to think” and include senior researchers, managers, and professionals.

Demonstrators are assigned to implement or execute existing knowledge, procedures, or management techniques, often through the aid of software. Much of the knowledge used by consultants, managers, teachers, nurses, technicians, and so forth is standardised or prepackaged. Indeed, although demonstrator roles may include well-qualified people, much of the focus is on effective communication with colleagues and customers.

Drones are involved in monotonous work, and they are not expected to engage their brains. Many call center or data entry jobs are classic examples, where virtually everything that one utters to customers is pre-scripted in software packages. Many of these jobs are also highly mobile as they can be standardized and digitalized.

“Permission to think”? That is an incendiary phrase to bring into a discussion of knowledge work, especially when the authors claim that only 10-15 percent of employees would be allowed such a privilege. Nevertheless, Brown, Lauder and Ashton do argue their case convincingly. This nightmarish scenario follows naturally if corporations are increasingly run by managers who see themselves as a talented elite, and who are under pressure to cut costs by outsourcing and offshoring. That requires the work to be simplified (or at least made to appear simpler) and standardised – and that is going to apply to every activity that can be packaged up, regardless of whether its skilled practitioners actually need to think. Where would testers fit into this? Test managers might qualify as demonstrators at best. The rest would be drones.

Empathy over competence

I could have finished this essay on the depressing possibility of testers being denied permission to think. However, digital Taylorism has another feature, or result, that reinforces the trend, with worrying implications for good testing.

As corporations attempted to digitise more knowledge and package work up into standardised processes, the value of such knowledge and work diminished. Or rather, the value that corporations placed on the people with that knowledge and experience fell. Corporations have been attaching more value to strong social skills than to traditional technical skills. In The Global Auction, Brown, Lauder and Ashton quoted at length the head of global HR at a major bank.

If you are really going to allow people to work compressed hours, work from home, then work needs to be unitised and standardised; otherwise, it can’t be. And as we keep pace with careers, we want to change; we don’t want to stay in the same job for more than 2 years max. They want to move around, have different experiences, grow their skills base so they’re more marketable. So if you’re moving people regularly, they have to be easily able to move into another role. If it’s going to take 6 months to bring them up to speed, then the business is going to suffer. So you need to be able to step into a new role and function. And our approach to that is to deeply understand the profile that you need for the role — the person profile, not the skills profile. What does this person need to have in their profile? If we look at our branch network and the individuals working at the front line with our customers, what do we need there?

We need high-end empathy; we need people who can actually step into the customers’ shoes and understand what that feels like. We need people who enjoy solving problems… so now when we recruit, we look for that high-end empathy and look for that desire to solve problems, that desire to complete things in our profiles…. we can’t teach people to be more flexible, to be more empathetic… but we can teach them the basics of banking. We’ve got core products, core processes; we can teach that quite easily. So we are recruiting against more of the behavioural stuff and teaching the skills stuff, the hard knowledge that you need for the role.

Good social skills are always helpful; indeed, they are often vital. I don’t want to denigrate such skills or the people who are good at working with other people. However, there has to be a balance. Corporations require both sets of skills, and it worries me to see them focussing on one and dismissing the other. There is a paradox here. Staff must be more empathetic, but they have to use standardised processes that do their thinking for them; they can’t act in a way that recognises that clients have special, non-standard needs. Perhaps the unspoken idea is that good soft skills are needed to handle the clients who are getting a poor service?

I recognise this phenomenon. When I worked for a large services company I was sometimes in a position with a client where I lacked the specific skills and experience I should really have had. A manager once reassured me, “don’t worry – just use our intellectual capital, stay one day ahead of the client, and don’t let them see the material you’re relying on”. I had a reputation for giving clients a reassuring impression. We seemed to get away with it, but I wonder how good a job we were really doing.

If empathy is valued more highly than competence, that affects people throughout the corporation, regardless of how highly they are rated. Even those who are allowed to think are more likely to be hired on the basis of their social skills. They need never acquire deep technical skills or experience; what matters more is how well they fit in.

In their search for the most talented graduates, corporations focus on the elite universities. They say that this is the most cost-effective way of finding the best people. Effectively, they are outsourcing recruitment to universities, which tend to select from a relatively narrow social class. Lauren Rivera, in her 2015 book “Pedigree: How Elite Students Get Elite Jobs”, quotes a banker who explains the priorities in recruiting.

A lot of this job is attitude, not aptitude… Fit is really important. You know, you will see more of your co-workers than your wife, your kids, your friends, and even your family. So you can be the smartest guy ever, but I don’t care. I need to be comfortable working every day with you, then getting stuck in an airport with you, and then going for a beer after. You need chemistry. Not only that the person is smart, but that you like him.

Rivera’s book is reviewed in informative detail by The Economist, in an article worth reading in its own right.

Unsurprisingly, recruiters tend to go for people like themselves, hiring people with “looking-glass merit”, as Rivera puts it. The result, in my opinion, is a self-perpetuating cadre of managerial clones. Managers who have benefited from the dubious hero-worship of the most talented, and who have built careers in an atmosphere of digital Taylorism, are unlikely to question a culture which they were hired to perpetuate, and which has subsequently enriched them.

Giblin revisited

In my first post in this series, “Sick bureaucracies and mere technicians”, I talked about Edward Giblin’s concern in 1981 that managers had become more preoccupied with running the corporate bureaucracy while losing technical competence, and respect for competence. I summarised his recommendations as follows.

Giblin had several suggestions about how organisations could improve matters. These focused on simplifying the management hierarchy and communication, re-establishing the link between managers and the real work and pushing more decision making further down the hierarchy. In particular, he advocated career development for managers that ensured they acquired a solid grounding in the corporation’s business before they moved into management.

Thirty-four years on, we have made no progress. The trend that concerned Giblin has been reinforced by wider developments, and there seems to be no prospect of their being reversed, at least in the foreseeable future. In my last post in this series I will discuss the implications for testing and try to offer a more optimistic conclusion.
