Thinking Fast and Slow for Soldiers

John Wilson

To cite this article: Wilson, John, “Thinking Fast and Slow for Soldiers”, Military Operations, Volume 2, Issue No. 2, Spring 2014, pages 26-29.

(Based on Daniel Kahneman’s book – Allen Lane, 2011)

Daniel Kahneman’s book, Thinking Fast and Slow, is a best seller, and deservedly so, although I wonder how many buyers have read it from beginning to end. It is very well written and mixes the abstract with the concrete, using many examples to make its points. But I won’t pretend to have fully understood it or to have deduced as much from it as I should have done. After all, Daniel Kahneman is a Nobel Prize-winning economist and a distinguished psychologist, and I am not. There is enough here, though, for me to show that this is a work that some soldiers should make the effort to read and apply.

Kahneman aims the book at the gossiper by the office water-cooler – or, in the British Army, the idle chat at coffee break, a vital feature of military life sadly neglected by the ignorant and the managerial. He wants to “enrich the vocabulary that people use when they talk about the judgments and choices of others, the company’s new policies, or a colleague’s investment decisions”. “…It is much easier, as well as far more enjoyable, to identify the mistakes of others than to recognize our own.” The book is in five parts:

Part 1 – presents the basic elements of a two systems approach to judgment and choice.

Part 2 – updates the study of judgment heuristics.

Part 3 – describes the difficulties of statistical thinking.

Part 4 – is a conversation with the discipline of economics on the nature of decision-making and on the assumption that economic agents are rational.

Part 5 – describes a distinction between two selves – the remembering self and the experiencing self, which do not have the same interests.

From this remarkable book I will draw some observations that seem to me to be helpful for soldiers: soldiers as tacticians, as leaders, as trainers, as organizers and administrators, and as project managers.

At the centre of his work is the idea of Systems 1 and 2. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice and concentration. Kahneman stresses that these two systems do not really exist in the brain or anywhere else. “System 1 does X” is a shortcut for “X occurs automatically”, and “System 2 is mobilized to do Y” is a shortcut for “arousal increases, pupils dilate, attention is focused, and Y is performed”. When you drive on the motorway and steer a gentle curve, you use System 1: you can continue a conversation or listen to the radio at the same time. When you negotiate the Hanger Lane gyratory on the A40 in West London, you invoke System 2: you concentrate, you calculate, the radio may still be on but you are not hearing it, and you stop talking.

The Appreciation or Estimate

Dorman-Smith’s appreciation for Auchinleck’s Alamein battle was the product of System 2: clear and analytical. MacArthur’s decision to land at Inchon started with System 1. A soldier taught the combat estimate is told to invoke System 2; in practice the experienced junior leader will use mostly System 1 – he can see the tactical solution because he recognizes it.

Kahneman worked with Gary Klein to investigate how fireground commanders could make good decisions without comparing options – which we call “Courses Open”. “The initial hypothesis was that commanders would restrict their analysis to only a pair of options…in fact, the commanders generated only a single option, and that was all they needed. They could draw on the repertoire of patterns that they had compiled during more than a decade of real and virtual experience to identify a plausible option, which they considered first. They evaluated this option by mentally simulating it to see if it would work in the situation they were facing… If the course of action they were considering seemed appropriate they would implement it. If it had shortcomings they would modify it. If they could not easily modify it, they would turn to the next most plausible option and run through the same procedure until an acceptable course of action was found”.

Klein called this recognition-primed decision. In case you are wondering, the fireground commanders’ approach was sound. And it makes sense if I quote Herbert Simon’s definition of intuition: “The situation has provided a cue; this cue has given the expert access to information stored in the memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition”.
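
For the programmatically minded, the loop Klein describes can be sketched in a few lines. This is my own illustration, not Klein’s formalism; the three callables are hypothetical stand-ins for the expert’s pattern recognition, mental simulation and adaptation:

```python
def recognition_primed_decision(situation, recognise_options, simulate, modify):
    """Sketch of the recognition-primed decision loop: options arrive one at
    a time in order of plausibility and are evaluated serially, never
    compared side by side."""
    for option in recognise_options(situation):     # most plausible first
        shortcomings = simulate(option, situation)  # mental simulation
        if not shortcomings:
            return option                  # seems appropriate: implement it
        patched = modify(option, shortcomings)      # try to fix the flaws
        if patched is not None:
            return patched                 # workable after modification
    return None                            # no acceptable course found
```

The essential point the sketch makes is the absence of a “Courses Open” comparison: evaluation is serial, and the first option that survives mental simulation wins.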

Getting to this level is a skill; it takes time and application. It is not a single skill but the acquisition of many mini-skills. And for these intuitive skills to be valid there are two basic conditions:

  • An environment that is sufficiently regular to be predictable.
  • An opportunity to learn these regularities through prolonged practice.

The obvious objection to the use of intuition in the tactical battle is the unpredictability of the environment. Yet recognition still plays a huge part. Commanders develop a feel, they see patterns, they become aware of the dogs that don’t bark, and they know their own forces well and may know the ground well. They also know what they can and cannot do – especially the latter. The challenge for armies is to develop their commanders through regular practice.

At Goose Green (Falklands War, 1982), the commanding officer of 2 PARA did his combat estimate (System 2) and produced a 7-phase plan. His thinking was in line with current British Army practice. However, at least one of his more experienced company commanders would probably have opted for the simple battle plan – an advance to contact with one company up – with a high element of System 1. The battalion won its battle because of the quality of its fighting men, and despite the plan. British Army commanding officers do not undergo the type of preparation that submarine commanders do, where swift and accurate execution is demanded and where failure on test is punished. As an army we underestimate the value of repetition and practice. We regularly declaim that training is the opportunity to make mistakes and learn; sadly, we are unable consistently to distinguish between practice and testing.

I know the traditional idea is that, after a spell at unit level, officers alternate between staff and command tours, but there is still a general approach of learning on the job and absorbing tactical nous by osmosis rather than through systematic instruction and practice. Whilst operations in Afghanistan and Iraq have concentrated the mind, they are theatre-focused. There is no sign of regular and demanding practice of tactical exercises at all levels. Yet the finding of those who study intuition is that valid skill comes only from dedicated and systematic practice over time.

Statistics

Kahneman has a neat example of statistical thinking from his time with the Israeli Defence Forces (IDF). He gave a speech to the Israeli Air Force on the principle that, in skill training, rewarding improvement is superior to punishing mistakes. A seasoned instructor pointed out that many times he had praised cadets for a skilful manoeuvre only for them invariably to perform more poorly the next time, and that when he had screamed at a trainee for a bad manoeuvre, in general the trainee did better the next time. The instructor’s observation was astute but his inference was wrong. What he had observed was regression to the mean, which in that case was due to random fluctuations in the quality of performance.
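
The effect is easy to demonstrate. In the minimal simulation below – my own, with invented numbers – every flight is the cadet’s fixed skill plus random noise. Flights that follow a praised (unusually good) sortie look worse, and flights that follow a criticised (unusually bad) sortie look better, even though the feedback changes nothing:

```python
import random
from statistics import fmean

random.seed(1)
skill = 0.0                                  # true skill never changes
flights = [skill + random.gauss(0, 1) for _ in range(10_000)]

# Pair each flight with the one that follows it.
after_praise = [b for a, b in zip(flights, flights[1:]) if a > 1.0]
after_rebuke = [b for a, b in zip(flights, flights[1:]) if a < -1.0]

print(f"mean after a praised flight:    {fmean(after_praise):+.2f}")
print(f"mean after a criticised flight: {fmean(after_rebuke):+.2f}")
```

Both means sit near the cadet’s true skill of zero: the apparent decline after praise and the apparent improvement after a rebuke are pure regression to the mean.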

Understanding statistics is an essential part of a staff officer’s job; without it s/he cannot assess risk, let alone provide the data for decision-making. There is much more in Kahneman’s book that can be usefully gleaned.

Selection

Intriguingly, Kahneman was part of a group that assessed officer candidates, and the methods the IDF used were those employed by the British Army: observation of command tasks – obstacles that required a team to solve and execute without a nominated leader. The observation team assiduously recorded their findings, which showed to their satisfaction how well they understood the candidates and their suitability as leaders. Their impressions of how well each soldier had performed were generally coherent and clear, and their formal predictions were just as definite. They rarely experienced doubts and were willing to declare, “that fellow is mediocre, this one will never make it, that one is a star”. Yet they knew with certainty that their predictions were largely useless: the feedback from the officer training school was that their ability to predict performance there was negligible. Even knowing that, when the next batch of candidates arrived, assessment began again, spirits lifted, and again it seemed clear that the candidates’ true natures were revealed. Kahneman describes it as the illusion of validity.

Now, in case you think we know so much better 50 years later, let me give you some recent figures for entry to Sandhurst. Of one hundred officer cadets (O/Cdts) commissioning from the Royal Military Academy Sandhurst, 28 had been identified by the Army Officer Selection Board (AOSB) as carrying some risk into the commissioning course. By final order of merit they were distributed as follows:

  • Top third: 7 of the 28 at-risk O/Cdts.
  • Middle third: 12 of the 28 at-risk O/Cdts.
  • Bottom third: 9 of the 28 at-risk O/Cdts.

If the AOSB were spot on, then all those at risk would have been in the bottom third at the end of the course. In fact, as a group, they did better than those whom AOSB had ranked above them as carrying no risk.
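
For what it is worth, a quick check of my own on the figures above shows that the 7/12/9 split is statistically indistinguishable from the 28 at-risk cadets landing in the three thirds at random:

```python
from scipy.stats import chisquare

observed = [7, 12, 9]          # at-risk O/Cdts by final-order third
stat, p = chisquare(observed)  # default expectation: uniform, 28/3 per third
print(f"chi-square = {stat:.2f}, p = {p:.2f}")  # ~1.36, p ~0.51
# p is far above any conventional threshold: the AOSB risk flag
# carries no detectable signal about final position on the course.
```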

Planning and Projects

Kahneman describes a sobering event in his professional life. He persuaded the Israeli Ministry of Education to allow him to develop a project to write a curriculum for teaching decision-making and judgment in high schools. He created a team, and after a year of cogitating they had produced a detailed outline of the syllabus, written a couple of chapters and trialled some lessons. They thought they were making good progress. He then held a meeting to discuss progress and assess the task.

He started the meeting in his own recommended way, which is to ask each person to write down their position very briefly, and then put the answers up on a board. This, he says, is how meetings should be conducted: a general opening discussion merely allows some to dominate, and other views may never be heard. Sounds like a very sharp idea to me. The estimates for the length of the project centred around 2 years. At the meeting was an expert on curriculum development, Seymour Fox. Kahneman asked him whether he knew of other teams who had tried to bring in a curriculum on a new subject and how they had fared. Fox said that other teams had taken 7 years, and with only a 40% success rate. On further questioning it became clear that Kahneman’s team was no better equipped than these others. They were aghast, but carried on as if nothing had happened. The project eventually took 8 years, by which time the ministry had lost interest, and the text was never used.

Kahneman and his team thought they had a well-developed scheme, and they were wrong. He learned three lessons:

  • There are two views of forecasting: the inside view (the team’s own) and the outside view (Seymour Fox’s knowledge of similar projects).
  • The initial forecast of 2 years was a planning fallacy – a best-case scenario rather than a realistic assessment.
  • Irrational perseverance – the folly they displayed that day in failing to abandon the project. Facing a choice, they gave up rationality rather than give up the project.

Other examples include:

  • The Scottish Parliament building – estimate in 1997 was £40m, completed in 2004 at a cost of £431m.
  • A survey of kitchen improvements by American householders showed that initial estimates averaged $18,658 with eventual costs averaging $38,769.
  • A study of worldwide rail projects from 1969 to 1998 showed that in over 90% of cases rail passenger increases were over-estimated by an average of 106% with cost over-runs of 45%. Think HS2.
  • Any number of UK MOD weapons projects: Nimrod (maritime air), Wavell (failed IT), the Astute submarine class, the current build of aircraft carriers, the development of the F-35 (VSTOL), and on and on.

Kahneman’s proposal is outside referencing – what the book calls reference class forecasting. In other words, find similar projects, obtain the statistics, and use that specific information to anchor a realistic assessment of your own project. Fairly obvious, you might think; but try explaining why it does not happen. Kahneman suggests that people apply a delusional optimism rather than a rational weighting of gains, losses and probabilities. It probably helps to explain why people litigate, start wars and open small businesses – he says.
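
In practice, outside referencing can be as simple as the sketch below; the reference-class figures are invented for illustration and stand in for whatever data the staff can actually gather:

```python
from statistics import median

inside_estimate_years = 2.0          # the team's own best-case forecast

# Hypothetical reference class: outcomes of comparable past projects.
finished_durations = [6, 7, 7, 8, 8, 9, 10]   # years, projects that finished
completion_rate = 0.40                         # share that finished at all

baseline = median(finished_durations)
print(f"inside view:  {inside_estimate_years:.0f} years")
print(f"outside view: ~{baseline:.0f} years, and only "
      f"{completion_rate:.0%} of similar projects finished at all")
```

The discipline lies in the anchor: the forecast starts from what similar projects actually did, and the inside view is only allowed to adjust from there.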

Risk and Body Armour

Kahneman uses an example of protecting a child to demonstrate enhanced loss aversion. Parents were told to imagine that they used an insecticide where the risk of inhalation and child poisoning was 15 per 10,000 bottles. A less expensive insecticide was available, for which the risk rose from 15 to 16 per 10,000 bottles. The parents were asked what discount would induce them to switch to the less expensive (and less safe) product. More than two thirds of the parents responded that they would not purchase the product at any price; they were revolted by the idea of trading money for the safety of their child. Those who would accept a discount demanded a far higher amount than they would have been prepared to pay for a far larger improvement in the safety of the product.

He points out the incoherence of this approach: we all have finite amounts of money. Money that could be saved by accepting a minute increase in risk from a pesticide could be put to much better use in reducing the child’s exposure to other harms – buying a safer car seat, or covers for electrical sockets.
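
To put the incoherence in numbers – mine, purely illustrative – suppose a parent would pay $20 for a 5-in-10,000 improvement in safety, yet demands $500 to accept a 1-in-10,000 deterioration:

```python
# Hypothetical figures illustrating the willingness-to-pay (WTP) versus
# willingness-to-accept (WTA) asymmetry Kahneman describes.
wtp_dollars, risk_reduced_per_10k = 20, 5   # pays $20 to cut risk by 5/10,000
wta_dollars, risk_added_per_10k = 500, 1    # wants $500 to take 1/10,000 more

price_per_unit_gain = wtp_dollars / risk_reduced_per_10k  # $4 per 1-in-10,000
price_per_unit_loss = wta_dollars / risk_added_per_10k    # $500 per 1-in-10,000
print(price_per_unit_loss / price_per_unit_gain)          # 125.0
```

The same unit of risk is priced 125 times higher when framed as a loss than as a gain; money spent honouring that asymmetry is money not spent on the car seat or the socket covers.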

Now think about our approach to body armour: where is the rational debate between mobility and protection? We already have ballistic underpants, and we know that the weight carried by a soldier in Helmand province is about 50kg. We know, too, that the resulting immobility leads soldiers to carry more ammunition to compensate for their static nature: more ammunition, more weight, less mobility. The only relief from this remorseless cycle comes from limiting the duration and nature of patrols. But armies and their governments are so loss-averse that they cannot contemplate reducing body armour, lest some soldier be struck in a now-undefended part of his body.

And yet we continue to teach fire and manoeuvre as if it can still be done. We do not do it in Helmand, so why would we imagine that we would do it in the traditional way in another conflict? Soldiers are not tanks. Technology allowed us to develop tanks that are mobile, agile, well protected and carry great firepower. The power plant enabled us to get tanks to where they are now, but physiology will not allow us to take men down the same route.

The dilemma is resolvable by rational analysis. We can acquire the data to show the tactical penalty of carrying the weight we do. We can take that further by showing the change in casualty rates – the costs and benefits of using body armour, or less, or none. We should be able to show when more body armour makes sense – for sentries, and for gun and mortar positions, for example, as Dien Bien Phu showed. We should be able to argue rationally that ceding the initiative to the enemy allowed him to seed the ground with IEDs. It may be that this research has been done, but I doubt it, because the aversion to loss is so strong: we would now be moving from a default position, and Daniel Kahneman has much to say about how you view losses and gains.

Why Will So Little of This Count?

Even Daniel Kahneman can only drill down so far. His approach seems entirely rational to me, and I can think of many examples from my own time – from fruitless discussions on IFF/Combat ID, to the benefits of continuity in command, to staggering unit roulements – where the clearly rational, sensible approach was ignored. Other agendas were at work. Some fell into the category of throwing good money after bad – failing to quit a doomed project early. Others were of the loss-aversion type – an unwillingness to develop, articulate and discuss tricky issues. Some were overly optimistic – let’s go to Helmand and hope no shots are fired, or they will just love us, no need to plan.

But for the thinking (slow or fast) soldier, Daniel Kahneman has provided a war chest of good thoughts. It is not a book for all, and it certainly cannot be applied glibly. He talks about the ‘nudge approach’, and this is the best the concerned soldier can hope for: to nudge the command and staff by demanding a rational approach; at the least, to extract some admission of the flimsy underlying agenda, force a little shame – show that the irrational or emotional response is just that – and do one’s best to keep them honest.

He has one observation that you can try at home: marital stability is well predicted by a formula:

Frequency of lovemaking minus frequency of quarrels.

You don’t want the result to be a negative number.