Daniel Kahneman is a bona fide rock star of the humanities. The only psychologist to have been awarded the Nobel Prize in Economics, he succeeded with his 2011 bestseller, ‘Thinking, Fast and Slow’, where so many others had failed: it turned public perception of human decision-making on its head. Finally, it was possible to understand why even the smartest people can make the stupidest of mistakes.
As eloquent as he is authoritative, Kahneman carefully lays out the case, experiment by experiment, for two ‘operating systems’ that govern how we think. System 2 is “slow, deliberate, effortful. Its operations require attention. It takes over, rather unwillingly, when things get difficult. It’s slothful, and tires easily. It’s the conscious being you call ‘I’”. System 1 “is fast; it’s intuitive, associative, metaphorical, automatic, impressionistic, and it can’t be switched off. Its operations involve no sense of intentional control, but it’s the ‘secret author’ of many of the choices and judgments you make”.
The unsuspecting reader won’t immediately grasp the importance of this last statement. Like the fish tempted by juicy bait, by the time it dawns on you that Kahneman is saying that ‘logical’, ‘fact-driven’, ‘emotion-free’ System 2 thought is NOT the default mode of the human brain, it is too late. The hook – that we are all wired so that System 1 is in charge, and that many of our seemingly conscious decisions are actually made unconsciously – is lodged so deep that you can’t wriggle away. There is no escape; Kahneman masterfully reels you in, and it’s over.
The idea that System 1, also referred to as ‘hot cognition’, usually trumps System 2, or ‘cold cognition’, in human judgement is by no means new. The study of heuristics (the mental shortcuts the brain takes when making decisions) and the attendant biases (the errors it makes by applying those shortcuts inappropriately) is more than five decades old. In thousands of clever, creative ways, cognitive scientists have demonstrated time and again that we are often blind to ourselves. This is also the central hypothesis underpinning the work of clinical psychologists. Patients with phobias, compulsions and bizarre, medically unexplained symptoms present with versions of reality which, when deconstructed, turn out to make use of the ‘facts’ in ways that simply do not hold up to robust, analytical scrutiny.
However, psychology studies are quite commonly written off because they too often rely on student samples in tightly controlled experimental conditions. Indeed, it is entirely justifiable to question the extent to which many research findings are ‘ecologically valid’ – applicable to real-life, complex human behaviour. Likewise, it is easier, and infinitely more comfortable, for the ‘mentally healthy’ majority to dismiss the experiences of psychologically troubled patients as unusual and pathological rather than as an extension of something everyday and normal. It is much harder to ignore hard truths about the human condition when they can be reliably established with a group of well-educated, highly experienced experts in positions of trust. And in 2013, that is exactly what Trafton Drew and colleagues did at the Visual Attention Lab at Brigham and Women’s Hospital in Boston. Twenty-four radiologists were asked to examine five CT scans of the human lung for white cancer nodules.
Unsurprisingly, they did this very well, but 83 per cent of them failed to notice the image of a black gorilla superimposed in the top right corner, despite it being 25 times the size of an average cancer nodule. Eye-tracking technology showed that most of the radiologists who missed the gorilla had been staring straight at it. The culprit is ‘inattentional blindness’, as psychologist and co-author Jeremy Wolfe explains.
“The radiologists missed the gorillas not because they couldn’t see them, but because their brains had framed what they were doing. They were looking for cancer nodules.”
He goes on: “This study helps illustrate that what we become focused on becomes the centre of our world, and it shapes what we can and cannot see.”
Kahneman, together with his long-term thought partner, the late, great Amos Tversky, began publishing findings back in the 1970s that supported this fundamental human truth: we pretty much see what we want to see. Yes, the vocabulary in Thinking, Fast and Slow is more polished, the concepts are clearer, the formulation is crisper and the psychological frame is more reader-friendly, but there is no mistaking the golden thread that runs through the lifetime’s work of this Nobel Laureate. So why the worldwide furore when this book was published? Why the shock-and-awe reaction of the media to the ‘gorilla in the lungs’ experiment and many others like it? Why has it taken all this time to acknowledge that people are nowhere near as ‘rational’ as they believe themselves to be? The underpinnings of western philosophy provide the clue.
In a recent article exploring Wu-Wei (the ancient Chinese ideal of effortless action arising from an unselfconscious state of mind), the scholar Edward Slingerland argues that “western thought has strongly emphasised System 2 (or cold cognition) and based its models of ethics on the exertion of cognitive control”. The long and the short of it is this: we are too hung up on the idea that most people, most of the time, are rational.
Perhaps it’s time to accept that we have been buying into a comforting but essentially illusory notion. Rationality doesn’t come naturally; we have to work hard at it. To be rational is to be conscious, and more than 50 years of cognitive psychology research has shown, beyond reasonable doubt, that most human judgments are made automatically, below the level of consciousness. As 2016 dawns and we are confronted daily with the many ways in which our collective actions as a society simply do not make sense, it’s a good time to wake up – to ourselves!
Originally published in SALT magazine. Reproduced with permission.