On 7 February 2017, academia lost a great thinker: Hans Rosling, the Swedish social scientist known for his revolutionary data visualization, passed away at the age of 68. In honour of Dr. Rosling, take a few minutes to watch this TED talk from 2006, particularly the first 4-5 minutes.
This TED talk is well-known for Dr. Rosling’s lively play-by-play of a data horserace. However, I have always been intrigued by what comes right before it: why is it that experts in global health score no better than a random sample when answering questions about global health indicators? Why do students studying global health score worse than random? Dr. Rosling suggests that this is due to preconceived notions of each country. Daniel Kahneman would agree.
Kahneman, winner of the 2002 Nobel Prize in Economics, would likely attribute this to yet another case of system 1 thinking providing an answer that goes unchecked by the more rational, yet inherently lazy, system 2 thinking.
In his book “Thinking, Fast and Slow,” Kahneman gives a sprawling account of fifty years of developments in cognitive psychology, behavioural economics, and philosophy of mind. The result is a detailed, erudite account of why all of us make simple errors in judgement, even when we are experts in the subject at hand.
Kahneman tells a story with two characters: the first, system 1, is the true hero of the story. System 1 thinking is fast, effortless, intuitive, impulsive, involuntary, and always running. System 1 is what allows us to see patterns, recognize symbols, sense change, and generally live our lives without constantly taxing our mental capacity. It makes sense out of a chaotic world and brings attention to any threats that might require the attention of its big brother, system 2.
System 2 is slow, difficult, and cautious. It has the capacity to work systematically and rationally through problems that system 1 cannot hope to tackle. However, system 2 is also very lazy. Kahneman shows, through reference to fascinating psychological experiments, that system 2 has high horsepower but low fuel efficiency. Much like the explosive use of muscles while sprinting, engaging in system 2 thinking requires high amounts of resources (blood glucose, in this case) and fatigues quickly, making other mental operations much more difficult. Therefore, like a neurological couch potato, system 2 will avoid being engaged unless it is surely needed.
Kahneman gives myriad examples of this, much like our global health experts above. It is likely that a professor of global health at a prestigious Swedish university has access to detailed information about health outcomes in both Russia and Malaysia. By accessing this information, the professor could likely come up with an informed guess that Russia has poorer outcomes for infant mortality (more than two times poorer) than Malaysia. But this, as Dr. Rosling shows, fails to happen more often than not. That is because accessing this information requires system 2 thinking, and system 2 would rather chill on the couch while system 1 handles the question.
While a professor of global health has access to detailed knowledge about health outcomes, she also has access to much more readily available impressions of Russia and Malaysia. Russia is a relatively wealthy country, located in Europe, situated in the global north. These qualities all correspond to good health outcomes (which are often synonymous with low infant mortality), and, most importantly, they are instantly and intuitively available to system 1 thinking. Likewise, the intuitive impressions of Malaysia are the opposite: relatively poor, located in tropical southeast Asia, qualities that usually correspond to poorer health outcomes. The professor is tempted to leave the analysis at that and choose Malaysia as having the higher infant mortality rate, despite the fact that she probably holds knowledge, accessible only to system 2, that would tell her otherwise.
This is all fascinating until Kahneman drops the bomb: YOU do this all the time. Yes, you. Reading this right now. On your phone/computer/tablet. YOU constantly make errors in judging things you know a lot about. Even Daniel Kahneman, Nobel laureate and modern polymath genius, makes these errors in his work. We all do. Though we all self-identify as our system 2 selves, our system 1 selves are and always have been the stars of the show.
So what’s the point of this brilliant and slightly terrifying revelation? In 400-odd pages there are almost as many morals of the story, but perhaps two stand out. First, by understanding the mechanics of our system 1 and system 2, we can better identify when we are making errors in judgement, correct them, and live a more informed and fulfilling life. This is wonderful, but not the main concern in the context of public policy.
The second point is more relevant to the topic at hand: elected politicians and brilliant policy analysts alike are just as prone to errors in judgement as the rest of us. Like individual thinkers, the policy cycle is prone to faulty heuristics and biases. We are all boundedly rational, and it is system 1 that is unconsciously tightening the knots.
Believe it or not, this is a hopeful message. It means that if we can impose structures on the process of policy making that demand slow thinking, we can improve the outcomes of the policies we create. The status quo should be challenged. We should take a long, hard look at the truths we hold to be self-evident to make sure they remain so. Like all system 2 thinking, this is taxing, uncomfortable work. However, it is work that must be done to ensure that we deliver thoughtful, effective policies based in the world that is real, not an alternative that happens to be more convenient. Kahneman, in his methodical, engrossing way, has given us clear tools to do exactly this.