Knowing where the thinking traps lie is half the battle, giving us a heads-up on where to watch out and pay extra attention.
Take this challenge: You have a cardboard box full of sand and some buried coins. You’re given a set of chopsticks and a spoon. Without tipping the box over or touching the sand with your hands, how do you get the coins out?
How do you check if your assumptions are true or not? Even the best experts in the most critical of situations can fail to check assumptions.
Take the incident at NASA, known as ‘the scariest wardrobe malfunction in NASA history.’ On July 16th, 2013, Luca Parmitano and his fellow astronaut Chris Cassidy went out on their second spacewalk together. Forty-five minutes in, Parmitano felt water at the back of his head. He didn’t know where it was coming from. The command was given to terminate the spacewalk early.
Do you think in a binary fashion? Yes or no? To make sense of the world, evaluate data and interpret information, we simplify and summarise. Often that’s helpful and necessary. What’s striking, though, is the extent to which we do it: researchers have found that we tend to reduce things down to just two alternatives. Good or bad? True or false? All or nothing? We over-simplify complex ideas and problems. Psychologists call this our ‘binary bias’.
When was the last time you questioned an expert’s view or opinion? The status that comes with being perceived as ‘an expert’ can lead to expertise never being questioned and blindly followed, even when it leads to life and death outcomes.
Expertise can’t always help us make wiser decisions. Sometimes it actually gets in our way. This happens when we trip up over the details we’ve cherry-picked as important, at the expense of the bigger picture.
The human brain craves certainty. We like it when people take away what’s woolly and tidy up the edges of uncertainty. A confident and clear-cut explanation or story is psychologically reassuring and satisfying. It’s what we want to hear, and we’re often happy to bask in the false sense of security. We need to be careful we don’t give in to this temptation too easily.
Spotting what seems a little strange can be important. We are far better at spotting patterns than we are at spotting the little things that don’t fit the patterns. We can struggle to recognise the importance of inconsistencies, outliers and anomalies.
If a bat and a ball together cost $1.10, and the bat costs a dollar more than the ball, how much does the ball cost?
This is a question that Daniel Kahneman made famous in his book ‘Thinking, Fast and Slow’. It reveals how our instinctive, rapid response can lead us astray. Many of us will automatically generate the wrong answer of ‘ten cents’ in our heads; our slower, more logical thinking takes some time to catch up.
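Slow down and do the algebra: if the ball costs b, the bat costs b + $1.00, so together they cost b + (b + $1.00) = $1.10. That gives 2b = $0.10, so b = $0.05. The ball costs five cents – a ten-cent ball would make the total $1.20, not $1.10.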
In the same way that we can unexpectedly glimpse something out of the corner of our eye, a vague idea or trail of thought can sometimes catch the edge of our attention, often only fleetingly. It’s usually something outside of what we are focusing on, and often it comes and goes with little consideration.
When we are an expert in something we can become trapped by our own expertise, entrenched in old routes and reassured by what we have known to be true in the past. It has been described by some as the ‘curse of knowledge’, and it can hinder us exploring new ways of doing things.
Think back to your team deliberations or those discussions that take place in the groups you are part of – who generally speaks first? In a team or group setting, the best discussion usually happens when the most senior person or leader (assigned or self-appointed!) speaks last.
When we have expertise and face a situation we’ve seen before, we often instantly spot what to do and how to do it. Experts are typically experts precisely because they spot relevant patterns that other people don’t, or are quicker to spot them.
However, this doesn’t mean that an expert who can jump straight to the best solution is also good at sharing the rationale behind it, or at helping others learn from the situation.
We have a tendency to jump to conclusions and to stop exploring the facts and information earlier than we should. We like answers to our questions, and as efficiently as possible. This is even truer in times of greatest uncertainty, so in the current climate it’s something we need to be ever more aware of. Jumping to conclusions too soon can mean leaping to premature answers based on what seem to be reasonable (but often incorrect) assumptions, all because we want to resolve uncertainty.
Have you ever stopped yourself putting forward a view or an idea, assuming that someone else has already thought of it? Or felt that an idea was too obvious and didn't need saying because you’d expect everyone else to be thinking the same thing?
It turns out that we can be poor at judging how valuable our ideas are, and not very good at evaluating how unique they actually are.
Ever been in the scenario where the opinion of the most senior person in the room tramples over everything else, regardless of what others think or what the evidence suggests? In that case you’re at the mercy of the HiPPO.
The HiPPO is the Highest Paid Person’s Opinion. Avinash Kaushik coined the term in his book ‘Web Analytics: An Hour a Day’ to describe how people respond when there’s a lack of data. His observation was that when the person with the highest status gives their view, what they say goes.
When faced with lots of information or data, it may seem most efficient to categorise into groups and filter out the anomalies, but it can be incredibly insightful to focus on what doesn’t fit.
What’s the odd one out? What isn’t behaving as you’d expect?
Growth isn’t just a matter of learning new things, but also unlearning old limits. When faced with a challenge, actively consider what previous ideas may be getting in the way of thinking about your new situation differently.
The great Leonardo da Vinci wrote in his notebooks that you should assume your first impression of a problem is biased towards your usual way of thinking. He always started by looking at a problem one way, then moved on to looking at the same problem a different way, and then in other ways again. He called this “saper vedere”, which translates to “knowing how to see”.*
When was the last time you were surprised by something? How often are you surprised as you go through your day?
Being surprised is a good thing! If you’re rarely surprised it means you only see and hear what you expect to, and you may be overlooking vital information that should challenge your views.
Our expertise can give us a false sense of assurance if we fail to realise the situation around us is changing, or we overlook new information or viewpoints that should change or challenge our existing views.
Try looking at your issue from different perspectives - look, look, then look again.
When asking others for their expert opinion and advice, be sure to sense-check what makes their experience relevant to your situation, NOT how confident they are in their view.
If their expert intuitions are not based on relevant past experience, they may be drawing on assumptions which are out of date or misguided.
The information sources we have available to us may be more varied than we initially think. Just how big is your own echo chamber?
Push out further, go broader and add more variety into the range of sources you use.
When facing a problem, our confidence can fluctuate – and widely so. The psychologist Daniel Kahneman called over-confidence ‘the most significant of the cognitive biases’. In getting to grips with this bias, it’s critical to understand when we are most prone to it.