In this article, I’ll look at the dangers of placing too much importance on our ‘sixth sense’ or ‘third eye.’ I’ll also give you plenty of reasons to see how that ‘funny feeling’ you have isn’t always your ally.
Your intuition can sometimes be a hot knife that slices through tough decisions like butter, but it isn’t infallible. Indeed, it frequently leads you to some pretty shitty conclusions.
Sometimes, it acts like that chauvinist uncle at a family event – quick to judge others but slow to correct himself. Other times, more like a benevolent aunt who quietly shoves twenty quid into your pocket because she feels sorry you’re still single.
What Is Intuition?
In Part One of this series on intuition, I mentioned the two modes of thinking that everyone subconsciously uses to process information. Nobel Prize winner Daniel Kahneman describes them as System 1 (intuition) and System 2 (reasoning). The first is automatic, subconscious, and fast. The second is deliberate, conscious, and slow.
For those who’ve taken the Myers-Briggs personality test, this may sound loosely familiar. If you’re on the intuitive side of the spectrum, you’ll mostly look for meaning and patterns. But if you have a preference for sensing, you’ll pay more attention to what you can see, hear, touch, smell, and taste.
Intuition happens when our brain reaches a conclusion without using conscious thinking. That conclusion usually hits us in the form of an emotion or a physical feeling – often in the gut area.
Think of our brain as a powerful prediction machine. It always has a few tabs running in the background, helping us figure out which bits of information are essential, and which ones can be ignored. Each time new data comes in, the hippocampus – the brain’s librarian – will scan its bank of memories to see if it looks like anything we’ve experienced before. If so, it’ll create a match. You’ll get a signal that all is well in the world, and your conscious mind can continue to focus on more important things.
But while your conscious mind is busy, the librarian sometimes notices something unusual. Let’s say you’re happily driving around in the dark, lost in thinking about the day’s could-have-should-have-would-haves. Your mind might be rehashing the day’s regrets, but in the meantime, your subconscious has picked up that the car in front of you swerved ever so slightly to the left. ‘Mismatch!’ screams the librarian, who’ll then quickly tap your amygdala (or reptile brain). Responsible for regulating emotions, the amygdala will then trigger a physical signal to alert the conscious mind that something’s off. If that signal’s strong enough, it’ll cause you to slow down. Just in time, because you were about to hit a pothole the size of Luxembourg.
When Intuition Misfires
I previously wrote that intuition relies on our ability to match experiences with memories. That means it can only ever be accurate at predicting a mismatch in those areas where we have plenty of memories to work with. So, unless you have significant expertise – be it your role, your art, your relationship – your intuition is bound to be pretty unreliable.
Indeed, for the driving example above to work, you’d need at least a couple of years’ worth of driving experience under your belt.
But there’s another reason why a gut feeling can be wildly off the mark.
As Eric Bonabeau explains, when our brain tries to categorise a new experience, it’ll often end up filtering out the very things that made that experience novel by fitting it around existing patterns that are already stored in our memory. In other words, it’ll often end up recycling reactions and solutions of the past.
Or, as he puts it: “Intuition is a means not of assessing complexity, but of ignoring it.”
And here lies the problem. After millions of years of fending off predators and trying to get laid, our brain has evolved into seeing patterns everywhere. This unconscious and hardwired desire to connect the dots is so strong that we often see associations even in utterly random data.
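To see how easily randomness breeds apparent patterns, here’s a minimal sketch (plain Python, purely illustrative): even a hundred fair coin flips will almost always contain a streak of identical results long enough to feel ‘meaningful.’

```python
import random

# Simulate 100 fair coin flips. Even truly random data contains
# long streaks that our pattern-hungry brains read as meaningful.
random.seed(7)  # fixed seed so the demo is repeatable
flips = [random.choice("HT") for _ in range(100)]

# Find the longest run of identical outcomes.
longest, run = 1, 1
for prev, cur in zip(flips, flips[1:]):
    run = run + 1 if cur == prev else 1
    longest = max(longest, run)

print("".join(flips[:40]))
print("Longest streak of identical flips:", longest)
```

Run this with different seeds and you’ll typically see streaks of six or seven heads or tails in a row – exactly the kind of ‘hot streak’ we’re tempted to explain with a story, even though nothing but chance produced it.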
Illusions Leading to Delusion
As author Mark Greer explains: “People’s intuition derives from a desire to find patterns in an otherwise random universe.”
But while our brain seeks to be helpful by reducing the complexities of the world, it sacrifices a lot of accuracy in the process. These errors in thinking that occur when we try to make sense of what’s happening are known as cognitive biases. They are ‘rules of thumb,’ or mental shortcuts, that simplify our environment and allow us to make faster decisions.
So, unless you learn to recognise what those biases are and you can control your decisions against them, your gut will always remain unreliable.
Author Buster Benson wrote a Medium article with a helpful list of the most common psychological biases. He structured them according to the four major problems our brain is trying to help us solve: having too much information, not having enough meaning in that information, needing to act fast, and not knowing what to remember.
Using his classification, I’ve picked out a few of the most common and juiciest biases you need to watch out for.
Having Too Much Information at Hand
With so much incoming data to process, our brain has no choice but to filter most of it out. Indeed, without using some of these biases as editing tools, we’d have a melt-down every time we had to make a decision.
1. We notice things that are already primed in our memory or are repeated often.
Take the illusory truth effect. We tend to believe false information once we’ve heard it enough times. If you repeat a lie often, people will eventually believe you. To see how common this bias is, witness a US President crying out ‘fake news’ at the slightest whiff of criticism, and the impact this has had on public trust in the media.
2. We pick up when something’s changed.
Anchoring is a classic example. When making a decision, we often lean on the first piece of information we were offered (the ‘anchor’). A staple of Sales 101 handbooks, this trick is still used by advertisers, who anchor you with their most premium offer first. And you’ll fall for it every single time.
3. We’re drawn to data that confirms existing beliefs.
Our confirmation bias means we actively seek out information that strengthens our prior personal beliefs. This can be particularly harmful in criminal proceedings, where investigators may overlook obvious clues simply because those clues don’t validate their existing hypotheses.
4. We think we see a lot more of the world than we really do.
The famous ‘invisible gorilla’ experiment is a case in point. Asked to count basketball passes in a video, around half of all viewers completely fail to notice a person in a gorilla suit strolling through the middle of the scene. Our attention is far narrower than it feels.
Not Having Enough Meaning
Desperate to connect as many dots as possible, we’ll edit incoming data and fill in the gaps. All information is first filtered through our values, belief systems, memories, state of mind, etc. These will sift away most of what we don’t like or agree with.
5. We project our current mindset and assumptions on the future and the past.
The hindsight bias is a classic example. After an event has happened, people are often convinced they could have correctly predicted it before it even occurred. This ‘I knew this would happen’ shortcut often leads to over-confidence and is known to play a big role in medical errors.
6. We’re a little too much on the optimistic side.
Take note of the normalcy bias which tells us things will probably work out exactly the same as they always do. This bias leads us to underestimate the likelihood of something going wrong. Conspiracy theorists and the-end-is-nigh Christians often accuse non-believers of this kind of ostrich thinking. More worryingly, this hardwired desire to pretend everything’s fine can also seriously hamper disaster evacuations.
7. We believe things we like are better than those we don’t.
Cue the halo effect. Your lover’s gorgeous golden retriever and perfect hairy chest stop you from noticing he’s actually a manipulative sociopath who’s sleeping with your best friend – as we speak.
8. We think we know what everybody else is thinking.
The spotlight effect is a good example. We’re prone to forget that while we’re at the centre of our universe, we’re definitely not at the centre of anybody else’s. Nobody cares that you walked around all day with a bit of toilet paper sticking out your pants. Nobody was looking at you in the first place. Except maybe for that potential soulmate who saw you and thought ‘Nah, not for me.’
9. We focus only on the most obvious.
In the words of relationship therapist Esther Perel: “We’re wired to look for things and answers in places where it’s easiest to search for them, rather than where the truth is most likely to be found.” This is known as the streetlight effect or the drunkard’s search. Picture the drunk fella looking for his keys where the light is, not where he knows he’s dropped them.
Needing to Act Fast
Because our intuition relies on fast and subconscious information processing, we’re often tempted to rely on it whenever time’s in short supply.
10. We need to feel confident we can make an impact.
Known as the Dunning-Kruger effect, this is probably one of the most harmful illusions around. As Mark Manson says: “It occurs for the simple reason that smart and experienced people are aware of what they do not know. Whereas dumb and inexperienced people have no idea what they don’t know.” I’ll also steal a quote from philosopher Bertrand Russell which Manson dug out: “The fundamental cause of trouble in the world is that the stupid are so confident and the intelligent are so full of doubt.” Amen to that.
11. We prefer things we’ve already invested time and energy in.
Your gut might tell you to stick it out with that loser boyfriend because, well… he loves you really. The reality is that he’s just a sponge, and you might be a victim of the so-called sunk-cost fallacy. You see your partner in a more favourable light simply because you’ve already invested so much time, energy, and financial resources into him.
12. The HiPPO in the room.
I like this one. Also known as the authority bias, the HiPPO refers to the Highest Paid Person’s Opinion. When a difficult decision needs to be made without useful data to figure out what the best course of action is, the group will often follow the HiPPO’s judgment. You can see this one in action for yourself next time you’re in a team meeting.
Not Knowing What to Remember
There’s way too much information out there for your little brain to remember all of it. That’s why your inner librarian has to make some executive decisions. It’ll keep some of your memories while moving others to the trash bin. That’s where they’ll die a slow death along with your old impressions of flash mobs, planking, and cinnamon challenges. Good riddance.
13. Editing and reinforcing certain memories after the event.
A friend of mine recently told me about a funny encounter he had with a drunk woman on a train platform. She’d accused him of carrying a bomb in his rucksack after claiming she could hear something ticking inside it. She was about to get the police involved until he pointed out that the only thing ticking was the huge clock right above her head. The story was mildly entertaining at best, but it turned hilarious and a tad embarrassing after I pointed out this hadn’t actually happened to him. It happened to me. He must have visualised the story to such an extent when I first told him that my experience had become his – a classic case of misattribution of memory. Either that, or he’s a psycho.
14. We like to generalise.
Unconscious bias is something none of us is immune to. It describes a prejudice in favour or against a particular thing, person, or group – usually unfairly so. The Guardian did a series of eye-opening features on bias in Britain. It highlights just how present this is in all layers of society.
15. The way we store memories depends on how we experienced them.
Because of the picture superiority effect, we’re more likely to remember images than words. This means that, as far as our intuition’s concerned, the brain is more likely to match experiences with things we’ve seen than with those we’ve heard.
Conclusion – Seeing the World as It Actually Works
Our intuition can be a wonderful ally when it comes to decision making. But because it relies on an ancient evolutionary form of fast processing, it’s prone to distractions and mistakes.
You should only ever rely on it in well-defined circumstances – that is when your bank of memories is large enough to ensure accurate matching and mismatching.
Even then, our desire to see patterns and connections means we may be misinformed. This is why we always need to test it against some of the biases and illusions I set out above.
As Thomas MacMillan says in a fascinating article about flat-earthers and other idiots:
“We’re not set up to perceive the world as it actually works. We’re set up to perceive the world in ways that help us function in daily life.”
So, by all means, lean into that sixth sense of yours. But just keep a third eye out for those gorillas and hippos.
This is Part Two of a two-part series. Find Part One here.