Ski helmets, Scooby Doo & bananas – cognitive bias, insight and data.

 

[Image: a Scooby Doo ski helmet]

Your brain works a bit like this…

When it comes to cognitive bias, and trying to detect your own, you need some personal insight.

If you’re not familiar with the term “cognitive bias”, it describes the way we think, which is often not as objective or rational as you want it to be: it’s biased towards favouring a specific outcome that you can relate to on an emotional, unconscious level.  For example, no matter how much you read the reviews and shop around for a new car, you’ll probably end up buying the one you like the most for your budget, rather than the one which objectively offers the best value for money.  If we only bought cars based on price, value and practicality, we’d all drive Kias and nobody would see the value in owning a Ferrari.  Which explains why very few people who can afford a Ferrari decide to buy a Kia instead.  They’ll choose the luxury car for a whole bunch of reasons, none of which are logical or practical, at least not as far as utility is concerned.  These choices are more complex than pure logic can determine; they are, like your brain, subject to a lot of emotional interference.

Insight is a term used very widely in the business world these days.  It often refers (mistakenly) to the process of reading meaning into data, like looking at your Google Analytics and working out from the traffic stats what your audience is interested in.  Insight is something else entirely – it’s constructing a narrative, giving the what a why to explain it.

Insight isn’t explaining data; insight is learning how something works, or behaves.  Insight is predictive.  I’ve been in plenty of meetings where a marketing executive has given “insights” into how a campaign has performed, or offered insights into “customer behaviour”, but that’s not really using the term properly: what they are talking about is “analysis”.  Analysis isn’t predictive, it’s retrospective (unless it’s predictive analysis, which isn’t really analysis at all – it’s very advanced mathematical guesswork).

To explain true insight, you need to be a little more philosophical.  So here’s a ski helmet.

 

Ski Helmet insight:

In this example, a ski helmet represents a physical object that applies to one specific issue: protecting your head when you go skiing.  You can’t get insight into a ski helmet.  The stats suggest, very strongly, that there is in fact a correlation between wearing ski helmets and suffering severe head injuries.  Specifically, figures from http://www.wemjournal.org/article/S1080-6032(12)00249-9/fulltext and various other studies show quite clearly that helmet use is up, and so are head injuries.

In fact, there’s loads of conflicting, contradictory data and debate about it.  Obviously, helmets protect your head and will reduce the chances of a head injury.  However, compare the ratio of helmet wearers to non-helmet wearers on the world’s slopes with the same ratio among serious head injury cases: helmet wearers are overrepresented among the injured, so, statistically, you’re more likely to suffer a brain injury if you’re wearing one.
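To see why, it helps to make the base-rate comparison concrete.  Here’s a minimal sketch in Python, with counts that are entirely made up for the arithmetic (they’re not taken from the study linked above):

```python
# Hypothetical counts, purely for illustration – NOT figures from the linked study.
helmet_skiers = 700_000   # helmet wearers on the slopes (70% of skiers)
bare_skiers   = 300_000   # non-wearers (30% of skiers)

helmet_injuries = 560     # serious head injuries among helmet wearers
bare_injuries   = 180     # serious head injuries among non-wearers

# Helmet wearers are 70% of skiers but ~76% of the injured (560 / 740),
# so their conditional injury rate works out higher.
rate_helmet = helmet_injuries / helmet_skiers
rate_bare   = bare_injuries / bare_skiers

print(f"Injury rate with a helmet:    {rate_helmet:.3%}")  # 0.080%
print(f"Injury rate without a helmet: {rate_bare:.3%}")    # 0.060%
```

The point isn’t the made-up numbers; it’s that a raw comparison of injury counts, without the base rates, tells you nothing at all.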

  • The analysis of those numbers leads to the first insight: the issue isn’t whether helmets protect your head, it’s how they affect people’s attitudes to risk.  They make people inclined to take more risks.  So the issue which needs to be addressed isn’t really anything to do with protective headgear, it’s to do with cognitive bias – in this case, the feeling that wearing a helmet changes your ability to ski or board over terrain that you wouldn’t feel confident tackling without one.
  • When I bought my own helmet, I got some more insight: the guy who sold it to me asked “Do you have kids?” and when I said yes, he explained, “Yeah, then you’ve got to wear one, otherwise they won’t wear theirs, because they want to be like dad, right?”.  Which was insight into something else entirely, namely the way we influence children more by setting examples than by telling them what to do.

What these insights share is their disconnection from the data.  The helmet in these insights is one element in two very different narratives about the way humans make choices and learn.  The first insight explains trends in the data, but doesn’t explain the data itself (it’s too theoretical to give any specific insights; perhaps there is also a correlation with the difficulty of the slope, the weather conditions, how much wine the victim had been drinking at lunch, and so on).  The second insight is even further away from the data, because children suffer far fewer head injuries than adults in the first place, for a variety of reasons, but it will still affect ski safety because, obviously, helmet-wearing children grow up to become helmet-wearing adults.

These insights share something else: they explain factors that are unrelated to the subject of the study.  They explain how we think.  Ironically, applying that sort of cognitive modelling would probably be more effective at reducing head injuries than measuring the casualties and addressing the problem with health and safety guidance about what to wear.  Teaching people to recognise their skills and make better choices (when up a mountain) would reduce head injuries more than selling them helmets, which, for a certain kind of thinker, actually make them more likely to hurt their head when they’re wearing one than when they’re not.

It’s a logical paradox: safety equipment makes people less safe.  The same is true of American footballers, who generally suffer worse injuries than rugby players, even though, unlike rugby players, they’re wearing protective gear.

 

Scooby Doo insight:

This kind of personal insight is easier to spot with holiday or sports safety equipment (because you think about holiday and sports preparation much more consciously), but our daily lives are filled with precisely the same kind of cognitive paradoxes.  For example, one that recently happened to me was watching Scooby Doo with the kids.  They both, like me, thought Scooby Doo was a show about monsters, ghosts, vampires and so on.  If I had to peg it to a genre, I’d put it in the category of paranormal mystery shows, like The X-Files or Buffy the Vampire Slayer (for the old folks reading).

But it’s not.  After predicting (whilst Scooby and Shaggy ran in paroxysms of terror from a ghost pirate called “Metalbeard”) that he would turn out to be a crook trying to cover up some sort of scam, the truth of my own cognitive bias about Scooby Doo became clear.  It’s a crime show.  Those kids are detectives, and each week they investigate a crime and wind up getting some wacky criminal arrested by the local sheriff.  It’s basically The Wire for kids.  It’s not a paranormal show at all.

The fact the whole show is packed with supernatural, malevolent entities is a massive red herring, designed to put your cognitive faculties off the scent.  It’s a (simplistic) attempt not just to get your brain to suspend its disbelief, but to mislead it into thinking it’s watching a wholly different category of show.  If it were an adult show, it would be selling itself as True Blood when in fact it was Murder, She Wrote.

That’s personal insight: recognising your own unconscious cognitive processes.  This was reaffirmed by the people I’ve told about it, who all said “yes, obviously, it’s a detective show – didn’t you get that?”.  It seems I’m more of a big kid than I first thought.

 

Bananas?

So basically, the point here is clear.  Being analytical and seeking insight in the numbers serves an analytical end, but it isn’t really giving you any insight.  The problem is, you can’t be analytical or insightful about the things you’re not aware of.  That’s a natural cognitive bias we all exhibit: we are only analytical about the things our emotions and learned social behaviours – our unconscious minds – allow us to be analytical about.  It’s a process of seeking data to support the things we feel must be the truth.  We look for evidence to support the choices we want to make, or the version of events we want to be true.

It’s like bananas for me.  I don’t like them.  I won’t choose banana-based desserts, or pick them as a fruit.  They taste of ethyl ethanoate to me.  I don’t like the squashy texture, the fibres, the slime.  None of that diminishes their properties as a foodstuff, it just means I don’t like them.  I’ve spent years trying to like them, because everyone else seems to like them.  They’re good for you, too.  As a result I’ve tried, tasted and ordered off the menu a huge number of banana-based foods, and I can now confirm I don’t like them.  Which I knew when I started.

That process of banana-flavoured self-flagellation illustrates the cognitive bias model very well.  Was I trying to explain the data that bananas are popular and therefore understand why I was wrong not to like them?  No.  Was I trying to condition myself to eat them?  No.  I was seeking out an evidentiary basis to justify my own personal feelings about them.  I can now say, at the age of 42, finally and unequivocally, that I don’t like bananas.  That is now a logical construct in my mind – i.e. I’ve eaten lots of them in different ways and I still don’t like them – as opposed to a physical reaction in my mouth.  I’ve come to terms with my banana shortcomings.

That is a kind of cognitive bias called “confirmation bias” – our tendency to look for evidence to support our unconscious emotions and motivations.  It is, quite literally, using your rational, logical mind to build a case that justifies your unconscious choices to yourself.  In my case, I clearly needed to explore making different fruit choices from the people around me and feel okay with being different.  It’s trivial when it’s about bananas, but the behaviour it describes is one we all go through as we try to understand how we relate to the people around us.  Also, consider this: confirmation bias is (according to a risk manager from Harvard I met last year) the same kind of logical malfunction that created the financial products and practices that nearly crippled the economic infrastructure of the western world in the “credit crunch”.

The problem was, back then, bankers didn’t evaluate the risks properly.  They looked for reasons why their products would succeed, not factors that would make them fail.  They stacked their logic in favour of a specific outcome, which worked just fine until it all collapsed into a stinking mess, at which point people looked back at the thinking around the products and decided, through analysis, that the banks had lacked insight into what they were doing.
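You can sketch that failure mode in a few lines of code.  This is a toy model with made-up numbers, not a model of any real financial product: an analyst who quietly discards half the failures they see ends up with “data” that confirms whatever they already believed.

```python
import random

# Toy sketch with made-up numbers – not a model of any real financial product.
# An asset succeeds 60% of the time. An unbiased analyst counts every outcome;
# a biased one quietly discards about half of the failures they encounter.
random.seed(42)
outcomes = [random.random() < 0.6 for _ in range(10_000)]  # True = success

honest_rate = sum(outcomes) / len(outcomes)

# Confirmation bias as a data filter: keep all successes, drop ~50% of failures.
kept = [o for o in outcomes if o or random.random() < 0.5]
biased_rate = sum(kept) / len(kept)

print(f"True success rate:                  {honest_rate:.0%}")  # ~60%
print(f"Rate after discarding the failures: {biased_rate:.0%}")  # ~75%
```

Same evidence, same arithmetic, different filter – and the filtered version looks like a sure thing.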

Or to put it another way, the banks had gone bananas.

Cognitively speaking, that’s precisely what did happen.