Why you prefer to think you are always right, and why you shouldn’t.

Chainsawsuit comic

“Tonight, I’d like to talk about our brains. The brain is a relatively small organ, sitting snugly in our skulls, that controls almost all aspects of our lives. It takes input from our senses, turns it into what it thinks is a coherent picture of the world, and determines our actions. It is an extremely complex organ that is also remarkably efficient and, as a result, flawed.

“The brain will do anything it can to conserve energy. For instance, thinking is extremely expensive in terms of energy spent. The brain will therefore quite quickly revert to routines: sequences of actions that need no, or very little, attention. You don’t really think about how you tie your shoelaces. Walking is a routine. Your morning routine is called exactly that for a reason. In a sense, the brain takes shortcuts where it can. And while that has got us where we are today, millions of years in the making, it is also why sometimes things go wrong.

“There is one specific shortcut that I’d like to address, which is cognitive bias. Cognitive bias stems from the brain thinking it is always right. It is “a deviation from rationality in judgement, resulting in drawing illogical or irrational conclusions about other people and situations.” Those are big words, so I think it’s time for an example.

“Some months ago, I came across a neat little online test focusing on cognitive bias. It provided the test subject (in this case, me) with a series of numbers. The challenge was to come up with a hypothesis about these numbers, and test it by entering the next number, the next number after that, etc. Sounds easy, right? So, here are the numbers: 1.. 2.. 4.. 8..

“You can see where this is going. Just like me at the time, you instantly made up your mind about the rule: each number is twice its predecessor. You may also have noticed that the numbers are all powers of two. I had my hypothesis, and I started testing it. I entered 16 as the next number. BINGO! Correct! And immediately, my brain rewarded me with a shot of dopamine, making me feel really good about myself. I entered 32. Again correct, and again, happiness in my brain. I entered a couple more powers of two: 64, 128, 256, until I was completely satisfied I had cracked the code, and probably high on dopamine. I even thought to myself: how can anyone miss this?

“Then came the time of reckoning. I had to check my hypothesis. I clicked the button to show me the underlying rule, and it turned out I was wrong. Dead wrong. Because the actual rule was much, much simpler than mine: each number only needed to be larger than the previous one. I had fallen prey to cognitive bias. My cockiness in thinking I had cracked the code, probably a direct result of the dopamine rewards for finding numbers that did fit the underlying rule, made me fail miserably.

“It is pretty clear my approach was flawed. So, how should I have approached this problem? The answer lies in what it takes to prove a theory. Because proving a theory is, perhaps counter-intuitively, about trying to disprove it. A theory stands until it is disproved. So, in my case, I could have disproved my hypothesis by simply entering a number that was not a power of two. Had I filled in 15 or 17 as my first guess instead of 16, it would have been accepted as a valid number, and I would have known my hypothesis was wrong. I would have had to come up with a new one.
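(An aside for the programmers in the room: the test can be sketched in a few lines of code. This is my own hypothetical reconstruction of that online test, with the hidden rule taken from the reveal in the story; the function names are mine.)

```python
def accepts(sequence):
    """The hidden rule: every number is larger than its predecessor."""
    return all(b > a for a, b in zip(sequence, sequence[1:]))

start = [1, 2, 4, 8]

# Confirming guesses: every power of two fits, so each one rewards
# the (wrong) "doubling" hypothesis with a "Correct!".
for guess in [16, 32, 64, 128, 256]:
    assert accepts(start + [guess])

# A disconfirming guess: 17 is not a power of two, yet it is also
# accepted -- which would have disproved the doubling hypothesis
# on the very first try.
assert accepts(start + [17])

# Only a number that breaks the real rule is rejected.
assert not accepts(start + [3])
```

Notice that only the guesses designed to *fail* under my hypothesis carry any information; the confirming ones never could have told me I was wrong.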

“This kind of thinking is at the heart of the scientific method. For instance, Einstein’s theory of relativity is still valid, not because it has been proved over and over again (which it has!), but because no one has been able to disprove it!

“Now, you might ask what this has to do with our daily lives. Well, a lot! Let’s take for instance a police investigation. If all the superficial evidence points to a suicide, it’s all too easy to dismiss the case as suicide and not investigate for murder, which could be a grave mistake. The evidence might also point to an innocent suspect, leading to people being wrongly incarcerated, or in some cases even put to death. A good investigation should also try to find evidence to the contrary to make sure we don’t inadvertently punish innocent people.

“Then there’s journalism and news gathering. You are very likely to read newspapers that already agree with your world views. This means that your views will be reinforced over and over again, until you believe it’s the only truth. Let’s take the US, for example. Any and all harm done by Muslims is considered an act of terrorism. Any Caucasian males performing similar acts of violence are called ‘lone wolves’. More people die from gun-related violence in the US every year than from terrorist acts. And yet the American people are scared out of their brains when it comes to terrorism. It is because those views are repeated over and over again in the US media, feeding the people’s cognitive biases: they have stopped thinking rationally about those problems. Their fears have taken over their thinking, leading to those irrational conclusions about other people that I talked about earlier.

“Cognitive bias affects us every day, and the least we can do is be aware of it. But that is only the first step. Because it is hard to overcome it, even if you’re aware of it. To really deal with it, you need to engage all of your cognitive skills. You need to challenge yourself. You need to expose yourself to information and people that may contradict your current beliefs. Try reading a different newspaper or news site. If you’re an atheist like me, follow religious people on Twitter and see what they have to say. Engage with people who are not like you.

“Before I finish, I’d like to share with you a quote by Perth’s own Tim Minchin:

“A famous bon mot asserts that opinions are like arse-holes, in that everyone has one. There is great wisdom in this, but I would add that opinions differ significantly from arse-holes, in that yours should be constantly and thoroughly examined.

“We must think critically, and not just about the ideas of others. Be hard on your beliefs. Take them out onto the veranda and beat them with a cricket bat. Be intellectually rigorous. Identify your biases, your prejudices, your privilege.”

“Yes, you must. And so must I. We must all be sceptical, vigilant even, now more than ever. But we need to be prepared to get hurt. By our very own brains. Because the brain really doesn’t like information that is contradictory to its current beliefs.

“Ultimately, there is a reward in that, though. Because opening your mind in this way will give you a much wider perspective on the world, a much better understanding of, and empathy for, other people, and a greater appreciation of the variety this world has to offer.

“Don’t be a prisoner of your own mind.”

About this text: this is the written version of a speech I gave at a Toastmasters meeting as my second Project Speech on Thursday 8 June 2017.