Why do people believe what they believe? (Part 3)

Or, for this episode, what we should learn from history but don’t!

 

As an atheist I was always proud to point to science as a transcendental reason for my enlightenment, and in fact I still do. The little I have learned about our universe and how it works (speaking relative to what there is to know), not to mention the wonders of nature that surround us, is a continued delight to me.

(See Part 2 HERE)

However, when I turned to learning the History of Science, I was shocked to find something very unexpected: that the greatest enemy to scientific progress was not the traditionally accepted boogeyman of the Church, but rather the scientific community itself!

 

Not the Religious Establishment, as I had previously believed – despite its undoubted efforts (Galileo being my favourite example) – but the community of the very people who are entrusted with carrying out the scientific process.

 

Of course, it also goes without saying that we can thank a (very small) handful of individual scientists for the progress we have made in our endeavour to describe and understand our universe. But what many people don’t know is that those who made the greatest breakthroughs have consistently had to fight a huge battle against the very people who should have been on their side!

There are a great many examples, but here are two prominent ones that most people should be aware of:

 

1. Albert Einstein and “The Theory of Relativity”

Have you ever wondered why Einstein was never awarded the Nobel Prize for Relativity? He first published in 1905, and even in 1921 the Nobel Committee preferred to award the prize to no one rather than give it to him. Finally, in 1922 he was awarded the deferred 1921 prize, but for his work on the photoelectric effect, not for Relativity!

 

The global scientific community fought against accepting relativity for decades! In the U.S., for example, relativity was generally ridiculed as “totally impractical and absurd.” In Britain, his theories met with resistance because relativity was seen as a direct challenge to the widely accepted theory of the ether. As late as 1923, almost two decades after initial publication, the British physicist Norman R. Campbell despaired that most physicists were still “ignorant of Einstein’s work and not very much interested in it.”

And yet today, unless you look for it, you would never think so, because the scientific community would rather pretend that this incredible embarrassment never happened.

 

2. Georges Lemaître and “The Big Bang Theory”

Today, the Big Bang Theory remains the most widely accepted theory of the beginning of the universe within the cosmological community. The community today glibly takes credit for this theory, but did you know that they largely refused to accept it for over 30 years after Lemaître published in 1931?

 

The main reason for this is that the theory sounded awfully like the Biblical account of the beginning of the universe, and as science was dominated by atheism at the time, no one wanted to accept it on those grounds, regardless of its mathematical validity.

Of course it didn’t help that Lemaître was an ordained priest; few acknowledge that he was also a highly qualified physicist.

So what does this tell us about what we should learn from history, but don’t?

Certainly it “confirms” what we already know about the power of the Confirmation Bias – examined in Part 1 and Part 2 – namely our strong tendency to reject ideas just because they don’t conform to what we already believe. But something less obvious that it teaches us is this:

That we believe that the path we have walked through history was much more “well lit” than it actually was.

Even though we signally, repeatedly and continually failed in the past to predict what was going to happen, we believe that we kinda almost coulda woulda predicted it, and probably would do so if we were given another go at it!

Yet this is completely false!

(Please note that we are only interested in history-changing events. Obviously most events in life are predictable, but these also tend to be trivial and of low value…)

 

This tendency is known as the Narrative Fallacy or Hindsight Bias, a subtle yet powerful cognitive bias that we all suffer from as part of our human nature.

 

Let’s delve deeper into this idea, and see how it all connects:

Obviously we view the past with the benefit of hindsight, but what we are blind to is that at every moment there is an almost infinite array of possibilities for how things are and for what could happen, and that we are powerless even to know what most of them are, let alone to predict which one will actually occur.

 

Furthermore, in our modern world we have been able to tame our environment using the technological output of science, thereby narrowing the range of possible outcomes in most areas of daily life. This makes life feel more predictable, and when it comes to the small things, it is! 

Another way of looking at it is this: we tend to believe that we know what we need to know, and that tomorrow will have the same general attributes as today, partly because it usually does, but also because this bias leads us to do so.

 

(Interestingly, because our ancestors were far less shielded from this uncertainty, they were much more aware of it and so more sophisticated in dealing with it. We have indeed become dumber.)

 

In any event, this problem is trivial most of the time.

That is, until it’s not!

Because the most important events that have taken place, and that will take place, were and are completely unknowable before they happen, and we have an almost perfect record of getting them completely wrong!

But we never seem to learn this!

The hindsight bias helps our minds to believe that, at worst, we knew what we needed to know but failed to use it, and that we could therefore have predicted these things.

And so we continue to believe that we know more than we do, and that we can know more than we can, and therefore continue to be blind to the fact that we cannot know what we most need to know.

 

From whom we will fall in love with, to stock market movements, to match outcomes, to the time of our death, we cannot know that which we most want – even most need! – to know.

So now, having said all that, the point is as follows:

This human tendency to blindness is not limited to the future; it extends to all knowledge and, therefore, to all belief. We believe what we believe because we think we know the truth, but despite being right 99% of the time when it is trivial – thereby convincing ourselves of our omniscience – we fail to see that we are almost always wrong when it is not trivial but critical.

Our examples above, taken from the history of scientific progress, illustrate this.

This is why it is so important to appreciate the potential extent of what we do not know, as described in Part 2 of this article series, even though we cannot know the content.

 

As with the Confirmation Bias, our human nature conspires to make us stick with what we know, simply because it is easier – the cognitive load is lower, and it is emotionally safer. Thus we have seen that, even at the highest reaches of human intellectual endeavour, ignorance is indeed bliss!

 

This is true – and is a serious obstacle to discovering the truth of things – on its own, but add into the mix the fact that we are largely volitional creatures ruled by our desires and …

… what chance have we got?

Well, to help with this, we do have the powers of awareness and intention:

  • We can intentionally consider the possibility that there may be more to learn and to know than we are led to believe within our echo chambers.
  • We can intentionally remove ourselves from these echo chambers – even if only periodically – so as to commune with – and read books by – people who think differently from us and who may possess knowledge that we don’t. 
  • We can be intent on being aware that our beliefs are usually based on something other than just rational thinking, and that we are probably not as right as we think we are.
  • We can be intent on being aware that, the greater our confidence in being right about something, the higher the probability that we are wrong (illustrated by another cognitive bias, the Dunning–Kruger effect, whereby real experts are often much more critical of their own knowledge than amateurs are).
  • We can be intent on reminding ourselves regularly that history teaches us that we probably don’t know anything about the most important thing there is to know.
  • And we can make a determination to seek out and follow all the evidence we find, as far as we can know it, even though it makes for a much less comfortable existence.

Of course, human nature dictates that the vast majority will choose comfort and expediency over any search for Truth.

 

Further Reading:

  • The History of Science, Massimo Mazzotti
  • Thinking, Fast and Slow, Daniel Kahneman
  • The Black Swan, Nassim Taleb
