Friday, 27 December 2013

Critical Thinking Skills

This is the script for a video I made for Z-Day 2014 in London... 

When I was in high school I was particularly good at mathematics competitions.  I never had to study for them; I just rocked up, answered the mathematical problems and got more correct answers than most of the others.  It gave me an over-inflated opinion of my ability to think.

One might believe that if I was good at mathematics, then I had a good reason to believe that I was good at thinking, but this belief was challenged one day when I spoke to my uncle, an intelligent cancer researcher, about a game of chess.  Having learnt that he was good at chess, I suggested that we have a game, but my uncle's next question surprised me, "Which books have you read about chess?"

"Books on chess?"  I had never even considered the possibility that there were books on playing chess.  I was smart.  Why would I need to read a book about something that simply required intelligence?

When I replied that I had never read any books on chess, my uncle replied that it would be pointless playing against him.  Chess books have been written by people with decades of experience and insight into the best chess thinking strategies on the planet.  If he knew these and I didn't, there would be no contest.  I would, plain and simply, lose.

The same goes for thinking.  Many people around the world think that their brains are somehow superior to those of people who disagree with them.  They believe that their political system of choice is the right one, that their religion is the truth, and that anyone who believes anything else is wrong.

But let's say that a member of the local mind-control cult knocks on your door to tell you about the one and only true religion, which they happen to be lucky enough to believe in, and you turn to them and ask, "Which books have you read about thinking?"  I'd be quite happy to bet that they would say "None."  Of course, this doesn't just apply to them.  Unfortunately it is rare to find people who have read books on thinking, even though I think it is a topic we should all be educated in.

There are books which have been written, and methods which have been developed, which are the product of centuries of thinking, and chances are that if we have not read this information and do not use these methods when we think, then we are nowhere near as good at thinking as we could be.  In a contest of thinking ability, if there were such a thing, those who have never learned thinking skills would, plain and simply, lose.

The first book that I read on thinking was called "Software for your brain."  It's a free download, and quite easy to find if you Google "Software for your brain."  It's written by Dr Michael Hewitt-Gleeson, who is also the founder and principal of the School of Thinking, a free, online school.

A revised version of the book, called "English Thinking, The Three Methods," is now available in Kindle format.  I'd like to share my favourite section of English Thinking, because of the simple and obvious way that it makes a point with examples of what some followers have said about their religions:

Christian Science:
… is unerring and Divine … outside of Christian Science all is vague and hypothetical, the opposite of Truth.

Seventh-Day Adventists:
The General Conference of Seventh-Day Adventists is the highest authority that God has upon earth.

Jehovah’s Witnesses:
… alone are God’s true people, and all others without exception are followers of the Devil … At Armageddon all of earth’s inhabitants, except Jehovah’s Witnesses, will be wiped out of existence.

The Church of Jesus Christ of the Latter-Day Saints:
There is no salvation outside the Church of Jesus Christ of the Latter-Day Saints … everybody, unless they repent and work righteousness, will be damned except Mormons.

Christadelphians:
None but Christadelphians can be saved.

Islam:
Mohammed is the messenger of God … the last, and final exponent of God’s mind, the seal of the prophets.

The Divine Light Mission:
The Guru Marahaj Ji alone has the key to the knowledge of the source of God.

The Unification Church:
Only the Lord of the Second Advent, the Reverend Sun Myung Moon, will be powerful enough to complete the restoration of man to God.

Krishna Consciousness:
Direct love for the Lord Krishna, in the form of chanting, singing and dancing, is the best way to rid the soul of ignorance.

Church of Scientology:
It is only through the exercise of the principles of Dianetics that there is real hope for happiness in this lifetime and the eventual freeing of the soul from death.

The Children of God:
No power in the world can stand against the power of David. (This refers to David Berg, the sect’s leader.)

Catholic Church:
No one can be saved without that faith which the Holy, Catholic, Apostolic, Roman Church holds, believes and teaches … the One True Church established on earth by Jesus Christ … to whom alone it belongs to judge the meaning and interpretation of the Holy Scriptures.

It's quotes like these that make me dislike the word "truth."  One might be forgiven for thinking that, if so many of us are deluded, no-one can really know what the truth is, which is why a better method of thinking is required.  Instead of thinking in terms of true or false, right or wrong, black or white, it's more useful to think in terms of the evidence.

For example, before Facebook existed, urban legends and other nonsense used to be passed around by word of mouth, and later by email.  One day I received an email warning the reader that certain shampoos contained an ingredient which was responsible for cancer.  I had a few options:

  • I could believe it to be true and send it to everyone else in my address book, thus saving their lives
  • I could believe it to be a lie, and delete it, replying back to the person who sent it to me and telling them not to send me such nonsense
  • I could admit that I did not have the evidence to know whether or not it was true, contact my uncle, the cancer researcher, and find out what he had to say.
So I forwarded it on to my uncle and expected the reply to say that it was either true, or false.  It didn't.

Instead of true or false, it said something like this:

"This chemical was tested on one thousand rats.  No tumours were found."

This is the way science works; it might not tell us true or false, but simply states the evidence.  Had my uncle told me the email was false, he would have been dishonest.  He did not have any proof that the chemical didn't cause cancer, merely evidence that, when tested, no tumours were found.  There's always the possibility that, had they tested it on a thousand humans, or on one million rats, they would have gotten a different result.  So, in this case, we can say that it is highly unlikely that the chemical causes cancer.  The email, on the other hand, did not say why its authors believed that the chemical caused cancer, so their claim did not have a leg to stand on.
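The "no tumours in one thousand rats" result can even be turned into a rough number.  Here's a quick sketch using the statisticians' "rule of three," which is my addition, not something mentioned in the original exchange:

```python
def rule_of_three_upper_bound(n_trials: int) -> float:
    """If zero events are observed in n independent trials, a common
    quick estimate of the 95% upper bound on the true event rate
    is roughly 3/n (the 'rule of three')."""
    return 3.0 / n_trials

rats = 1000
print(f"0 tumours in {rats} rats -> tumour rate likely below "
      f"{rule_of_three_upper_bound(rats):.3%}")
```

Note that this still doesn't say "false"; it just puts an approximate ceiling on how common the effect could plausibly be, given the evidence.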

In science and medicine we have a term called "The Scientific Method."  It is the best way that I know of for determining whether a hypothesis is likely or unlikely.

The term is mentioned in The Zeitgeist Movement's mission statement:  "...the defining goal here is the installation of a new socioeconomic model based upon technically responsible Resource Management, Allocation and Distribution through what would be considered The Scientific Method of reasoning problems and finding optimized solutions."

So, what is The Scientific Method?  In theory, it contains steps like these:
  • Define a question, for example, I might ask, "Is the sky always blue?"
  • Gather information and resources (observe).
  • Form an explanatory hypothesis.  A hypothesis could be an if-then statement; for example: if the sky is always blue, then I will see blue whenever I look at the sky.
  • Test the hypothesis by performing an experiment and collecting data in a reproducible manner
  • Analyze the data
  • Interpret the data and draw conclusions that serve as a starting point for new hypotheses
  • Publish results
  • Retest (frequently done by other scientists)
Probably the most important element of the scientific method is testing.  I'm a programmer, and testing forms an essential part of the work that we do.  I wish that everyone could work in a job where testing is required, in order to understand how easy it is for humans to make mistakes, how often we do so, and how essential testing is.  In my job, our programs go through about five stages of testing and two reviews before being used by the public.  Even once that's done, problems are still found and patches have to be developed.
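To give a flavour of what that testing looks like, here's a tiny, hypothetical example in Python (the function and values are my own, not from any real project): each assert is an attempt to prove the code wrong.

```python
# A tiny, hypothetical example of the kind of testing programmers rely on.

def average(numbers):
    """Return the arithmetic mean of a non-empty list of numbers."""
    return sum(numbers) / len(numbers)

# Tests: if any assumption is wrong, an assert fails loudly,
# long before the code reaches the public.
assert average([2, 4, 6]) == 4
assert average([5]) == 5
assert average([-1, 1]) == 0
print("all tests passed")
```

Even a three-line function gets several tests, because experience shows we make mistakes far more often than we expect.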

Let's say that I'm using the scientific method to test my hypothesis that the sky is always blue, and I do everything except for the last two steps: I don't publish my results, and I don't retest.  I might be deluded into thinking that the sky is always blue.  However, once I've published my results, other scientists can come back to me and say, "Well, we tried this test at night time, and the sky was black."  As soon as that happens, the hypothesis has been disproved, and one has a definite answer to one's question: the sky is not always blue.  Alternatively I could ask a different question, like, "Is the sky always blue during the day?"
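The sky example above can be sketched in a few lines of code (the observation data here is my own toy example): a hypothesis survives only until a single counter-observation appears.

```python
# A minimal sketch of falsification: the hypothesis "the sky is
# always blue" survives only until one counter-observation.

def hypothesis_holds(sky_colour):
    """Hypothesis: the sky is always blue."""
    return sky_colour == "blue"

# Observations published and retested by others (day, sunset, night).
observations = ["blue", "blue", "orange", "black"]

counterexamples = [obs for obs in observations if not hypothesis_holds(obs)]

if counterexamples:
    print("Hypothesis disproved by:", counterexamples)
else:
    print("Hypothesis survives, for now: not proven, just not yet disproved")
```

Notice the asymmetry: one counterexample disproves the hypothesis, but no number of blue observations proves it.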

I like to think of the scientific method in terms of a brick and a hammer, where the brick is the hypothesis and the hammer is the evidence used to try to smash the brick.  

If the evidence is strong enough to destroy the hypothesis, then the hypothesis is false.

If the hypothesis is strong enough, then one needs to keep trying using different hammers (or evidence).

If one cannot destroy it by oneself, one should get as many other people as one can to give it a go.  It's often very easy to tell that an extraordinary "scientific" claim is not true, because the person making the claim has skipped this step.  If, when you actually look up the theory on the internet, you find that no-one else was able to reproduce the experiment, or that those who disagree with the claimant have done far more research than those who believe him, then you can be pretty confident that the claim is bogus.

Often one cannot prove that the brick is indestructible.  In other words, one cannot prove that a hypothesis is true, but if it survives long enough, then it can be treated as a viable premise and used to support other bricks, forming the basis of other theories.  This is like placing bricks on top of one another to form a wall.

In the example of a wall, the bottom bricks could be called the premises and the bricks above them could be called the argument.  Here's an example of a valid argument.

But if a hypothesis eventually does get destroyed then the entire wall above it falls down.

Here's an example of an invalid argument, since one of the premises is invalid.

You can also tell an argument is flawed when you draw a diagram to try to understand an argument and instead of getting a wall, you end up with spaghetti.  Albert Einstein once said, "If you can't explain it simply, you don't understand it well enough."  There is also the possibility that, if one's argument looks like spaghetti, it might just be nonsense.

Sometimes there are valid, complex arguments, and some arguments are too detailed to be understood by the human brain, like predicting the weather, or determining the origin of a star. Such arguments need to be simulated on computers.  I would love for a simulation of a resource based economy to be developed.  Such a simulation would be very useful for understanding its effectiveness.

Most people think in the opposite way to the scientific method.  Instead of forming a hypothesis and finding ways to disprove it, it's a lot easier to believe something which sounds good, and then try to find evidence to support that belief.  An example is a religious person who is a member of a religion simply because he grew up in it, and who spends his time reading books written by that religion instead of reading anything by anyone opposed to it.  Another example could be someone who hates the concepts of money and politics because he's read bad things about them, but has never tried to find out whether the concept of money has any value, and has never read any books on politics, so he doesn't actually understand them properly.  Instead he should learn about both the good and the bad sides of money and politics.  We should try to be unbiased, learning about both sides of a story; otherwise we are not being scientific, but religious.

Human Fallibilities

Cognitive Dissonance

My favourite big word in the critical thinking books is "cognitive dissonance."  I throw it around whenever I want to sound intelligent, because I don't know many big words.  I like to explain it as the uncomfortable feeling one has when one comes into contact with information which contradicts one's beliefs.

An example could be someone who's taken over the family healing business.  He sells all sorts of fancy remedies and is quite proud of his ability to help people.  He's against all established medical and pharmaceutical companies, because he's been told that they're just out to make money and that their products have all sorts of side effects that his don't have.  Then one day he reads an article about the scientific method, reporting that when it was used to test his remedies, the scientists discovered that nothing he sold did what it was supposed to do.

At that point he feels very discouraged, because the information conflicts with his beliefs and if he were to take the information seriously, he would be admitting that he's a phony and would have to give up his family business.  That's a very large dose of cognitive dissonance.  What people tend to do when they experience cognitive dissonance is stick to their existing beliefs and make excuses for them.  It's very difficult to be unbiased when experiencing cognitive dissonance.  He will most likely tell himself something like, "There must have been a mistake in the tests.  I've dealt with many people who are happy with my products."  Then he'll throw the article away and continue to deceive both himself and his customers.

In medical testing, the scientific method is used, and there is a very important technique for counteracting the placebo effect, biased testers and biased subjects.  This technique is called a double-blind test.

In case you don't know about the placebo effect, it's a psychological effect that occurs when one thinks one is being treated.  So, for example, someone with depression might feel better if they're taking pills, regardless of what's actually in the pills.  The pills could simply contain water or sugar, but the person taking them might still believe that they work.

In a double-blind test, real medicine and placebos are given to test subjects, and neither the person administering the medicine nor the person receiving it knows who is getting the real medicine and who is getting the placebo.
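The mechanics of blinding can be sketched in code (the subject names and group sizes here are invented for illustration): the assignment is sealed away, everyone sees only identical unlabelled pills, and the groups are compared only after all outcomes are recorded.

```python
# A toy sketch of a double-blind assignment, with invented data.
import random

random.seed(1)  # fixed seed so the sketch is reproducible

subjects = [f"subject-{i}" for i in range(10)]

# Sealed assignment: generated by the computer, revealed to no-one
# until the trial ends.
sealed_assignment = {s: random.choice(["medicine", "placebo"])
                     for s in subjects}

# What both the administrator and the subjects see during the trial:
# identical, unlabelled pills.
visible_during_trial = {s: "unlabelled pill" for s in subjects}

# After all outcomes are recorded, unblind and compare the groups.
medicine_group = [s for s, a in sealed_assignment.items() if a == "medicine"]
placebo_group = [s for s, a in sealed_assignment.items() if a == "placebo"]
print(len(medicine_group) + len(placebo_group))  # every subject is in exactly one group
```

Because neither side knows who got what, neither wishful thinking by the subjects nor bias from the testers can systematically favour the medicine.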

Let's say our friend who took over the family business continues to operate, but now he has a little bit of doubt in his mind, so he asks one of his customers what their experience has been.  Why does that customer keep coming back if his medicine doesn't work?  "Well," the customer replies, "I took one of your green tea pills and I was instantly cured of my flu, which I got after having the flu shot.  Every doctor I've spoken to says there's no cure for flu, so to hell with them.  I buy from you every time."  The business owner is now satisfied that his pills do work.

But let's analyse what actually happened:

Firstly, the customer never had the flu.  He actually had a cold, which is caused by a completely different virus and is not prevented by the flu shot.  I find it interesting that many people believe they have the flu when in fact they have a cold.  They think that if they have the sniffles then they have a cold, but if they're coughing then it's flu.  Of course, that's just what they've been told; if they actually thought to look it up, they'd find that there's only about a one in ten chance of getting the flu in any given year.

Another interesting point about colds and flu is that they're not caused by running around without your socks on.  Colds are caused by a virus which is more common in winter than in summer, and are not caused by feeling cold.  I hope some of you are thinking, "No, that's not true... my parents always told me to wear my socks and I tell my children the same thing."  Hopefully you're experiencing some cognitive dissonance; some discomfort at hearing things which contradict your beliefs.  So obviously, the appropriate thing to do, if you disagree with what I'm saying is to look it up with an open mind and find out for yourself.

So, the customer happened to have a cold, which was just about to end anyway, because his immune system was fighting it.  When he took the green tea pill, the placebo effect made him feel better.  He got out of bed, got some exercise and sunshine, which also helped him feel better and by the next day the virus was defeated because of his immune system.

In his mind it seemed like the pill had worked, however, he would have had the same results by having a shower, doing yoga, or even picking his nose if he thought that would help.  The next time that he got a cold he took the same pill, and of course it didn't work, but because he already believed that the pill would work, he had a small dose of cognitive dissonance and made excuses, like perhaps he had a very strong flu, or perhaps he had a second batch of flu, or he had eaten food with the pill, or whatever.

Eventually he forgets about the second time that he took the pill and only remembers the time he thought it cured him.

This is an example of something called "anecdotal evidence."  Anecdotal evidence is evidence from a single instance or person.  In real science anecdotal evidence is not used.  Hundreds or thousands of samples need to be used in order to get proper statistics.
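A toy simulation makes the point about sample sizes concrete.  The numbers below are my own assumptions, not from any real study: a pill that does nothing, and a 50% chance of recovering on one's own either way.

```python
# Why anecdotes mislead: one patient's result is all-or-nothing,
# while thousands of patients reveal the true recovery rate.
import random

random.seed(42)

def recovery_rate(n_patients):
    """Simulate n patients taking a useless pill; each recovers
    on their own with probability 0.5."""
    recovered = sum(random.random() < 0.5 for _ in range(n_patients))
    return recovered / n_patients

print("1 patient:", recovery_rate(1))            # 0.0 or 1.0 - an anecdote
print("10,000 patients:", recovery_rate(10_000))  # close to the true 0.5
```

With one patient, the "result" is pure chance; with ten thousand, the useless pill is exposed, because the recovery rate converges on the rate you'd see with no pill at all.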

Believing something based on anecdotal evidence is an example of a logical fallacy.  There are many logical fallacies, so I won't cover them here, but it's very important to be aware of them and to understand the important ones.  You can read about them in a good book about critical thinking skills.  There's also a good audio book which explains many logical fallacies, called "Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills" by Steven Novella.

There's also a video of a very good talk on logical fallacies, done at Z-Day 2013 by Matt Berkowitz, which got a very big round of applause at the end.

There's a nice, neat little website which lists twenty-four common logical fallacies.
I imagine that the idea is that whenever someone argues with you and uses a logical fallacy, you can simply respond by naming it: [insert your logical fallacy here].  Logical fallacies are also very cool, because they are big words that most people don't understand, and make one sound intelligent :)

So if someone says to me, "You can't tell me about critical thinking... you're not a professor and you smell funny," I can simply reply, "Ad hominem!"


Another area in which humans are very good at making thinking errors without knowing it, is their memory.

A study was done in which people in the UK were asked if they had seen the CCTV footage of the 7th of July 2005 bus bombing in London.  Eighty-four percent of the participants said that they had.  The interesting thing about the study is that the CCTV footage doesn't exist.

Our minds are not video recorders, recording precisely what we see and hear.  Instead our imagination fills in the gaps.

In our minds, everything can feel very clear and organized.  We feel like we know so much, but we could easily have misinterpreted statistics, been swayed by emotion, or simply imagined a memory without realizing it.

Open Mindedness

I consider the most important trait of a good critical thinker to be open-mindedness.  One needs to be aware that, statistically, it is very likely that there is a lot of nonsense stored inside our heads.  When our beliefs are challenged, the normal reaction is cognitive dissonance and rejection of anything that contradicts our precious beliefs.  That is being closed-minded.  We need to have open minds, so that instead of not thinking and reacting on instinct by saying, "You're talking nonsense," we say, "That's interesting.  I believe that the sky is blue.  Why do you think that it is red?"  Then we should evaluate the response, determine whether or not the answer is appropriate, remove any nonsense from our minds and replace it with whatever is more probable.


One thing that makes it difficult for people to be open minded is assigning themselves a label.  For example, calling oneself an atheist, or a Christian or a socialist or a capitalist makes it difficult to be open minded, because people tend to try to be consistent with their identity.  If someone tries to talk to an atheist about a deity, they will be more inclined to say something like "I don't believe you.  I'm an atheist," rather than a more open minded approach like, "That's interesting.  What makes you think that a deity exists?"  Or "What method did you use to come to that conclusion?"

Instead of having a label, or identity that someone else has come up with, I'd like to propose the following... always try to be better.  If you have to have a label or identity, use words that allow you to improve.  Like if you have to label your philosophy, you could say "kaizen," the Japanese word for continuous improvement, or if you want to have a religion, you could call yourself a "truth seeker," or if you have to have a political point of view, it can be "a happiness promoter."

In my time in the Zeitgeist Movement, I've come across members who are great thinkers, and some who believe anything, without any critical thinking or investigation; people who are willing to listen to criticism, and those who get very angry when anyone disagrees with them.  I'd love to think of the Zeitgeist Movement as being scientific; experts in analysing evidence.  I'd love it if, whenever people heard the name Zeitgeist Movement, they'd think, "Oh yes, that's that scientific group," or "That's that group of people who really understand evidence; that group that is extremely thorough and knowledgeable, who don't just believe things, but are unbiased and methodical in their thinking."

I would really love for everyone watching this video to read books on thinking and critical thinking.  The information I've shared is simply an introduction to these topics and I have left out lots of information intentionally in order to keep this video short.  Learning about critical thinking can help you to become a more effective communicator, it can help you understand which kinds of statistics are valid and invalid, why we make so many mistakes, which types of medicines are worth taking, which conspiracy theories are improbable and whether or not global warming is being caused by human behaviour.  Having good thinking skills can give you more appropriate goals and better ways of accomplishing them.  I consider learning about thinking to be one of the most important things that people can learn to combat all the nonsense that one hears and reads.

I'll repeat the authors I've mentioned before:

Michael Hewitt-Gleeson
Steven Novella

Their books are a good place to start.  Once you've read them and have a good understanding of critical thinking, you can continue learning about related subjects, like:

Cognitive dissonance.  The man who came up with the theory of cognitive dissonance, Leon Festinger, also released an interesting book called "When Prophecy Fails," in which researchers join a flying-saucer apocalyptic cult and observe the behaviour of its members.

Nonsense on Stilts was written by Massimo Pigliucci, and covers a variety of topics that people are skeptical about.

Of course reading good books is very important, but if one just reads and doesn't apply the information to one's own thought processes, then it's not very helpful.  So I'd like to end with the following thoughts:
  • "Believe those who seek the truth.  Doubt those who find it." - Andre Gide
  • "Extraordinary claims require extraordinary evidence." - Carl Sagan
  • Be open minded, considering your beliefs to be temporary until better beliefs replace them.  
  • Try to disprove your own beliefs.  
  • Be honest with yourself.
Written by Stephen Oberauer
1. Chess board courtesy of GiniMiniGi at stock.xchng
2. Books courtesy of juliaf at stock.xchng
3. Priest courtesy of tome213 at stock.xchng
4. Mormon temple courtesy of kbeard1 at stock.xchng
5. Krishna courtesy of asifthebes at stock.xchng
6. Shampoo bottle courtesy of nezabarom at stock.xchng
7. Beakers courtesy of 123dan321 at stock.xchng
8. Vial courtesy of barky at stock.xchng
9. Blinded heads courtesy of Victor Habbick at
10. Flu virus courtesy of vectorolie at
11. Search for person courtesy of Master isolated images at
12. Open minded courtesy of chanpipat at
13. Employee badge courtesy of sixninepixels at
14. Spaghetti courtesy of tiverylucky at
15. Pyramid courtesy of irum at stock.xchng
