Why believing in conspiracy theories is wrong

Friday, May 07th, 2010 | Author:

I guess most people who believe in conspiracy theories either benefit somehow from pretending to believe, or they really think the theories are likely to be true. Those who think conspiracy theories are likely to be true are victims of some kind of "Bayesian fallacy":

Bayes (English mathematician, 1702-1761) proved a theorem about conditional probabilities, nowadays called "Bayes' theorem". Suppose there are two statements A and B, which might overlap (e.g. A = "it's raining today" and B = "it's raining the whole week"¹, where the truth of B implies the truth of A). Now imagine these statements are more or less likely, so we attach probabilities to them, P(A) and P(B), with values between 0 and 100% (or, for the mathematically oriented readers: let P be a probability measure on some discrete \sigma-algebra containing A and B). We are interested not only in the probabilities of A and B, but also in the conditional probability "how likely is A when B is true?", which we write P(A|B). Bayes' theorem now reads:
P(A|B)\cdot P(B) = P(B | A)\cdot P(A), and this means in words, that the probability of A under the condition that B is true, multiplied by the probability of B, is the same as the probability of B under the condition that A is true, multiplied by the probability of A.
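The theorem is easy to check numerically. Here is a minimal sketch using the rain example, with made-up numbers for the joint distribution (chosen only so that "whole week" implies "today"):

```python
# Check Bayes' theorem P(A|B)*P(B) = P(B|A)*P(A) with made-up numbers.
# A = "it's raining today", B = "it's raining the whole week".
p_a_and_b = 0.10   # rains today AND rains the whole week
p_a_not_b = 0.20   # rains today, but not the whole week
p_b_not_a = 0.00   # "whole week" implies "today", so this must be zero
p_neither = 0.70

p_a = p_a_and_b + p_a_not_b   # P(A) = 0.30
p_b = p_a_and_b + p_b_not_a   # P(B) = 0.10

p_a_given_b = p_a_and_b / p_b  # = 1: if it rains all week, it rains today
p_b_given_a = p_a_and_b / p_a  # = 1/3

# Both sides of Bayes' theorem are just the joint probability P(A and B):
assert abs(p_a_given_b * p_b - p_b_given_a * p_a) < 1e-12
print(p_a_given_b, p_b_given_a)
```

Both sides of the equation are nothing but the probability that A and B hold simultaneously, which is why the theorem is true.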

Let me put this in context. Let A be the statement "There will be a big volcano eruption in 2010" and let B be the statement "Someone predicted that there will be a big volcano eruption in 2010". Then we can talk about the probabilities of A and B (although we don't know them exactly) and about the conditional probabilities: how likely the eruption is, given that someone predicted it, and how likely it is that someone predicted it, given that it happens. If we believe that predicting volcano eruptions is possible, then we think that the conditional probability that it happens, given that someone predicted it, is higher than the unconditional probability that it happens. Looking at Bayes' formula, we see
P(A|B) = P(B|A)\cdot P(A) \cdot \frac{1}{P(B)}, which tells us in words that the probability of a volcano eruption, given that someone predicted it, is proportional to the probability of a volcano eruption and inversely proportional to the probability of someone predicting it. We also see that the probability of an eruption given that someone predicted it exceeds the unconditional probability of an eruption if and only if \frac{P(B|A)}{P(B)} > 1, that is, only if the probability of someone predicting the eruption is strictly smaller than the probability of someone predicting it under the condition that it happens.

Now you might know that there are some ways to predict volcano eruptions (I'm no expert). So the probability that someone predicts one, given that it happens, is relatively high; but since someone claims to forecast volcano eruptions every year (whether one happens or not), the unconditional probability of someone predicting a volcano eruption for this year is essentially 100%. So we can't infer that a volcano eruption is likely just because someone predicted one.
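The argument above can be sketched as a toy simulation. The base rate below is a made-up number; the point is only that when a prediction is issued every year regardless, P(B) = 1 and the prediction carries no information:

```python
import random

random.seed(1)

P_ERUPTION = 0.05  # made-up base rate of a big eruption in a given year
years = 100_000

eruption_years = 0
predicted_years = 0
predicted_and_erupted = 0

for _ in range(years):
    erupts = random.random() < P_ERUPTION
    predicted = True  # someone predicts an eruption every year, no matter what
    eruption_years += erupts
    predicted_years += predicted
    predicted_and_erupted += erupts and predicted

p_a = eruption_years / years                        # estimate of P(A)
p_a_given_b = predicted_and_erupted / predicted_years  # estimate of P(A|B)

# With P(B) = 1, the conditional and unconditional probabilities coincide:
print(round(p_a, 3), round(p_a_given_b, 3))
```

Conditioning on a prediction that happens with certainty changes nothing: the two printed numbers are identical.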

Substitute volcano eruptions with your favourite Doomsday scenario and choose some arbitrary probability for it. The probability of someone predicting this scenario is close to 100%, and therefore you can't infer that it's likely to happen just because someone told you so.

Substitute volcano eruptions with a war in the Middle East, and someone predicting it with an oil company doing business there after the war. If we realise that oil companies are quite likely to do business in oil-rich countries, and even more likely if there is no war going on, then we see (via Bayes' theorem) that it's not likely that the war was started just because of the oil business.

I don't want to say that conspiracies don't exist or that there are no wars about resources (like oil). I just want to point out that in each case, one has to find more evidence and stronger arguments than mere coincidence of events. Test your argument against Bayes' theorem!
If someone tells you his latest conspiracy theory, you might think: "it might be true or false, but I can't prove him wrong, and the probability that he's right is not zero". This is not a good response. Instead, you should always ask: "and why don't you think it's all just coincidence and happened by chance?"². This hypothesis will save you from the Bayesian fallacy.

You can use Bayes' theorem to strengthen your arguments: if for two events A and B the conditional probability P(B|A) is really greater than the unconditional probability P(B), then the probability P(A|B) is strictly greater than the probability P(A), which means that from observing B you can infer that A is more likely than before. This is called "Bayesian inference", and it's really important, for example, for finding out which medical treatments do more good than harm.
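A standard worked example of Bayesian inference is a diagnostic test; the prevalence and test accuracies below are made-up numbers, chosen only for illustration. Let A be "the patient has the disease" and B be "the test is positive":

```python
# Bayesian inference with made-up numbers for a diagnostic test.
p_disease = 0.01            # P(A): prevalence of the disease
p_pos_given_disease = 0.95  # P(B|A): sensitivity of the test
p_pos_given_healthy = 0.05  # false positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))  # about 0.059

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

# P(B|A) = 0.95 is much larger than P(B), so a positive test raises the
# probability of disease well above the 1% base rate:
assert p_disease_given_pos > p_disease
print(round(p_disease_given_pos, 3))
```

Here the prediction (a positive test) really is informative, because P(B|A)/P(B) is much greater than 1; still, the posterior probability is only about 16%, far from certainty, which is exactly the kind of correction Bayesian inference provides.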

If you want to know more about argumentative fallacies of a similar kind, take a look at this paper (Khalil 2008), which I found by googling for "Bayesian fallacy", although the author uses these words (completely) differently.

¹ - by the way, it has been raining the whole week here in Freiburg...

² - If people don't like the thought that something happens "by chance", they might not understand how order arises from chaos. This is another problem (which causes a lot of confusion), which I want to discuss separately (later).


Category: English, Not Mathematics
