There is so much our brains do well that we tend to think of them as “perfect”. Give our brains good information, and we will come to a good conclusion.
But one of the things we’ve learned is that that’s far from true. We have many biases, ways in which our brains sabotage our ability to reason correctly. Some of these most people can intuit on their own (we all know that strong emotions toy with our ability to make good decisions), but the truth is that most of the time our brains are leading us down the wrong road and we don’t even know it.
I’m going to give you an example. Let’s say I told you that on days when more ice cream is sold, more people die. Would you believe me?
Well, believe it or not, it’s true. Indeed, if you tell me how much ice cream was sold on any given day, I can give you a very good estimate of how many people died.
Now this doesn’t make much sense to people. Indeed, most reject it as a hypothetical or a fabrication (it’s neither). I mean, how does ice cream kill people?
But therein lies the problem. I didn’t say ice cream is to blame. I said that when we see one variable (ice cream sales) go up, so does another (deaths). But people immediately assume I’m saying that one (usually the first) causes the other. I neither said nor even implied that. I said that these two figures were correlated, that they are tied together somehow. But that they’re correlated doesn’t tell us how they are related. Indeed, I could reverse the statement: tell me how many people died on any given day, and I can successfully estimate how much ice cream was sold. (This makes a little more sense to people. Maybe higher death rates are causing people to buy ice cream to soothe themselves. Who knows!? But logically speaking it’s identical to the first statement, so it shouldn’t make more or less sense to you. The statements are the same: as one goes up, so does the other.)
One of the biggest problems in science is when people see two figures that are correlated and naturally assume that one (usually the first one mentioned) causes the second (cum hoc ergo propter hoc in Latin). Unfortunately, the media plays into this all the time, creating misunderstandings of scientific research. A paper will come out saying that people who eat chocolate live longer, and the newspaper, in an effort to boost sales, says “Chocolate helps you live longer.” But the paper never said that, and if chocolate is later found to be bad for you, people say, “Why can’t science make up its mind?” It’s quite sad to see, and I see it almost every day. Scientific studies are very careful to avoid this error and to say only what they can prove, which more often than not is a correlation.
So why do more people die on days when more ice cream is sold? Well, it turns out that there’s a third variable which must be isolated. Neither ice cream sales nor death rates affect each other; instead, that third variable affects both in the same way.
On hotter days, more people go outside. They go to the beach, play sports, and go out for a night on the town. This means that there are more drownings, car accidents, deaths from heat stroke, homicides, robberies, and so on.
It also means ice cream sales go up.
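You can see the hidden-variable effect in a simple simulation. Here is a minimal sketch in Python, using entirely made-up numbers: temperature drives both ice cream sales and deaths, the two never touch each other, and yet they come out strongly correlated. Restricting the data to days of roughly the same temperature (one crude way of isolating the third variable) makes the correlation largely disappear.

```python
import random

random.seed(42)

# Hypothetical data: a hidden variable (temperature) drives both ice cream
# sales and deaths. Neither causes the other.
days = 365
temps = [random.gauss(20, 8) for _ in range(days)]  # daily temperature, in C

# Both outcomes depend on temperature plus independent random noise.
ice_cream = [50 + 3.0 * t + random.gauss(0, 10) for t in temps]
deaths = [100 + 0.5 * t + random.gauss(0, 3) for t in temps]

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Raw correlation: strongly positive, even though neither causes the other.
print("all days:", correlation(ice_cream, deaths))

# Hold temperature roughly fixed by looking only at mild days (18-22 C).
mild = [(i, d) for i, d, t in zip(ice_cream, deaths, temps) if 18 <= t <= 22]
mild_ice = [i for i, _ in mild]
mild_deaths = [d for _, d in mild]

# With the third variable controlled, the correlation largely vanishes.
print("mild days only:", correlation(mild_ice, mild_deaths))
```

The point of the second print is the whole argument in miniature: the correlation was real, but it lived entirely in the variable we weren’t looking at.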
When Christopher Hitchens was diagnosed with throat cancer, it was, to theists, proof of God’s wrath. Hitchens blasphemed with his voice, and God would now kill him through the implement of his wickedness. It certainly seemed like God. But if there were any truth to the idea that God strikes down people like Hitchens for their work, you’d expect atheists to have a higher mortality rate and lower life expectancy. They don’t. And Christopher Hitchens smoked and drank prodigiously, placing him in the highest risk group for just such a cancer. Plenty of ministers get throat cancer too, at rates no higher or lower than anyone else’s. Lastly, it might be worth pointing out the cause of his own father’s death: throat cancer, the very same kind.
It can be very, very tempting, when seeing two correlated numbers (like violence and video games, chocolate or wine and longevity, or diet and birth defects), to assume that you can correctly guess the causal link. Unfortunately, this plays into your own bias, what you already believe to be true, which makes it unscientific. Causality can certainly be demonstrated in experiments (by isolating and controlling other variables), but you must always look out for the temptation to think that correlation implies causality. It does not.