Monday, July 13, 2009

Gambler's Fallacy.

After reading The Drunkard’s Walk I became even more aware of some of the logical fallacies that people use to support their beliefs and opinions. Last weekend I had a conversation with a few friends, and I was surprised that three of them were guilty of the gambler’s fallacy: the belief that after a string of one type of result you become “due” for the other. It’s called the gambler’s fallacy because many gamblers believe that after a string of bad luck the odds have to start tipping in their favor. In reality, each event is completely independent of the ones that came before it. If I flip a fair coin, the chance that it lands heads is 1 in 2. Now suppose I have already flipped 5 heads in a row. Does the coin somehow “know” that it is no longer supposed to land heads with a 1-in-2 chance? That is the most basic example, but I’ve heard highly educated people, who should have a firmer grasp of statistics than the average gambler, make this same error in logic. Sometimes it’s with a sports player who hasn’t been performing as well as he should; fans believe he is “due” for some good luck.
The conversation over the weekend centered on hurricanes. Many of us had gone to Florida and the Gulf to help rebuild after the 2004 and 2005 seasons. My friends felt that since the last few years had shown below-average hurricane activity, we were “due” for a bad season. But just as the coin has no record of how the previous flips turned out, the weather doesn’t keep track of previous years’ hurricane counts. Now, I support their position that we should be prepared to go down and assist again if those folks need it. I just don’t agree with the logic they used to reach that conclusion.
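The coin’s lack of memory is easy to check for yourself. Here’s a quick simulation sketch in Python (the function name and trial count are my own choices): it generates a long series of fair flips and measures how often heads follows a run of 5 heads.

```python
import random

random.seed(0)  # fixed seed so the demo is reproducible

def heads_after_streak(streak_len=5, flips=200_000):
    """Estimate P(heads on the next flip) given that the previous
    streak_len flips were all heads, by straight simulation."""
    hits = chances = 0
    streak = 0  # current run of consecutive heads
    for _ in range(flips):
        heads = random.random() < 0.5
        if streak >= streak_len:   # this flip follows a 5-heads run
            chances += 1
            hits += heads
        streak = streak + 1 if heads else 0
    return hits / chances

print(heads_after_streak())  # hovers around 0.5 -- the streak changes nothing
```

However many heads just came up, the estimate stays near one half, which is the whole point: the previous flips carry no information about the next one.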


  1. Anonymous 4:47 PM

    Hi Mike,

    I'm bored at work and thought I'd argue w/your math, hence:

    Statistics question: How can I calculate the odds of flipping a coin to get x consecutive heads anywhere within a data set of y flips.

    Here's the background: I was attempting to explain the phenomenon that unlikely outcomes become more likely to occur as the number of total events increases, to someone who attributes those unlikely outcomes to evidence of "unseen forces".
    So my response was along the lines that if you flip a coin ten times, the odds of flipping ten heads are very slim (1023 to 1 against, I believe), but at some larger number of flips (N), the odds of having ten consecutive heads are even (1:1), and at some yet larger number of flips (M) the odds of not having ten consecutive heads is 1023 to 1 against.
    1. How do I calculate N and M?

    Hey Mike, I didn't write any of the above but I knew enough of the math of odds to know you were leaving out some key numerical data - that is, your chances to flip heads DO improve.

    Anyway, check out the NE ridge of Spearhead in Rocky Mtn NP - that's what the wife and I climbed last Sunday.
    -Allen Burch

  2. I totally agree that the odds of extreme things happening increase the more chances you have.
    But each individual coin toss is still 50/50.