I understand the Gambler's Fallacy, but I've been working on a little self-imposed probability question, and my answer seems to fall into the Gambler's Fallacy of a number being "due". The math seems sound, though, so I don't know whether my misunderstanding is with the fallacy, the math, or something else.
Say I have a true random number generator that generates ("rolls") a number between one and a hundred. Let's say I want to get some specific number (or one of a set of specific numbers; it doesn't really matter, it just changes the numerator); we'll say 100 for simplicity. I don't care how long it takes, I just want to get that number.
So the chance of rolling 100 on any one roll is 1/100. This means there's a 99/100 chance of NOT rolling a 100.
Now, here's where I can't tell if I'm going wrong mathematically. I want to know how much time to set aside to roll my 100, and it seems to me that this logic is sound:
The chance of me rolling two 100's in a row is (1/100)[sup]2[/sup]. By the same reasoning, the chance of me NOT rolling a 100 two times in a row is (99/100)[sup]2[/sup].
The chance of me rolling at least one 100 within two rolls is therefore 1 - (99/100)[sup]2[/sup].
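(If it helps, here's a quick check of that complement rule in Python; the trial count is just a number I picked for illustration. It simulates pairs of rolls and compares the fraction containing at least one 100 against 1 - (99/100)[sup]2[/sup] ≈ 0.0199.)

[code]
import random

TRIALS = 1_000_000  # number of simulated two-roll experiments (arbitrary choice)

# Count how many two-roll experiments contain at least one 100
hits = sum(
    1 for _ in range(TRIALS)
    if 100 in (random.randint(1, 100), random.randint(1, 100))
)

print("simulated:", hits / TRIALS)      # should be close to...
print("formula:  ", 1 - (99/100) ** 2)  # ...1 - 0.99^2 = 0.0199
[/code]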
So it seems to me that 1-(99/100)[sup]x[/sup] is the chance that I'll roll at least one 100 within "x" rolls. If I want to set aside enough time for myself to roll that number (for whatever reason), I just solve 1-(99/100)[sup]x[/sup] = .9 for x and then multiply x by the time it takes per roll. We'll just say that I assume 90% is enough confidence, and if I'm unlucky then I'll set aside more time later.
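(Working that out myself, just as a rough sketch: 1-(99/100)[sup]x[/sup] = .9 gives x = log(0.1)/log(0.99) ≈ 229.1, so about 230 rolls. Here's the arithmetic plus a simulation sanity check in Python, assuming a fair 1–100 generator; the trial count is arbitrary.)

[code]
import math
import random

# Solve 1 - (99/100)^x = 0.9  =>  x = log(0.1) / log(0.99)
x = math.log(0.1) / math.log(0.99)
rolls_needed = math.ceil(x)      # round up to a whole number of rolls
print(x, rolls_needed)           # ~229.1 -> 230 rolls

# Sanity check: in what fraction of simulated sessions of 230 rolls
# does a 100 show up at least once?  Should be roughly 0.90.
TRIALS = 100_000                 # arbitrary number of simulated sessions
hits = sum(
    1 for _ in range(TRIALS)
    if any(random.randint(1, 100) == 100 for _ in range(rolls_needed))
)
print(hits / TRIALS)
[/code]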
But it seems to me like this is assuming that 100 becomes more "due" over time. In fact, as x goes to infinity, (99/100)[sup]x[/sup] goes to zero, so rolling a 100 on a fair generator becomes inevitable given an infinite amount of time (probability of 1).
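(Numerically, that "inevitable" part just comes from the miss probability (99/100)[sup]x[/sup] shrinking toward zero; a quick illustration I threw together:)

[code]
# How the miss probability (99/100)^x shrinks as x grows
for x in (10, 100, 1_000, 10_000):
    print(x, (99 / 100) ** x, 1 - (99 / 100) ** x)
[/code]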
Again, it seems like this is assuming that rolling the number of my choice becomes "due", even though each roll is an independent event. Can somebody explain whether it's my math, my understanding of the fallacy, or something else that's wrong, please?